Just a link, but well done.
http://martinvalasek.com/blog/pictures-from-a-developers-life
Animated GIFs that describe the day-to-day of a developer, or of most personnel in an engineering-level IT role.
So this is a quick post about a frustrating issue that is actually very easy to resolve.
If you are on Windows Server 2012, you’ve had to reinstall WSUS for any reason, and you receive the following error in your tmp log after attempting to finalize the installation:
2013-02-20 16:04:17 Creating default subscription.
2013-02-20 16:04:17 Instantiating UpdateServer
2013-02-20 16:04:19 CreateDefaultSubscription failed. Exception: System.Net.WebException: The request failed with HTTP status 503: Service Unavailable.
at System.Web.Services.Protocols.SoapHttpClientProtocol.ReadResponse(SoapClientMessage message, WebResponse response, Stream responseStream, Boolean asyncCall)
at System.Web.Services.Protocols.SoapHttpClientProtocol.Invoke(String methodName, Object[] parameters)
at Microsoft.UpdateServices.Internal.ApiRemoting.GetServerVersion()
at Microsoft.UpdateServices.Internal.DatabaseAccess.AdminDataAccessProxy.GetServerVersion()
at Microsoft.UpdateServices.Internal.BaseApi.UpdateServer.CreateUpdateServer(String serverName, Boolean useSecureConnection, Int32 portNumber)
at Microsoft.UpdateServices.Internal.BaseApi.UpdateServer..ctor(Boolean bypassApiRemoting)
at Microsoft.UpdateServices.Setup.StartServer.StartServer.CreateDefaultSubscription()
2013-02-20 16:04:19 StartServer encountered errors. Exception=The request failed with HTTP status 503: Service Unavailable.
2013-02-20 16:04:19 Microsoft.UpdateServices.Administration.CommandException: Failed to start and configure the WSUS service
at Microsoft.UpdateServices.Administration.PostInstall.Run()
at Microsoft.UpdateServices.Administration.PostInstall.Execute(String[] arguments)
It’s most likely an IIS issue.
Open the IIS console, delete the WSUS Site, and perform the post installation tasks again.
Something like that anyway….
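If you’d rather script that than click through the console, here’s a minimal sketch, assuming the default “WSUS Administration” site name and a C:\WSUS content directory (adjust both for your environment):

# Remove the broken WSUS site, then re-run the WSUS post-installation tasks
Import-Module WebAdministration
Remove-Website -Name 'WSUS Administration'
& "$env:ProgramFiles\Update Services\Tools\WsusUtil.exe" postinstall CONTENT_DIR=C:\WSUS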
I plan on posting more stuff in the future after things have normalized.
Soooo……
After GatherWriterMetadata SMS Writer status = FAILED_AT_PREPARE_BACKUP. SMS_SITE_BACKUP 2/4/2013 3:32:38 PM 8500 (0x2134)
Error: VSS_E_WRITERERROR_TIMEOUT. Error Code = 0x800423f2. SMS_SITE_BACKUP 2/4/2013 3:32:38 PM 8500 (0x2134)
vssadmin list writers
hmmmm, it’s there. Let me check the sms_site_sql_backup service on the database servers.
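A quick sanity check from PowerShell (the node names are placeholders for your own SQL servers):

# Make sure the backup component service exists and is running on each SQL node
Get-Service -ComputerName SQLNODE01, SQLNODE02 -Name 'SMS_SITE_SQL_BACKUP*'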
Ok, let’s check our permissions on the servers to make sure the machines have local admin, share permissions, and SA rights.
Hmmm, are the backup components installed on the SQL nodes?
They are, sorta, but they aren’t installed on the static drives of our SQL server nodes, and this is a problem…
So make sure they’re where they need to be, and not on SAN drives, especially a SAN drive tied to a SQL cluster that will be failing over. You don’t want to manually re-point components; but in case you do (or in our case NEED to)…. here’s how:
On every drive that DOESN’T need SMS site components installed, place NO_SMS_ON_DRIVE.SMS in the root.
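If the node has a pile of drives, a quick PowerShell sketch like this can drop the flag file for you (assuming D: is the one drive that should keep the components; adjust to taste):

# Place NO_SMS_ON_DRIVE.SMS in the root of every local fixed disk except D:
Get-WmiObject Win32_LogicalDisk -Filter "DriveType = 3" |
    Where-Object { $_.DeviceID -ne 'D:' } |
    ForEach-Object { New-Item -Path "$($_.DeviceID)\NO_SMS_ON_DRIVE.SMS" -ItemType File -Force | Out-Null }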
Now open your registry on the site server and go to:
hklm\Software\Microsoft\SMS\Components\SMS_SITE_COMPONENT_MANAGER\Multisite Component Servers\<servername>\Installation Directory
Change this path to the preferred static drive on the respective servers.
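If you’d rather make that change from PowerShell than regedit, here’s a minimal sketch run on the site server (the server name and D:\SMS path are placeholders for your own component server and preferred static drive):

# Re-point the site component installation directory for one component server
$server = 'SQLNODE01'
$key = "HKLM:\Software\Microsoft\SMS\Components\SMS_SITE_COMPONENT_MANAGER\Multisite Component Servers\$server"
Set-ItemProperty -Path $key -Name 'Installation Directory' -Value 'D:\SMS'

Now: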
net stop sms_site_component_manager
net stop sms_executive
net start sms_site_component_manager
Now check sitecomp.log for your server name and verify the installation and connection. Next, let’s check the local drives of the component SQL server and verify the installation path we declared earlier.
Is it there? If so, great, if not….
Open services and look for:
SMS_SITE_SQL_BACKUP_<site servername>
Go to properties and check its local path.
Stop the SMS_SITE_SQL_BACKUP_<site servername> service
Now open the registry and go to
hklm\System\CurrentControlSet\services\SMS_SITE_SQL_BACKUP_*\ImagePath
Specify the local drive and path you wish to use, and then move the contents from the previous location to its new home.
You’ll need to verify and do the same for the log files:
hklm\Software\Microsoft\SMS\Tracing\SMS_SITE_SQL_BACKUP<site servername>\TraceFilename
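Both values can be checked and set from PowerShell on the SQL node as well; this is only a sketch, and the service name, drive, and file paths below are placeholders, so substitute whatever values you actually find on your own system:

# Re-point the service binary path and its trace log (run on the SQL node)
$svc = 'SMS_SITE_SQL_BACKUP_SITESERVER01'
Get-ItemProperty -Path "HKLM:\System\CurrentControlSet\Services\$svc" -Name ImagePath
Set-ItemProperty -Path "HKLM:\System\CurrentControlSet\Services\$svc" -Name ImagePath -Value 'D:\SMS_SITE_SQL_BACKUP\smssqlbkup.exe'
Set-ItemProperty -Path "HKLM:\Software\Microsoft\SMS\Tracing\$svc" -Name TraceFilename -Value 'D:\SMS_SITE_SQL_BACKUP\smssqlbkup.log'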
Now:
Start the SMS_SITE_SQL_BACKUP_<site servername> service
Rinse and repeat until they are where they need to be, and finally run the backup and verify it completes:
(from the site server)
net start sms_site_backup
and watch the smsbkup.log
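If you want to follow it from PowerShell instead of a log viewer, something like this works (assuming a default install path for the site server; point it at wherever your Logs folder actually lives):

# Follow the backup log as it runs
Get-Content 'C:\Program Files\Microsoft Configuration Manager\Logs\smsbkup.log' -Wait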
Props to MS PFE Sean MAHONEEEEEEY for his assistance!
I’ve updated the inventory enforcement script and post for anyone who utilizes it. It should be cleaner now, as it only depends on WMI for the inventory actions.
So in an attempt to quickly extract the OS version and service pack for a few machines in an environment, the idea was presented to pull the data from Active Directory. The properties exist, so the logic seemed sound; and as we’ve discussed before, it’s a pretty easy task with the Active Directory module in PowerShell. Here’s the code:
$list = gc computers.txt
Import-Module ActiveDirectory
ForEach($item in $list){
    $ado = (get-adcomputer $item -Properties *)
    $pso = New-Object PSObject
    $pso | Add-Member -Name "Computer Name" -MemberType NoteProperty `
        -Value $ado.CN
    $pso | Add-Member -Name "Operating System" -MemberType NoteProperty `
        -Value $ado.OperatingSystem
    $pso | Add-Member -Name "Service Pack" -MemberType NoteProperty `
        -Value $ado.OperatingSystemServicePack
    $pso
}
Assuming, for the sake of example, that the name of this script is get-adservicepack.ps1, and that you’ve got your computers.txt file with your computer names in it, we’d run it like this:
./get-adservicepack.ps1 | export-csv -NoTypeInformation MyAdOutput.csv
So what’s happening?
First, we use the get-content command to pull data from a local text file, computers.txt, into an object and then iterate through it sequentially. We then use each computer name as the lookup name for the get-adcomputer cmdlet, pulling all of its AD properties, and assign the result to a variable called ado.
Now we create a PowerShell object and give it some NoteProperties with values pulled from the AD object we got back from the AD cmdlet, then echo its contents out by calling it.
When we run the script and pipe its output to export-csv -NoTypeInformation, we take that output and put it directly into a CSV without any of the object type information; otherwise it’s just tabled console output.
PowerShell is so boss sometimes…
Maybe we just do all this in one line?
gc computers.txt|ForEach-object{Get-ADComputer $_ -properties *|select -Property name,operatingsystem,operatingsystemservicepack}|export-csv -notypeinformation output.csv
Scroll that line, like a boss.
Ok, brief introduction on this. We are moving, and in the process of selling a home among other things. So in the process of staging our home we’ve packed up our desktops and servers here. Not a problem, we have 3 laptops, 1 is my personal, 1 is for my work, and the other is a shared one for the family. Recently while testing some stuff with hyper-v installed on the shared laptop I had imaged the machine then saved it to a portable drive. Well what does that have to do with anything? My wife wanted to use the shared laptop as a stand in until her desktop could come back out, and I’d since deleted the image and packed all my windows 7 discs etc. Ruh roh.
So being the FOSS guy that I am (outside of work), I decided I’d use this opportunity to throw a very user friendly Linux distribution onto the laptop and see what my wife thought of it.
The distribution: Linux Mint.
Now this was a fairly easy setup to make for my wife. She is what I would consider an average user with some advanced usage areas. Primarily with video editing and photo editing (some graphic design stuff). However for her day to day she needed web, office suite, and email. Since I’m going to be on the road a lot, I needed her to have Skype as well as other chat programs.
I installed from a USB key using the live ISO image. From Windows this is easy to achieve: download Win32 Disk Imager and the desired ISO file, rename the ISO to a .img, then write it to your flash device and voilà.
After a very light crash course on navigating the system and the passwords I set up for her, she was off and rocking. Our first hurdle, however, was the native mail client with KDE, KMail. It’s functional, but she was having some issues with the large volume of email on one of her accounts, so I opted to install one of my favorite FOSS mail clients there instead: Thunderbird.
We both are avid Chrome users, and have our accounts and profiles linked for easy setup across platforms, so that was a breeze for her. I imported her windows profile data to her home directory which got her back up and running with the data she needed as well.
For her office needs we went with LibreOffice (and she’s adjusted to it quite easily), which has usurped OpenOffice as the FOSS office suite of choice on most Linux distros these days.
For her photo editing there is of course Gimp which has been an amazing free image manipulation application for over a decade. For her video editing needs there is Avidemux and OpenShot, neither of which she has made use of yet. Her video editing needs are slim at this time anyway.
Now, here is the mind-blowing part for some of you Linux-timid folks or FOSS naysayers. Up until this point, I’d done everything from the install of the OS onward without once going into the terminal. I had made it a point not to. If I was going to give this to my wife to use, regardless of my feelings about the console, she shouldn’t have to use it. All the application installs were performed from the default package repositories as well, using only the Linux Mint Software Manager. Being a long-time Linux user (since 97~98), I was personally blown away by this. I normally run to the console out of habit, but found that it has reached a point where I genuinely didn’t have to. However, there was a usability problem for my wife that I felt needed to be addressed, and it would require me to script something for her.
In Windows, I have a PowerShell script my wife uses that pulls file system object information for her pictures, looks at the creation date, and then renames the files for her. So this was something she of course wanted, and even though Dolphin and Konqueror have a really cool built-in rename feature, this date method wasn’t possible for her out of the box. I’m not going to lie, I was personally elated with this task, but finding the best method was tricky, so I went with a simple iteration script she could drop in the folder and double-click to run (a rough PowerShell sketch of the Windows-side approach follows it further down). If she chooses to use Linux longer I’ll work on possibly building her something a bit more robust, but here’s the existing code:
#!/bin/bash
#Script to rename jpg by create date and number
x=1
for i in *.jpg *.JPG
do
    y=$(stat -c %y "$i" | sed 's/^\([0-9\-]*\).*/\1/')
    file=$y"("$x").jpg"
    mv "$i" "$file"
    x=$[$x+1]
done
So there is a bit of regex usage here, but the end result is that each JPG file in the current folder has its date information pulled from the file using stat, then piped through sed to chop it down to a preferable string. That string is then concatenated into a file name with an incremental integer and the extension, and the existing file is renamed to that newly created string.
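For comparison, since I mentioned the Windows-side script above, the same idea in PowerShell looks roughly like this; it’s a sketch of the approach, not her exact script:

# Rename every JPG in the current folder to its creation date plus a counter, e.g. 2013-02-20(1).jpg
$x = 1
Get-ChildItem *.jpg | ForEach-Object {
    $date = $_.CreationTime.ToString('yyyy-MM-dd')
    Rename-Item -Path $_.FullName -NewName ("{0}({1}).jpg" -f $date, $x)
    $x++
}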
So what did my wife gain from this interim transition? A new appreciation for another operating system, and the flexibility and eye candy that come with Linux environments? I don’t know, but it’s allowed us to retrofit some old hardware with her desired level of functionality at zero additional cost. All in all, I’d say that’s pretty cool, and I’m still stoked my wife is rocking out with Tux.
The 9 traits of a veteran Unix Admin:
Now, the original article from 2011 that I read for this made those traits sound a lot better, but ultimately that’s about right.
These aren’t inherently bad traits for a systems admin (well, some are), but they obviously require some polish before being acceptable in the workplace. I thought this was worth sharing with the community at large.
I haven’t written anything even mildly technical lately, so I thought I’d show a small script I use to copy and restore my scripts and PowerShell profile across multiple machines. It’s fairly straightforward: it backs up my Scripts directory from my desktop and the WindowsPowerShell folder from my Documents directory, and recreates the folder structure in the directory the script is run from.
Conversely, it will restore as well, provided the content exists. It is user specific, so if you use multiple user IDs like I do, it’s important that you manage them accordingly.
Option Explicit
On Error Resume Next

Dim oWShell, oFSO, oNet
Dim strCurUser, strSourceDrive, strTargetDrive, strOption, strMsg

Set oWShell = CreateObject("Wscript.Shell")
Set oFSO = CreateObject("Scripting.Filesystemobject")
Set oNet = CreateObject("Wscript.Network")

strCurUser = oNet.UserName
strTargetDrive = Left(WScript.ScriptFullName, Len(WScript.ScriptFullName) - Len(WScript.ScriptName)) & strCurUser
strSourceDrive = oWShell.ExpandEnvironmentStrings("%userprofile%")
strMsg = "Filecopy Summary: " & vbCrLf

strOption = LCase(InputBox("Restore|Backup"))
Select Case strOption
    Case "restore"
        Call Restore()
    Case Else
        Call Backup()
End Select

WScript.Echo strMsg

'-------------------
Function Backup()
    Dim strScripts, strPShell
    strScripts = "\Desktop\Scripts" : strPShell = "\Documents\windowspowershell"

    If Not CreateFolder(strTargetDrive & strScripts) Then
        strMsg = strMsg & "Failed Creating: " & strTargetDrive & strScripts & vbCrLf
    Else
        strMsg = strMsg & "Created: " & strTargetDrive & strScripts & vbCrLf
    End If

    If Not CreateFolder(strTargetDrive & strPShell) Then
        strMsg = strMsg & "Failed Creating: " & strTargetDrive & strPShell & vbCrLf
    Else
        strMsg = strMsg & "Created: " & strTargetDrive & strPShell & vbCrLf
    End If

    If Not CopyFolder(strSourceDrive & strScripts, strTargetDrive & strScripts) Then
        strMsg = strMsg & "Failed to Copy: " & strSourceDrive & strScripts & vbCrLf
    Else
        strMsg = strMsg & "Copied: " & strSourceDrive & strScripts & vbCrLf
    End If

    If Not CopyFolder(strSourceDrive & strPShell, strTargetDrive & strPShell) Then
        strMsg = strMsg & "Failed to Copy: " & strSourceDrive & strPShell & vbCrLf
    Else
        strMsg = strMsg & "Copied: " & strSourceDrive & strPShell & vbCrLf
    End If
End Function

'-------------------
Function Restore()
    Dim strScripts, strPShell, temp1, temp2
    strScripts = "\Desktop\Scripts" : strPShell = "\Documents\windowspowershell"
    temp1 = strSourceDrive : temp2 = strTargetDrive
    strTargetDrive = temp1 : strSourceDrive = temp2

    If Not CreateFolder(strTargetDrive & strScripts) Then
        strMsg = strMsg & "Failed Creating: " & strTargetDrive & strScripts & vbCrLf
    Else
        strMsg = strMsg & "Created: " & strTargetDrive & strScripts & vbCrLf
    End If

    If Not CreateFolder(strTargetDrive & strPShell) Then
        strMsg = strMsg & "Failed Creating: " & strTargetDrive & strPShell & vbCrLf
    Else
        strMsg = strMsg & "Created: " & strTargetDrive & strPShell & vbCrLf
    End If

    If Not CopyFolder(strSourceDrive & strScripts, strTargetDrive & strScripts) Then
        strMsg = strMsg & "Failed to Copy: " & strSourceDrive & strScripts & vbCrLf
    Else
        strMsg = strMsg & "Copied: " & strSourceDrive & strScripts & vbCrLf
    End If

    If Not CopyFolder(strSourceDrive & strPShell, strTargetDrive & strPShell) Then
        strMsg = strMsg & "Failed to Copy: " & strSourceDrive & strPShell & vbCrLf
    Else
        strMsg = strMsg & "Copied: " & strSourceDrive & strPShell & vbCrLf
    End If
End Function

'-----------
Function CreateFolder(strFolder)
    On Error Resume Next
    Dim FolderArray, Folder, Path, booErr
    FolderArray = Split(strFolder, "\")
    For Each Folder In FolderArray
        If Path <> "" Then
            If Not oFSO.FolderExists(Path) Then
                oFSO.CreateFolder(Path)
            End If
        End If
        Path = Path & LCase(Folder) & "\"
        If Err.Number <> 0 Then
            booErr = True
        End If
    Next
    If booErr Then
        CreateFolder = False
    Else
        CreateFolder = True
    End If
End Function

'-----------
Function CopyFolder(strSource, strDest)
    On Error Resume Next
    If oFSO.FolderExists(strDest) Then
        oFSO.DeleteFolder strDest, True
    End If
    oFSO.CopyFolder strSource, strDest, True
    If Err.Number <> 0 Then
        CopyFolder = False
    Else
        CopyFolder = True
    End If
End Function
There you go, a bit lacking on the sophistication side of things, but it gets the job done for me. Feel free to modify the script for your own personal file backup needs; it can really help streamline the process.