
Category Archives: Technology

Discussion on any and all tech-related items. New tech, tech usage, etc.

So I’ve been playing PlanetSide 2 lately. I’ve been playing as New Conglomerate with an outfit called The Iron Wolves [TIW] on Waterson. Coincidentally, PlanetSide 2 is heading toward MLG, and in this past Sunday’s War Report interview with Alex “Jax” Conroy, Higby revealed a lot of what that will entail (finally!).

Anyway, I’ve not made a post since Mother’s Day, so I thought I’d link the YouTube replay of our match from this past Friday against VREV and TRAF. The second objective battle is by far one of the best (at around the 10-minute mark), and the third base capture has some entertaining moments.

As some of you know, I’ve been working with Configuration Manager 2012 for a while now, establishing a hierarchy. One of the unfortunate tasks with this job has been boundary creation. Finally, after a longer period than it should have taken, I set out to build a tool that creates site boundaries for me from a CSV, similar to tools available for SCCM 2007.

I found the SMS_Boundary class hadn’t changed (aside from a now-obsolete BoundaryFlag property), so I tested it out with PowerShell as a one-liner, and it worked great. A bit more research turned up something already written by MVP Kaido Järvemets from Estonia. I enjoyed his minimalistic script, so I followed his methodology (mostly) and ended up with this script, which reads from a boundaries.csv file:


$sitecode = "ABC"
$siteserver = "mysiteserver"
$boundarylist = Import-Csv '.\boundaries.csv'

foreach ($item in $boundarylist) {
    switch ($item.Type) {
        "Subnet" { $type = 0 }
        "AD"     { $type = 1 }
        "IPv6"   { $type = 2 }
        "Range"  { $type = 3 }
    }
    $arrValues = @{
        DisplayName  = $item.Description
        BoundaryType = $type
        Value        = $item.Boundary
    }
    Set-WmiInstance -Namespace "Root\SMS\Site_$sitecode" -Class SMS_Boundary `
        -Arguments $arrValues -ComputerName $siteserver
}
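For reference, the script above expects a boundaries.csv shaped roughly like this (the column names come from the property references in the script; the values themselves are purely illustrative):

```
type,description,boundary
Subnet,HQ subnet,10.1.0.0
Range,Branch DHCP range,10.2.0.1-10.2.0.254
AD,Primary AD site,Default-First-Site-Name
```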

Not to take credit for other people’s work, especially since this is a hacked-up version of his original, which can be found here.

Laziness, I think, is the true mother of necessity in IT, and the tedious act of viewing multiple properties pages brought about this one-liner. If you too are setting up diverse deployment sets and need to quickly verify the reboot suppression state of multiple deployments, here’s a way to do it in PowerShell:

gwmi -Namespace "root\sms\site_<sitecode>" `
    -Query "select AssignmentName from SMS_UpdateGroupAssignment
            where AssignmentName like '%<NAME SEARCH VALUE>%' and SuppressReboot = '3'" `
    -ComputerName <SITESERVERNAME> | Select-Object -Property AssignmentName

(wrapped across lines for readability’s sake)

Suppressreboot values:
0 = No Suppression
1 = Workstation Suppression
2 = Server Suppression
3 = Server & Workstation Suppression

Replace the <> values with your relevant search criteria.
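As a quick aside, the value-to-meaning table above can be expressed as a small lookup. Here’s a purely illustrative helper in shell (the function name is made up):

```shell
# Decode an SMS_UpdateGroupAssignment SuppressReboot value (illustrative helper).
decode_suppressreboot() {
    case "$1" in
        0) echo "No Suppression" ;;
        1) echo "Workstation Suppression" ;;
        2) echo "Server Suppression" ;;
        3) echo "Server & Workstation Suppression" ;;
        *) echo "Unknown value: $1" ;;
    esac
}

decode_suppressreboot 3   # → Server & Workstation Suppression
```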

More class information.

Just a link, but well done.

http://martinvalasek.com/blog/pictures-from-a-developers-life

Animated GIFs that describe the day-to-day of a developer, or of most personnel in an engineering-level IT role.

So this is a quick post about a frustrating issue that is actually very easy to resolve.

If you are on Windows Server 2012 and you’ve had to reinstall WSUS for any reason, and you receive the following error in your temp log after attempting to finalize the installation:

2013-02-20 16:04:17  Creating default subscription.
2013-02-20 16:04:17  Instantiating UpdateServer
2013-02-20 16:04:19  CreateDefaultSubscription failed. Exception: System.Net.WebException: The request failed with HTTP status 503: Service Unavailable.
   at System.Web.Services.Protocols.SoapHttpClientProtocol.ReadResponse(SoapClientMessage message, WebResponse response, Stream responseStream, Boolean asyncCall)
   at System.Web.Services.Protocols.SoapHttpClientProtocol.Invoke(String methodName, Object[] parameters)
   at Microsoft.UpdateServices.Internal.ApiRemoting.GetServerVersion()
   at Microsoft.UpdateServices.Internal.DatabaseAccess.AdminDataAccessProxy.GetServerVersion()
   at Microsoft.UpdateServices.Internal.BaseApi.UpdateServer.CreateUpdateServer(String serverName, Boolean useSecureConnection, Int32 portNumber)
   at Microsoft.UpdateServices.Internal.BaseApi.UpdateServer..ctor(Boolean bypassApiRemoting)
   at Microsoft.UpdateServices.Setup.StartServer.StartServer.CreateDefaultSubscription()
2013-02-20 16:04:19  StartServer encountered errors. Exception=The request failed with HTTP status 503: Service Unavailable.
2013-02-20 16:04:19  Microsoft.UpdateServices.Administration.CommandException: Failed to start and configure the WSUS service
   at Microsoft.UpdateServices.Administration.PostInstall.Run()
   at Microsoft.UpdateServices.Administration.PostInstall.Execute(String[] arguments)

It’s most likely an IIS issue.

Open the IIS console, delete the WSUS Site, and perform the post installation tasks again.

Something like that anyway….

 

I plan on posting more stuff in the future after things have normalized.

Soooo……

After GatherWriterMetadata SMS Writer status = FAILED_AT_PREPARE_BACKUP. SMS_SITE_BACKUP 2/4/2013 3:32:38 PM 8500 (0x2134)
Error: VSS_E_WRITERERROR_TIMEOUT. Error Code = 0x800423F2. SMS_SITE_BACKUP 2/4/2013 3:32:38 PM 8500 (0x2134)

vssadmin list writers


Hmmm, it’s there. Let me check the SMS_SITE_SQL_BACKUP service on the database servers.

Ok, let’s check our permissions on the servers to make sure the machines have local admin and share permissions, and that they have SA rights.

Hmmm, are the backup components installed on the SQL nodes?

They are, sorta, but they aren’t installed on the static drives of our SQL server nodes; this is a problem…

 


It all starts with our old friend NO_SMS_ON_DRIVE.SMS

 

So make sure it’s where it needs to be, like on SAN drives, especially a SAN drive tied to a SQL cluster that will be failing over. You don’t want to manually re-point components; but in case you do (or, in our case, need to)… here’s how:

Get on every drive that DOESN’T need SMS site components installed to it and place NO_SMS_ON_DRIVE.SMS in its root.

Now open your registry on the site server and go to:

hklm\Software\Microsoft\SMS\Components\SMS_SITE_COMPONENT_MANAGER\Multisite Component Servers\<servername>\Installation Directory

Change this path to the preferred static drive on the respective servers.  Now:

net stop sms_site_component_manager
net stop sms_executive
net start sms_site_component_manager

Now check sitecomp.log for your server name and verify the installation and connection. Then check the local drives of the SQL component server and verify the installation path we declared earlier.

Is it there?  If so, great, if not….

Open services and look for:

SMS_SITE_SQL_BACKUP_<site servername>

Go to properties and check its local path.

Stop the SMS_SITE_SQL_BACKUP_<site servername> service

Now open the registry and go to

hklm\System\CurrentControlSet\services\SMS_SITE_SQL_BACKUP_*\ImagePath

Specify the local drive and path you wish to use, and then move the contents from the previous service location to its new home.

You’ll need to verify and do the same for the log files:

hklm\Software\Microsoft\SMS\Tracing\SMS_SITE_SQL_BACKUP_<site servername>\TraceFilename

Now:

Start the SMS_SITE_SQL_BACKUP_<site servername> service

Rinse and repeat until they are where they need to be, and finally perform the backup and verify that it completes:

(from the site server)

net start sms_site_backup

and watch the smsbkup.log


Props to MS PFE Sean MAHONEEEEEEY! for his assistance.

I’ve updated the inventory enforcement script and post for anyone who utilizes it. It should be cleaner now, as it only depends on WMI for the inventory actions.

Here’s the link back.

So, in an attempt to quickly extract the OS version and service pack for a few machines in an environment, the idea was presented to pull the data from Active Directory. The properties exist, so the logic seemed sound; and as we’ve discussed before, it’s a pretty easy task with the Active Directory module in PowerShell. Here’s the code:

$list = Get-Content computers.txt
Import-Module ActiveDirectory
ForEach ($item in $list) {
    $ado = Get-ADComputer $item -Properties *
    $pso = New-Object PSObject
    $pso | Add-Member -Name "Computer Name" -MemberType NoteProperty `
        -Value $ado.CN
    $pso | Add-Member -Name "Operating System" -MemberType NoteProperty `
        -Value $ado.OperatingSystem
    $pso | Add-Member -Name "Service Pack" -MemberType NoteProperty `
        -Value $ado.OperatingSystemServicePack
    $pso
}

Assuming, for the sake of example, that this script is named get-adservicepack.ps1 and you’ve got a computers.txt file with your computer names in it, we’d run it like this:

./get-adservicepack.ps1 | export-csv -NoTypeInformation MyAdOutput.csv

So what’s happening?

First, we use the Get-Content cmdlet to pull data from a local text file, computers.txt, into an object and then iterate through it sequentially. We then use each computer name as the lookup value for the Get-ADComputer cmdlet, request all of its AD properties, and assign the result to a variable called $ado.

Next, we create a PowerShell object and give it some NoteProperties, with values pulled from the AD object the cmdlet returned, then echo its contents by calling it.

When we run the script and pipe its output to Export-Csv -NoTypeInformation, that output goes directly into a CSV without any of the object type information; otherwise, we get a tabled console output.

PowerShell is so boss sometimes…

Maybe we just do all this in one line?

gc computers.txt | ForEach-Object { Get-ADComputer $_ -Properties * | Select-Object -Property Name, OperatingSystem, OperatingSystemServicePack } | Export-Csv -NoTypeInformation output.csv

Scroll that line, like a boss.

Here’s the gallery entry on Script Center if you want to rate it.

Ok, brief introduction on this. We are moving, and in the process of selling a home among other things. So, in the process of staging our home, we’ve packed up our desktops and servers. Not a problem; we have 3 laptops: 1 is my personal machine, 1 is for my work, and the other is shared by the family. Recently, while testing some stuff with Hyper-V installed on the shared laptop, I had imaged the machine and saved the image to a portable drive. Well, what does that have to do with anything? My wife wanted to use the shared laptop as a stand-in until her desktop could come back out, and I’d since deleted the image and packed away all my Windows 7 discs, etc. Ruh roh.

So being the FOSS guy that I am (outside of work), I decided I’d use this opportunity to throw a very user friendly Linux distribution onto the laptop and see what my wife thought of it.

The distribution:

Linux Mint 13 with KDE


Now this was a fairly easy setup to make for my wife. She is what I would consider an average user with some advanced usage areas, primarily video editing and photo editing (some graphic design stuff). However, for her day-to-day use she needed web, an office suite, and email. Since I’m going to be on the road a lot, I needed her to have Skype as well as other chat programs.

I installed from a USB key using the live ISO image. From Windows this is easy to achieve: download Win32 Disk Imager and the desired ISO file, rename the ISO to .img, then write it to your flash device, and voilà.
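For what it’s worth, the same write can be done from an existing Linux box with dd. This is a sketch only: it copies a small dummy image between plain files so it’s safe to run as-is, whereas against a real flash drive the of= target would be the device node (e.g. /dev/sdX, a placeholder; verify it with lsblk first, since dd will overwrite whatever it’s pointed at):

```shell
# Create a small dummy "ISO" so the example is self-contained and harmless.
dd if=/dev/urandom of=dummy.iso bs=1024 count=64 2>/dev/null

# The actual write step; for real media, of= would be the flash device node.
dd if=dummy.iso of=dummy.img bs=4M conv=fsync 2>/dev/null

# Verify the copy is byte-for-byte identical.
cmp -s dummy.iso dummy.img && echo "image written verbatim"
```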

After a very light crash course on navigating the system and the passwords I set up for her, she was off and rocking. Our first hurdle, however, was KDE’s native mail client, Kmail. It’s functional, but she was having some issues with the large volume of email on one of her accounts, so I opted to install one of my favorite FOSS mail clients there instead: Thunderbird.

We both are avid Chrome users, and have our accounts and profiles linked for easy setup across platforms, so that was a breeze for her.  I imported her windows profile data to her home directory which got her back up and running with the data she needed as well.

For her office needs we went with LibreOffice (and she’s adjusted to it quite easily), which has usurped OpenOffice as the FOSS office suite of choice on most Linux distros these days.

For her photo editing there is, of course, GIMP, which has been an amazing free image manipulation application for over a decade. For her video editing needs there are Avidemux and OpenShot, neither of which she has made use of yet; her video editing needs are slim at this time anyway.

Now, here is the mind-blowing part for some of you Linux timids or FOSS naysayers. Up until this point, I’d done everything, from the install of the OS to now, without once going into the terminal. I had made it a point not to. If I was going to give this to my wife to use, regardless of my feelings about the console, she shouldn’t have to use it. All the application installs were performed from the default package repositories using only the Linux Mint Software Manager. Being a long-time Linux user (since ’97 or ’98), I was personally blown away by this. I normally run to the console out of habit, but found that things are at a point where I genuinely didn’t have to. However, there was a usability problem for my wife that I felt needed to be addressed, and it would require me to script something for her.

In Windows, I have a PowerShell script my wife uses that pulls file system object information for her pictures, looks at the creation date, and then renames the files for her. So this was something she of course wanted, and even though Dolphin and Konqueror have a really cool built-in rename feature, this date method wasn’t possible for her out of the box. I’m not going to lie, I was personally elated with this task, but finding the best method was tricky, so I went with a simple iteration script she could drop in the folder and double-click to run. If she chooses to use Linux longer, I’ll work on possibly building her something a bit more robust, but here’s the existing code:

#!/bin/bash
# Rename each jpg by its creation date plus an incrementing number
shopt -s nullglob          # skip the patterns themselves when no files match
x=1
for i in *.jpg *.JPG
do
    # stat -c %y emits the full modification timestamp; sed keeps only the
    # leading date portion (digits and hyphens)
    y=$(stat -c %y "$i" | sed 's/^\([0-9\-]*\).*/\1/')
    file="$y($x).jpg"
    mv "$i" "$file"
    x=$((x + 1))
done

 

So there is a bit of regex usage here, but the end result is that each JPG file in the folder has its date information pulled from the file using stat, then piped through sed to chop it down to a preferable string. That string is then concatenated into a file name with an incremental integer and extension, and the existing file is renamed to that newly created string.
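To make the sed step concrete, here’s what that substitution does to a sample timestamp of the shape stat -c %y produces (the timestamp itself is made up):

```shell
# The capture group \([0-9\-]*\) grabs the leading run of digits and hyphens
# (the date), and the .* throws away the time and timezone that follow.
echo "2013-02-20 16:04:17.123456789 -0500" | sed 's/^\([0-9\-]*\).*/\1/'
# → 2013-02-20
```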

So what did my wife gain from this interim transition? A new appreciation for another operating system, and the flexibility and eye candy that come with Linux environments? I don’t know, but it’s allowed us to retrofit some old hardware with her desired level of functionality at zero additional cost. All in all, I’d say that’s pretty cool, and I’m still stoked my wife is rocking out with Tux.