
Tag Archives: Linux

With the Bash vulnerability and fixes a few months back, there's been a need to update the service control script I had written for the Configuration Manager client.

The updated client can be found here.

Be sure to read the README file contained within the tarball and modify the cm-installer file as required for your environment.

If you want additional information, please review the original post found here.

I updated this last month and didn't make a point to highlight it. If you are using any of my service control scripts or cron jobs, it's fairly important that you update to the newer version.

I updated this within the previous post which can be found here.

So there was a recent security update for RHEL that breaks a library dependency for the Configuration Manager client and OMI.

/opt/microsoft/configmgr/bin/ccmexec.bin: error while loading shared libraries:
 libssl.so.1.0.0: cannot open shared object file: No such file or directory

The issue is simple enough to fix with a symlink update.

sudo ln -sf /usr/lib64/libcrypto.so.10 /usr/lib64/libcrypto.so.1.0.0
sudo ln -sf /usr/lib64/libssl.so.10 /usr/lib64/libssl.so.1.0.0

Simple enough. If you are on x86, change /usr/lib64 to /usr/lib/.
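On a 32-bit system the same fix would presumably look like this (identical links, adjusted paths):

sudo ln -sf /usr/lib/libcrypto.so.10 /usr/lib/libcrypto.so.1.0.0
sudo ln -sf /usr/lib/libssl.so.10 /usr/lib/libssl.so.1.0.0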

Special thanks to Morten Vinding for the best library to use.

Post updated 1/24/15


With the SP1 release of Configuration Manager 2012, support for certain Linux distributions as client platforms has been introduced.  Interestingly enough, they've added a WMI-like mock-up that the client uses to interact with and gather data from these varied distributions.  It's not a perfect solution, but it's certainly a step in the right direction.  I've spent a fair amount of time working through some of the Linux client problems (within Red Hat) and have built an installer and service control script, along with cron jobs, to overcome some of the faults I've seen with the client failing to perform certain tasks in a timely fashion.

Before using any of my code, I recommend reviewing the Linux Configuration Manager installation documentation provided via TechNet.  I'd also encourage you to read up on managing these clients from TechNet as well.  It's fairly straightforward, but as I stated before, it's not a perfect solution.  I've found problems with zombied threads of the client on the box preventing policy updates, and a need for random restarts of the omiserver, etc.

You are welcome to use parts or all of the provided code as you see fit in your environments, of course:

This first portion is a service control script that works for the Red Hat distributions of the client.  I place this within the bin of all my assets to give simpler control of the services and for simplified cron entries.

#!/bin/bash

#CM Client Script
#Author: Daniel Belcher
#Date: 8/7/13 Modified: 1/19/15
#This script is intended for automation of services by cron and simpler
#asset management through the command line

#LDIR="/var/log/"
#DATE=`date '+%m%d%y'`
RUID=0
CCMEXEC="/opt/microsoft/configmgr/bin/ccmexec"
if [ "$UID" -ne "$RUID" ]
then
        echo "User needs to be root to run $0 $1"
        exit 1
fi

start () {
        $CCMEXEC
        sleep 1
        echo
        exit 0
}

stop () {
        $CCMEXEC -s
        sleep 1
        echo
        # the bracketed pattern keeps grep from matching its own process
        if ps aux | grep -q '[c]cmexec.bin'
        then
                kill $(ps aux | grep '[c]cmexec.bin' | awk '{print $2}')
        fi
        exit 0
}

restart () {
        $CCMEXEC -s
        sleep 2
        if ps aux | grep -q '[c]cmexec.bin'
        then
                kill $(ps aux | grep '[c]cmexec.bin' | awk '{print $2}')
        fi
        sleep 1
        $CCMEXEC
        sleep 1
        echo
        exit 0
}

trimlogs () {
        # $1 is the size limit in MB passed on the command line; default 2 MB
        if [ -z "$1" ]; then
                SIZE=2048
        else
                SIZE=$(( $1 * 1024 ))
        fi

        rollover $SIZE "/var/opt/microsoft/scxcm.log"
        rollover $SIZE "/var/opt/microsoft/scx/log/scx.log"
        rollover $SIZE "/var/opt/microsoft/scx/log/scxcimd.log"
        rollover $SIZE "/var/opt/microsoft/scxcmprovider.log"
}

rollover () {
        FILESIZE=$1
        LOGPATH=$2

        if [ -f "$LOGPATH" ]; then
                LOGSIZE=$(du "$LOGPATH" | awk '{print $1}')
                if [ "$LOGSIZE" -gt "$FILESIZE" ]; then
                        cat /dev/null > "$LOGPATH"
                        echo "Clearing entries in $LOGPATH"
                fi
        fi
}

policy () {
        $CCMEXEC -rs policy
        sleep 1
        echo
        exit 0
}

hinv () {
        $CCMEXEC -rs hinv
        sleep 1
        echo
        exit 0
}

sinv () {
        $CCMEXEC -rs sinv
        sleep 1
        echo
        exit 0
}

case "$1" in
        start)
                start
        ;;
        stop)
                stop
        ;;
        restart)
                restart
        ;;
        policy)
                policy
        ;;
        hinv)
                hinv
        ;;
        sinv)
                sinv
        ;;
        trimlogs)
                trimlogs "$2"
        ;;
        *)
                echo $"Usage: $0 (start|stop|restart|policy|hinv|sinv|trimlogs)"
                exit 1
esac
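Once the script is in place (the installer below copies it into /bin), typical invocations look like this:

sudo configmgr restart
sudo configmgr policy
sudo configmgr trimlogs 5    # truncate any watched log that has grown past 5 MB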

This next portion is a simplified installer script that I'm currently using; it can be used to build a unified installer for your environment (it also places the script from above and imports the cron jobs I've created).  You'll still need to place the client install files in the folder with this script, of course:

#!/bin/bash

RUID=0
MP="management.point.server.com"
SITECODE="ABC"

if [ "$UID" -ne "$RUID" ]
then
        echo "User needs to be root to run $0"
        exit 1
fi

if [ -f "fix-lib.sh" ]; then
        ./fix-lib.sh
fi

./install -mp $MP -sitecode $SITECODE -clean ccm-Universalx64.tar
cp configmgr /bin/
#crontab cm-crontab

sleep 5
configmgr stop
sleep 30
cp scxcmprovider.conf /opt/microsoft/omi/etc/
if [ -f "/opt/microsoft/omi/scxcmprovider.log" ]; then
        echo "Moving scxcmprovider.log to /var/opt/microsoft/"
        mv /opt/microsoft/omi/scxcmprovider.log /var/opt/microsoft/
fi
configmgr start

These are the cron entries, to be used as an example:

#---Begin Configmgr Jobs---
0 0 * * 2,4,7 configmgr restart
1 * * * * configmgr policy
0 12 * * * configmgr hinv
0 8 * * 3 configmgr sinv
0 * * * * configmgr trimlogs 5
#---End Configmgr Jobs---

For more information regarding cron and what these entries mean, please read this; they do a nice job of explaining it in a fairly straightforward manner.
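As a quick reference, the five leading fields are minute, hour, day of month, month, and day of week (0 or 7 being Sunday), followed by the command to run:

# m h dom mon dow command
0 0 * * 2,4,7 configmgr restart    # midnight on Tuesday, Thursday, and Sunday
0 12 * * * configmgr hinv          # hardware inventory daily at noon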

Putting it all together….

This following link contains the tar.gz that can be used to install from.  Be mindful to read the README file and update the cm-installer script before you begin to ensure you are pointing to a proper site.

Redhat CM12 Client Installer.

A further note: the Red Hat install I'm using here is based on the universal x64 binaries and will work for a lot of different distributions. Be sure to verify your distribution against the required package and substitute as needed.

Ok, brief introduction on this.  We are moving, and in the process of selling a home among other things.  So in the process of staging our home we've packed up our desktops and servers.  Not a problem, we have 3 laptops: 1 is my personal, 1 is for my work, and the other is a shared one for the family.  Recently, while testing some stuff with Hyper-V installed on the shared laptop, I had imaged the machine and saved the image to a portable drive. Well, what does that have to do with anything? My wife wanted to use the shared laptop as a stand-in until her desktop could come back out, and I'd since deleted the image and packed all my Windows 7 discs etc. Ruh roh.

So being the FOSS guy that I am (outside of work), I decided I’d use this opportunity to throw a very user friendly Linux distribution onto the laptop and see what my wife thought of it.

The distribution:

Linux Mint 13 with KDE


Now this was a fairly easy setup to make for my wife.  She is what I would consider an average user with some advanced usage areas, primarily video editing and photo editing (some graphic design stuff).  However, for her day-to-day she needed web, an office suite, and email.  Since I'm going to be on the road a lot, I needed her to have Skype as well as other chat programs.

I installed from a USB key using the live ISO image.  From Windows this is easy to achieve: download Win32 Disk Imager and the desired ISO file, rename the ISO to a .img, then apply it to your flash device and voilà.
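If you'd rather write the key from an existing Linux box, dd does the same job; a minimal sketch, with the ISO name and target device as placeholders you must verify yourself (dd will happily overwrite the wrong disk):

# confirm the device node with lsblk first; /dev/sdX is the whole USB key, not a partition
sudo dd if=linuxmint-13-kde.iso of=/dev/sdX bs=4M
sync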

After a very light crash course on navigating the system and the passwords I set up for her, she was off and rocking.  Our first hurdle, however, was the native mail client with KDE, KMail.  It's functional, but she was having some issues with the large volume of email she has on one of her accounts, so I opted to install one of my favorite FOSS mail clients on there instead: Thunderbird.

We are both avid Chrome users and have our accounts and profiles linked for easy setup across platforms, so that was a breeze for her.  I imported her Windows profile data to her home directory, which got her back up and running with the data she needed as well.

For her office needs we went with (and she's adjusted quite easily to) LibreOffice, which has usurped OpenOffice as the FOSS office suite of choice in most Linux distros these days.

For her photo editing there is of course GIMP, which has been an amazing free image manipulation application for over a decade.  For her video editing needs there are Avidemux and OpenShot, neither of which she has made use of yet.  Her video editing needs are slim at this time anyway.

Now, here is the mind-blowing part for some of you Linux timids or FOSS naysayers.  Up until this point, I'd done everything from the install of the OS to now without once going into the terminal.  I had made it a point not to.  If I was going to give this to my wife to use, regardless of my feelings about the console, she shouldn't have to use it.  All the application installs were performed from default package stores using only the Linux Mint Software Manager.  Being a long-time Linux user (since 97~98), I was personally blown away by this.  I normally run to the console out of habit, but found that it had reached a point where I genuinely didn't have to.  However, there was a usability problem for my wife that I felt needed to be addressed, and it would require me to script something for her.

In Windows, I have a PowerShell script my wife uses that pulls file system object information for her pictures, looks at the creation date, then renames the files for her.  So this was something she of course wanted, and even though Dolphin and Konqueror have a really cool built-in rename feature, this date method wasn't possible for her out of the box.  I'm not going to lie, I was personally elated with this task, but finding the best method was tricky, so I went with a simple iteration script she could drop in the folder and double-click to run.  If she chooses to use Linux longer I'll work on building her something a bit more robust, but here's the existing code:

#!/bin/bash
#Script to rename jpg files by date and number
x=1
for i in *.jpg *.JPG
do
        # skip the unexpanded glob itself if there are no matches
        [ -e "$i" ] || continue
        # %y is the file's last-modification timestamp; sed trims it to YYYY-MM-DD
        y=$(stat -c %y "$i" | sed 's/^\([0-9-]*\).*/\1/')
        file="$y($x).jpg"
        mv "$i" "$file"
        x=$((x+1))
done


So there is a bit of regex usage here, but the end result is that each jpg file in the folder has its date information pulled using stat, then piped through sed to chop it down to a preferable string.  That string is then concatenated into a file name with an incremental integer and extension (e.g. 2012-05-01(1).jpg), and the existing file is renamed to that newly created string.

So what did my wife gain from this interim transition?  A new appreciation for another operating system, and the flexibility and eye candy that come with Linux environments?  I don't know, but it's allowed us to retrofit some old hardware with her desired level of functionality at zero additional cost.  All in all, I'd say that's pretty cool, and I'm still stoked my wife is rocking out with Tux.

On a more regular basis I'd like to keep a stream of technical write-ups, gaming news, theological thoughts, and/or general "what's going ons" with me and my family.  However, with a work trip to Houston last week and the generally slap-busy nature of my work since returning home, I've not had any time to collect some thoughts and formulate them into a blog post.  I want to hit some high points, and perhaps elaborate on them more in future posts.

High point #1 Samba DC

Ok, so people who have known me for any extended amount of time (from the age of 16 to 30) know that I'm a Linux fan.  My work and livelihood, mind you, thrive in a Microsoft world, but I will never sell Linux short, nor fail to marvel at the amazing things that a thriving community of passionate individuals can create.  I also maintain a Linux server out of my home to manage DNS, DHCP, VOIP (TeamSpeak), and file sharing (NFS, iSCSI, and SMB).  I will also, on occasion, bring up outward-facing game servers.  Just recently I decided to convert that server into a Samba DC for my, primarily, Windows 7 environment at home.

I run CentOS as my server distribution, which is a downstream rebuild of RHEL.  I'm running Samba version 3.5.4; at the time of this writing 3.6 is the latest stable release, but it didn't offer enough improvements for me to go outside of my natively distributed yum version.

Also, aside from a few changes to the registry and local security policy that had to be made on the client machines, the migration was fairly painless.

The first change resolves the issue of Windows 7 being able to find the domain for insertion, and the security policy change solves the issue of domain trust at login.  It's also wise to disable the machine account password reset to the DC to avoid potential trust relationship issues.  I'd not seen this issue myself, but until I see confirmation it's resolved (supposedly coming in Samba 4) I'll err on the side of caution.

My next step will be to integrate OpenLDAP functionality into the DC, along with an Apache HTTP server.  I assume these will be fairly painless projects, but at the risk of breaking my current domain environment I'll need to wait till I have the time to deal with a potential LDAP migration failure.  I also don't have a strong enough list of pros for it, since this is just a home network.  Mind you, it's more sophisticated than the average home network; it just seems a bit over-engineered.  As for the Apache server, I really want to get back into some web development, so I'd like the internal server for development purposes….


service httpd start

Ok, so now I’m running an Apache server off my server as well.  Linux is so hard.

High point #2 Admin Studio 10

So I was in Houston last week.  I'm now "officially" trained to use Admin Studio 10 for package (MSI, App-V, XenApp, and ThinApp) development, repackaging, and migration.

So what does that mean?

Well as most of you know I work with a product from Microsoft called SCCM.  One of the primary features of SCCM is application deployment.

So what is application deployment?

Simply put, it’s installing applications to multiple machines over a network.

Ok, I think I see.  So why would you need to do package development to deploy packages?

Well, you don't have to.  One could feasibly shoehorn in an installer given by a vendor, but ideally you want to build out a standardized installer or load for your company.  For us that means I'll be building MSIs, MSTs, and App-V packages, as well as isolating application installs that might otherwise break the functionality of OTHER applications they share hard drive space with.

Wait, what?  Isolate, break, huh?

Almost all applications rely on libraries.  Think of them as a set of shared instructions that applications go to when asked what to do.  Well, in most cases these libraries are shared by multiple applications.  And sometimes one application wants a vanilla library, and another wants chocolate.  These apps will fight; one of them will win and the other will lose.  By isolating them I can give each what it wants so they don't break the system, or each other.

Our company will also leverage App-V packages, which are essentially virtualized installs of these applications: although they run locally on the machine, they are actually virtualized (or encapsulated) and separate from the actual operating system.  XenApp and ThinApp do the same thing.  I'm particularly excited about application virtualization; it can come with a bit of overhead, but it's nice and contained.

Ok, I stopped caring somewhere around chocolate and vanilla.

Yea, I figured as much.  Either way, it is a tangible notch in my hard skill set and I'm glad that I was able to get it done.


High point #3 Gospel in Life

What does a Gospel centered life look like?

What does it mean to be in the world but not of the world?

Is the Gospel as narrow minded to culture as people often proclaim it to be?

What does a Gospel centered community look like?

These are part of the current Bible study I'm involved in with my brothers and sisters in Christ, called Gospel in Life by Timothy Keller. It's a great study that forces you to take a look at your heart, your life, and your community and compare them to what and how they are defined in the Gospel. I would recommend this study to anyone who is a believer. Even if the information isn't new to you, as most of it hasn't been for me, it's still food for the soul; a reminder of the higher purpose we are called to as Christians.

Truthfully, I'd encourage non-believers as well to read this study, if for nothing else than to hold Christians accountable to the teachings that we claim to believe.


High Point #4 Ignoring my Family

I've taken way too long to blog this, and my wife has informed me that I should blog about how I've ignored my family, to blog.

When she’s right she’s right.  Thank God for her gentle reminders.

Alright, so continuing our discussion on scripting: Bash scripting in Unix and Linux.  In the same way that the Windows command shell interprets a batch script to perform actions in order, so does Bash with a shell script.  What's the difference then?  The sheer volume of tools available to script and manipulate the environment is greatly increased, as are the manageability of output, the variety of conditionals, arrays, and the integration of the shell into the environment.

It's worth mentioning up front that I could spend days on the subject of any of these scripting languages.  However, for the sake of space and interest, I will keep things high level.  It's ALSO worth mentioning that Bash/shell scripting is something of a *nix requirement.  Not that you have to know shell scripting to use the OS, but it's certainly a primary part of the culture.  Bash scripting also works within OS X environments (since OS X is built on BSD), though some research and testing will be required to find the similarities between their *nix and others, which leads to my next topic…

Linux and Unix {

If you are reading this, I will assume you know what Linux and Unix are.  If not, here's some reading material for Unix and Linux.  What you may or may not be aware of is that although there is a great level of standardization, as maintained by the IEEE POSIX standards, there are also a lot of distribution-specific tools and practices.  In other words, a person can generally port work from one distribution to another, but should never assume a one-size-fits-all solution across the board.  Which is precisely why there are so many distributions: because one size does not fit all, at least not to everyone's tastes.

I also want to take a moment to point out that this is not a flaw. Although standardization is a good thing, limited options are not. There are other standards that GPL-licensed software (generally Linux) adheres to that give it reliable standards of compliance, and it maintains partial POSIX compliance as well. It's an interesting subject, and the differences are fairly minor in most cases. Do not be scared by the options; determine what it is you want and go from there if you are interested in trying.

Nowadays the user experience one has come to expect from Windows or OS X is easily found in a popular Linux distribution (Ubuntu and Fedora immediately come to mind).

     Bash Scripting {

Now let's take a look at a very simple script that I've briefly modified from Mendel Cooper's guide as an example here:


#!/bin/bash
#Log Cleanup
#Author: Daniel Belcher
#4/30/11
#Built for tutorial purposes
#Idea for script from beginner example
#in advanced bash scripting guide:
#http://tldp.org/LDP/abs/html/sha-bang.html

#Variable declaration
LDIR="/var/log/"
DATE=`date '+%m%d%y'`
RUID=0
E_USER=1

#Checking for root user, exiting with error code 1 if not root
if [ "$UID" -ne "$RUID" ]
then
        echo "User needs to be root"
        exit $E_USER
fi

#Copying the current messages and wtmp logs to an archive with date code
cp $LDIR"messages" $LDIR"messages."$DATE
cp $LDIR"wtmp" $LDIR"wtmp."$DATE

#Taking the last 10 lines from each archive and writing it to the current
#messages and wtmp to reset them.
tail -n 10 $LDIR"messages."$DATE > $LDIR"messages"
tail -n 10 $LDIR"wtmp."$DATE > $LDIR"wtmp"
sleep .5

#Finding all archived messages and wtmp files with an
#access time greater than 7 days and deleting them.
find $LDIR -iname "messages.*" -atime +7 -exec rm {} \; #could also swap -exec rm {} \; for -delete
find $LDIR -iname "wtmp.*" -atime +7 | xargs rm


Save it as a .sh file, then make it executable:

chmod +x scriptname.sh

It can then be run (as root, given the UID check) with ./scriptname.sh.

Now, same as with our batch script, let's focus on 4 things.  Hopefully some similarities begin to present themselves.

  1. Sequence
  2. Command usage
  3. Output usage
  4. Conditional logic

Sequence:

Now, similarly to the command shell, Bash reads the script line by line.  However, instead of GOTO statements, in Bash we would build functions.  (I do not have an example in this script, but the syntax is generally name_of_function () { work to perform }; I'll discuss this more with PowerShell and VBScript.)  All lines are read as if they were typed at the console.  Here you will see I also do something that we didn't do within the batch file: variable declaration (LDIR="/var/log/"), which I can then call later in the script as $LDIR.

This allows you to take a commonly used command, object, integer, or string and set it to something easily repeatable inside the script.  Variables can be declared at run time, as an argument, or produced from script output, to name a few methods of declaration.  They are integral to programming in general.  As far as sequence is concerned, as long as the variable is declared before it's called, you are fine; a quick sketch of this (and of function syntax) follows.
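A minimal illustrative sketch, not part of the log script (the function name and output are mine):

#!/bin/bash
# declare the variable once...
LDIR="/var/log/"

count_logs () {
        # ...and it is available here because it was declared before the call below
        echo "$(ls $LDIR | wc -l) entries under $LDIR"
}

count_logs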

Command usage and Output usage:

Now hopefully you are seeing that the commands available in a shell are what do most of the heavy lifting in terms of work.  Something worth learning about in command usage is piping.  Piping is taking the output of one command and using it as the input for another (to put it simply).  A visual I've always liked to use in understanding what is happening is this:

A lake brings water to your home.  A pipe exists between the lake and your faucet (very simple).  On the way to your house a series of filters is applied to weed out what you don't want from the water by the time it reaches your house.  In this case the lake is the source, or original input, and the filters are the series of commands applied to the pipe before the final output at your faucet.

So in pseudo code: lake-water | filter-fish | filter-algae | filter-dirt > glass-of-water.drk

The end result is that water is transported through that pipe and all unwanted elements are removed until we finally have our desired glass of water to drink.

In this case we use:

find $LDIR -iname "wtmp.*" -atime +7 | xargs rm

This takes the findings from the find command and passes them to xargs, which invokes rm and deletes those files.

Hopefully this is beginning to paint a clearer picture of pipe and command usage.  Shells that allow for command piping are a godsend and allow for some very elegant solutions to complex problems in simple one-liners that a user could easily alias, but that's another subject entirely.
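For instance, a one-liner like this could be dropped into .bashrc as an alias (the name and command are mine, purely illustrative):

# list the five largest files under /var/log
alias biglogs="du -a /var/log 2>/dev/null | sort -rn | head -5"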

Conditional Logic:

Now, in an identical fashion to our batch file, we are performing an IF statement.  IF statements are one of, if not the most, commonly used conditional statements.  Notice there is a very distinct difference in the syntax used here in comparison to the one we used in our batch file.  The testing method and outcome are still the same, however, so do not be deceived.  IF statements, put simply, are: if it's this, then do that.  Since conditional tests are capable of greater complexity (and will be with all the other script engines we discuss going forward), the statement needs to be terminated so the script knows when the conditions or work statements are final.  We close an IF statement in Bash by spelling if backwards: fi.

The same is true for other conditionals within Bash: they open spelled forward and close in reverse, if … fi, case … esac.  When you begin to learn what and how logic is applied in scripts, you will know how to appropriately apply the syntax; both closers are shown in the sketch below.
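An illustrative sketch of both (not part of the log script):

#!/bin/bash
# if ... fi
if [ -f /var/log/messages ]; then
        echo "messages log present"
fi

# case ... esac
case "$1" in
        start) echo "starting" ;;
        stop)  echo "stopping" ;;
        *)     echo "usage: $0 (start|stop)" ;;
esac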

So what does this Bash script do?

The script opens with a sha-bang (#!/bin/bash), which tells the shell that it's a script to be interpreted by the Bash shell found at /bin/bash.

Next it declares 4 variables: LDIR (log directory path), DATE (current date, no special characters, month-day-year), RUID (root user ID), and E_USER (error code to use if the user fails our root check).

Now we check a shell variable, $UID (the current user's ID number), and see if it differs from $RUID (-ne = not equal). If it's not the root user, we exit with the error code assigned to $E_USER.

If we pass that check, we copy the messages and wtmp files under the /var/log/ folder to the same folder as filename.currentdate.  Example: messages.050111

Now we take the last 10 lines of each archive and output them to the primary log files, messages and wtmp, effectively overwriting the previous data and truncating those files.  The > operator on the command line is a redirect (as we saw with our > nul in the batch file).

Finally, we search that folder for all files following the archived-file syntax with an access time greater than 7 days and delete them.  I've used two different formats for deletion to show the different possible ways of doing this.

     } gnitpircS hsaB

} xinU dna xuniL

In my next post I'm going to break into VBScript, which is a considerable departure from the shell-relative scripting we've looked at so far.  I will return to another form of shell scripting with PowerShell, however, so stay tuned.


Just for another example, how would this script look utilizing a function?

#!/bin/bash
#Log Cleanup
#Author: Daniel Belcher
#5/1/11

#Variable declaration
LDIR="/var/log/"
DATE=`date '+%m%d%y'`
RUID=0
E_USER=1

#Checking for root user, exiting with error code 1 if not root
if [ "$UID" -ne "$RUID" ]
then
        echo "User needs to be root"
        exit $E_USER
fi

#Function to create backup, truncate, and purge old log files.
cleanup () {
        cp $LDIR$1 $LDIR$1"."$DATE
        tail -n 10 $LDIR$1"."$DATE > $LDIR$1
        sleep .7
        find $LDIR -iname "$1.*" -atime +7 -exec rm {} \;
        return 0
}

#Calling function cleanup () and passing 1 argument

cleanup messages 
cleanup wtmp

exit 0

A lot cleaner.  Hopefully you can now see the benefits of functions, and of the loops we will examine in the future.


Stick to the Script Parts 1 – 4:

  1. Stick to the Script…
  2. Stick to the script #!/BIN/BASH…
  3. Stick to the script CreateObject(“Wscript.Shell”)…
  4. Stick to the Script PS C:> PowerShell …