Friday, August 22, 2014

Get Internet Explorer Versions from a List of Computers

If you haven't been following the tech news, last week Microsoft announced that they were revising which versions of Internet Explorer they would support on different operating systems. You can find the original article with a list of what's what here. Having just completed testing for IE10, I have a lot of work to do this next year, as the new policy goes into effect on 1/12/2016. I don't have many internal websites to worry about, but the Finance department's reliance on stupid banks that seem incapable of staying current is making things difficult. I'm hoping that Internet Explorer Enterprise Mode will help me out, but I haven't had the time to really delve into it yet.

I track my client computers' software inventories in Spiceworks, and it was easy enough to get a report from Spiceworks showing who had what, but I don't track my servers in Spiceworks. For that, I needed a script to go through a list of my servers and tell me what version of IE they were running. I found a couple of scripts online, but they weren't to my liking. Some even gave me inaccurate info. But they did point me to which registry key I should look at, so I made my own, and here it is:

#=====================BEGIN SCRIPT==========================
$Computers = Get-Content "C:\Temp\QueryInternetExplorerVersionsComputerList.txt"
$TempFile = "C:\Temp\IEVersion.csv"
$Delimiter = ","

Foreach ($Computer in $Computers){
    $Version = (Invoke-Command -ComputerName $Computer {$reg = Get-Item ('HKLM:\Software\Microsoft\Internet Explorer'); $reg.GetValue("svcVersion")})
    $Computer + $Delimiter + $Version + $Delimiter | Add-Content $TempFile
} #End Foreach

#=====================END SCRIPT==========================

So, what we're doing here is as follows:

  1. Get the list of computers from a text file
  2. Specify a TempFile for output
  3. Specify a delimiter
  4. For each computer in the list, pull the svcVersion value from the registry
  5. Combine the computer name and svcVersion, along with the delimiters, to make a CSV file, which I can then import into Excel
  6. .....
  7. Profit!
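One caveat: if a server in the list is offline, Invoke-Command will throw an error mid-run. Here's a sketch of the same query with a reachability check bolted on (the Test-Connection check and the "Unreachable" placeholder are my additions, not part of the original script):

```powershell
# Sketch: same svcVersion query as above, but skip (and note) unreachable computers
$Computers = Get-Content "C:\Temp\QueryInternetExplorerVersionsComputerList.txt"
$TempFile  = "C:\Temp\IEVersion.csv"

Foreach ($Computer in $Computers){
    If (Test-Connection -ComputerName $Computer -Count 1 -Quiet){
        $Version = Invoke-Command -ComputerName $Computer {
            (Get-Item 'HKLM:\Software\Microsoft\Internet Explorer').GetValue("svcVersion")
        }
    } Else {
        $Version = "Unreachable"
    }
    "$Computer,$Version" | Add-Content $TempFile
}
```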

Good luck and Godspeed, my fellow admins!

Windows Filtering Platform events in Security Log ...OR... Don't Screw with Advanced Audit Policy Configuration

I have 4 domain controllers, and they are set up to forward events in their Security logs to a Kiwi Syslog Server (SolarWinds). I have a 30-day rotation, and I monitor the space usage in a dedicated pie chart on my heads-up display. This past week it started getting lower. After poking around, I noticed that the file size of each day's log file was getting bigger and bigger. I opened one up in the Kiwi log viewer (which is the most horrible part of this software, by the way), and noticed a TON of messages from "Windows Filtering Platform". These were basically noting success every time an event was sent to the syslog server. Well..... thanks, I guess?

I rarely have to deal with auditing, so I consult the Google and find instructions for how to stop success auditing for Windows Filtering Platform events. It seemed straightforward enough: open your Default Domain Controllers group policy, drill down into the Advanced Audit Policy Configuration, and there are two options there dedicated to it. I enable the auditing policy and check neither Success nor Failure, so that the policy will be in effect and tell Windows auditing not to log either of them. I document my changes in the ticket I made for myself and close it out.

A few minutes later I get an email from our Netwrix AD auditing software telling me that auditing isn't configured correctly. I go into Active Directory Users and Computers, add a period to the end of a computer account's description, and run a new report. The change is noted, but there are big red warning letters telling me something with auditing isn't right. Huh. I decide to see what the overnight report says, make another benign change in ADUC, and head out for the day.

The next morning, the report comes in showing changes in AD, but it still has big red warning letters saying that auditing isn't configured correctly. I go to check my syslog file size, and it's 5KB (down from 4GB), and the only thing in there are messages over and over saying that the audit settings have been changed.

So, back to Google, and APPARENTLY, when you use any part of the Advanced Audit Policy Configuration, it supersedes ALL of the normal auditing settings. So, by simply turning off logging for the Windows Filtering Platform, I had negated all of the regular auditing settings. Super.

Now, I don't know if you've ever had to reverse a group policy setting, but it is not intuitive. Simply turning the setting off does not reverse what has been done, and a group policy restore from backup cannot reverse it either; you actually need to reverse the setting itself. Likewise, clearing the checkboxes in the advanced auditing section would not restore auditing to the way it used to be. Sadly.

Here is what I had to do to reverse this calamity:

  1. Put the GPO back to the way it was.
  2. Get on a domain controller.
  3. Find out what your Group Policy Object's PolicyID is. I used get-gpo -name "<name>".
  4. Now, go to C:\Windows\SYSVOL\Domain\Policies\<PolicyID>\Machine\Microsoft\Windows NT
  5. In this folder, delete the Audit folder.
  6. Now get on a command shell, and type the following: auditpol /get /category:*
  7. What you see is that nothing is being audited.
  8. Run a gpupdate /force, then run the command again - SUCCESS!!!
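Steps 3 through 5 can be sketched in PowerShell, if you'd rather not click around. This assumes the GroupPolicy module (from RSAT) is available on the domain controller; "<name>" is a placeholder for your GPO's name:

```powershell
# Sketch: find the GPO's PolicyID and delete the cached Audit folder (steps 3-5)
Import-Module GroupPolicy

$PolicyID  = (Get-GPO -Name "<name>").Id
# SYSVOL policy folders are named by the GUID wrapped in braces
$AuditPath = "C:\Windows\SYSVOL\Domain\Policies\{$PolicyID}\Machine\Microsoft\Windows NT\Audit"

If (Test-Path $AuditPath){
    Remove-Item $AuditPath -Recurse -Force
}

# Then verify and refresh, per steps 6-8:
# auditpol /get /category:*
# gpupdate /force
```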

While researching this I ran across the command that I should have used to get rid of those Windows Filtering Platform events, which is this:
auditpol /set /subcategory:"Filtering Platform Connection" /Success:disable

You can check the setting before and after you run that command with this command:
auditpol /get /subcategory:"Filtering Platform Connection"

Thursday, August 14, 2014

Starting a Backup Exec job from Powershell (After Veeam jobs finish)

When my weekly Veeam jobs finish, I have a post-job script that triggers a Backup Exec job that writes the Veeam backup files to tape. Yes, Veeam has built-in tape-writing abilities now, but I still have physical servers to care for and feed, and therefore Backup Exec still handles my tapes. Having just upgraded from Backup Exec 2010 to Backup Exec 2014, I learned that bemcmd.exe had been removed. This was the old, antiquated way of running commands from the CLI to control Backup Exec. Smartly, they have replaced it with a PowerShell module (BEMCLI).

At a high level, here are the commands that your script needs to start a Backup Exec job:
#This command loads the module
Import-Module "C:\program files\symantec\backup exec\modules\bemcli\bemcli"

#This command starts the job, with inline confirmation
Start-BEJob -InputObject "YourBackupExecJobName" -Confirm:$false

For your entertainment, I'll now paste in my entire post-job script, where I have built in a couple of handy features: namely, an email telling me whether the tape job has started or not, and an if statement that only starts the tape job if there are no failed backup jobs. If there's a failure, this gives me an opportunity to fix it before starting the tape-writing job.

Further commentary within this Veeam post-job script:

#-------------------------BEGIN SCRIPT------------------------
#Add the Veeam snap-in and the Backup Exec module
Add-PSSnapin VeeamPSSnapin
Import-Module "C:\program files\symantec\backup exec\modules\bemcli\bemcli"

#Set the initial counter, used to count Veeam job failures, used later in the IF statement
$Result = 0

#Email variables
$To = ""
$From = ""
$SMTPServer = ""

#Get all of the weekly Veeam jobs. My weekly jobs all have a "Weekly" prefix
$Statuses = (get-vbrjob | where {$_.Name -like "Weekly*"})

#Go through each Veeam job and increment $Result if a job has an unsuccessful status
#I'm not counting warnings, which can be triggered when a VM has been resized and Change Block Tracking has been reset
Foreach ($Job in $Statuses){
    If ($Job.GetLastResult() -notlike "Success" -and $Job.GetLastResult() -notlike "Warning"){
        $Result++
    } #End If
} #End Foreach

#If the $Result equals 0, all the jobs were successful. Start the tape copy and email me
If ($Result -eq 0){
    #The following line uses the old (Backup Exec 2010) method for starting the tape job
    #start-process "C:\Program Files\Symantec\Backup Exec\bemcmd.exe" -ArgumentList '-o1 -j"MyTapeJob"'
    #This next line is the new way of calling the tape job in Backup Exec
    Start-BEJob -InputObject "MyTapeJob" -Confirm:$false
    $Subject = "Veeam to Tape Job Started"
    $Body = "The Veeam to Tape job has started"
    Send-MailMessage -To $To -Subject $Subject -From $From -Body $Body -SmtpServer $SMTPServer
} #End If

#If the result does not equal 0, email me
If ($Result -gt 0){
    $Body = "The Veeam to Tape job has NOT started due to errors"
    $Subject = "VEEAM BACKUP ERRORS: Veeam to Tape Job NOT Started"
    Send-MailMessage -To $To -Subject $Subject -From $From -Body $Body -SmtpServer $SMTPServer
} #End If

#-------------------------END SCRIPT------------------------

Tuesday, August 12, 2014

New Backup Exec 2014 Upgrade

I have a new and now working Backup Exec 2014 installation! Keep in mind that I skipped 2012 after all of the horror stories, so things that I think are new might not be to someone who followed the regular upgrade cycles. I have a simple setup; I just have the one server, and with it I back up maybe a dozen physical boxes, and write my Veeam backup files to tape. I also don't use many of the advanced features, nor do I know much about them. Backup Exec is good at writing tapes and backing up physical boxes. I don't see them surpassing Veeam anytime soon. My previous installation was up-to-date Backup Exec 2010 R3, and worked reliably.

I can report that running Backup Exec 2014 along with Veeam v7 on the same server has not caused any issues. The only thing I did have to do was set some of the Veeam services to delayed start so that Backup Exec could grab control of the tape library. Namely, the Catalog Data Service, the Enterprise Manager, the Restful API, and the Veeam Backup Service itself.
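If you want to script that delayed-start change instead of clicking through services.msc, sc.exe handles it. A sketch (the service short names below are my guesses, not confirmed; list yours first and substitute the real names):

```powershell
# Find the actual short names of the Veeam services on your server
Get-Service | Where-Object { $_.DisplayName -like "Veeam*" } | Select-Object Name, DisplayName

# Set a service to Automatic (Delayed Start) so Backup Exec grabs the
# tape library first. Note: sc.exe requires the space after "start=".
sc.exe config "VeeamBackupSvc" start= delayed-auto
```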

Anyway, enough about Veeam....

After doing a little bit of reading, mostly concerning the upgrade paths and methods, I decided to just pull the trigger and sink or swim. I'm happy to report that the installation of Backup Exec 2014 went very smoothly. All of my jobs and media sets were moved over, though I did need to remove and re-add a couple of selections from the jobs. There were problems with some of the jobs' system state selections and I had some failures, but there are problems with any upgrade, so I dealt with it.

One thing I didn't like was that I didn't have selection lists to change; I had to change the selections for each server in the job. One edit for the full job that runs weekly, and then another edit for the daily differential job. I liked the previous method better. In Backup Exec 2010 you specified what to back up in a selection list, then applied a policy which backed up those selections in a certain way; the policy is where you set up the schedule and the backup type. You applied the policy to the selection list and that made your jobs. Of course, you could create an entire job directly, but if you're taking care of multiple servers, the selection/policy combo was the way to go. Now, in the 2014 version, there aren't policies that I could find. I didn't look too hard, though; I edited the jobs directly.

Seriously, the hardest thing about this installation was getting the license keys straightened out. I just got that figured out today after a 45 minute call with support!!!

One laughable thing about my journey: I attended (for a few minutes, anyway) a Symantec webinar "unveiling" 2014 and showcasing some of its new features. The video was garbage. Conclusion: overall, good job! Work on your presentation and licensing, though, Symantec!