Friday, February 28, 2014

Listing Disabled Active Directory Users

I generate this report once a month, just to make sure we keep AD cleaned up. With people coming and going, and with every case treated differently, it's sometimes hard to remember when to delete what. This is just a monthly report to make sure we aren't leaving anything out there.

#---------------BEGIN SCRIPT---------------------

#Import the Active Directory module. Unless you're running this on a domain controller, you'll need RSAT installed.
import-module activedirectory

#Find disabled AD accounts. Here, I've excluded some builtin accounts, and an OU.
#Return only the name
$output = (Search-ADAccount -AccountDisabled -UsersOnly | where {
($_.name -cnotlike "System*") -and `
($_.name -ne "Guest") -and `
($_.name -ne "krbtgt") -and `
($_.name -notlike "FederatedEmail*") -and `
($_.name -notlike "DiscoverySearchMailbox*") -and `
($_.DistinguishedName -notlike "*ObjectsToDelete*")
} | select Name)

#Build the body of the email by taking the output above and converting it to a string
$body = ($output | Out-String)

#Send me the email
Send-MailMessage -To itreporting@contoso.com -Subject "PS Report - Disabled Active Directory Users" -Body $body -From "helpdesk@contoso.com" -SmtpServer "mailserver.contoso.com"

#---------------END SCRIPT---------------------
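
A quick note on the "once a month" part: I just let Task Scheduler kick it off. Something like this should register the job, assuming you've saved the script to a path like C:\Scripts\Get-DisabledADUsers.ps1 (the path, task name, and run time here are examples, not my actual setup):

#Register a monthly scheduled task to run the report (path, task name, and run time are examples)
schtasks /create /tn "PS Report - Disabled AD Users" /tr "powershell.exe -ExecutionPolicy Bypass -File C:\Scripts\Get-DisabledADUsers.ps1" /sc monthly /d 1 /st 06:00 /ru SYSTEM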

Thursday, February 27, 2014

Remotely Find What User is Logged In - Put This in Your Profile!

It was happening quite often that I would attempt to shut down or reboot a system, only to find that "other users" were logged in. I found the function below on the internet and adapted it a bit, and somewhere along the way I lost the attribution info. So if this is yours, I'm sorry for not crediting you.

If you find it useful, I would recommend adding it to your profile so it loads the function every time you open Powershell. Do this by pasting it into C:\Users\username\Documents\WindowsPowerShell\Microsoft.PowerShell_profile.ps1.

Syntax is Get-LoggedOnUser <ComputerName>

#---------------------BEGIN SCRIPT---------------------------

#This loads the function into your session so you can use Get-LoggedOnUser "Servername"

function global:Get-LoggedOnUser {
#Requires -Version 2.0            
[CmdletBinding()]            
 Param             
   (                       
    [Parameter(Mandatory=$true,
               Position=0,                          
               ValueFromPipeline=$true,            
               ValueFromPipelineByPropertyName=$true)]            
    [String[]]$ComputerName
   )#End Param

Begin            
{            
 Write-Host "`n Checking Users . . . "
 $i = 0            
}#Begin          
Process            
{
    $ComputerName | Foreach-object {
    $Computer = $_
    try
        {
            $processinfo = @(Get-WmiObject -class win32_process -ComputerName $Computer -EA "Stop")
                if ($processinfo)
                {    
                    $processinfo | Foreach-Object {$_.GetOwner().User} | 
                    Where-Object {$_ -ne "NETWORK SERVICE" -and $_ -ne "LOCAL SERVICE" -and $_ -ne "SYSTEM"} |
                    Sort-Object -Unique |
                    ForEach-Object { New-Object psobject -Property @{Computer=$Computer;LoggedOn=$_} } | 
                    Select-Object Computer,LoggedOn
                }#If
        }
    catch
        {
            "Cannot find any processes running on $computer" | Out-Host
        }
     }#Foreach-Object (ComputerName)
            
}#Process
End
{

}#End

}#Get-LoggedOnUser

#--------------------------END SCRIPT------------------------
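
Once it's loaded, checking a box (or several) before a reboot looks like this (server names are just examples):

#Check a single server
Get-LoggedOnUser FILESERVER01

#Or check several at once
Get-LoggedOnUser FILESERVER01,FILESERVER02,PRINTSERVER01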

Tuesday, February 25, 2014

Password Expiration Reminder Emails with Powershell

We used to pay for something that did this, but querying active directory and emailing people based on the results seemed so easy to script!

We've been using this for around two months without any issues.

A prerequisite for this is to download and install Quest's Active Directory Powershell Commands Module, which you can get here.

Please see my comments in the code for further information; here's the script:

#--------------------BEGIN SCRIPT----------------------

#Load the Quest snapin
Add-PSSnapin Quest.ActiveRoles.ADManagement

#Main Variables
$Today = Get-Date
$Outfile = "C:\Temp\Outfile.txt"

#Grab User Accounts. You'll want to specify a maximum because the snapin defaults to 1000.
#Also, you can omit OUs like I did for our "Recycle Bin" OU
$Users = get-qaduser -Sizelimit 2000 | where {$_.DN -notlike "*OU=ObjectsToDelete*"}

#Alter the user list so I can work with it. Basically, I'm taking the list of usernames, getting rid of any blanks, and then converting them to strings for later
$Selection = ($Users | select UserPrincipalName)
$Names = ($Selection | where {($_.UserPrincipalName)} | % {(($_.UserPrincipalName).tostring())})

#Go through the user list and email everyone whose password expires within the next 14 days
Foreach ($UserPrincipalName in $Names){
$QueryPiece1 = (Get-QADUser $UserPrincipalName)
$EmailAddress = ($QueryPiece1.Email)
$Date1 = ($QueryPiece1.PasswordExpires)
$Difference = (($Date1.subtract($Today)).days)

#This section sends emails IF the password expires within the next 14 days
If (($Difference -le 14) -and ($Difference -gt 0)){
#Writes to report file
$Output = ($EmailAddress + "," + $Date1 + "," + $Difference)
$Output | Add-Content $outfile

#Builds body of the email that is sent to a user
#Put together email body
$Body1 = "Your Windows password is expiring in $Difference days.`r`n"
$Body2 = "The next time you are logged in to a computer, press Ctrl-Alt-Delete and change your password.`r`n"
$Body3 = "PLEASE REPLY TO THIS EMAIL WITH YOUR EXTENSION IF YOU NEED HELP, OR CALLTHE HELPDESK"
$BodyCombined = ($body1 + $body2 + "`r`n" + $body3)

#Actually Sending the emails to users
Send-Mailmessage -from helpdesk@contoso.com -to $EmailAddress -subject "From the Contoso IT Department - Please Read" -smtpserver mailserver.contoso.com -body $BodyCombined
} #End If
} #End Foreach

#Send IT a report on users that were emailed, then delete the temp file we attached
Send-Mailmessage -from helpdesk@contoso.com -to itreporting@contoso.com -subject "PS Report - Password Reset Notifications" -smtpserver mailserver.contoso.com -body "Emails Sent to (See Attached)" -Attachments $outfile
Remove-Item $OutFile -Force

#--------------------END SCRIPT----------------------

That wasn't that difficult to create, and it saved my department some money. I know the formatting is a little wonky there because of the nested IF statements, but if you paste the code into a text editor it should wrap properly and be more readable. I'd like to sit and play with it some more, but I have a family and a job and stuff.
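
If you'd rather skip the Quest snapin dependency, a roughly equivalent query is possible with the built-in ActiveDirectory module and the msDS-UserPasswordExpiryTimeComputed attribute. Here's a minimal sketch of just the expiry-date piece (I haven't run this version in production; treat it as a starting point):

#A sketch using the built-in ActiveDirectory module instead of the Quest snapin
Import-Module ActiveDirectory
$Users = Get-ADUser -Filter {Enabled -eq $true} -Properties "msDS-UserPasswordExpiryTimeComputed", EmailAddress, PasswordNeverExpires |
    Where-Object {$_.PasswordNeverExpires -eq $false}
Foreach ($User in $Users){
    $ExpiryDate = [datetime]::FromFileTime($User."msDS-UserPasswordExpiryTimeComputed")
    $Difference = ($ExpiryDate - (Get-Date)).Days
    If (($Difference -le 14) -and ($Difference -gt 0)){
        #Same Send-MailMessage call as above, sent to $User.EmailAddress
    }
}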


Pulling Information from SQL Server with Powershell, for Documentation Purposes

I will preface this by saying that I'm an accidental DBA, and probably not a very good one. I managed to get transaction log backups working properly, which I posted about last week, but I'm constantly discovering things that I didn't know I didn't know.

I have a script that I run once a week just to keep current documentation on my SQL servers. To work with a SQL server in Powershell (and my SQL Servers are SQL 2008R2 running on top of Windows 2008R2), you first need to have SQL Server Management Studio installed. This makes the Powershell modules available to you. After that, open a Powershell prompt and load the modules like so:

add-pssnapin SqlServerCmdletSnapin100
add-pssnapin SqlServerProviderSnapin100

Next, set up the database connection:

[System.Reflection.Assembly]::LoadWithPartialName('Microsoft.SqlServer.SMO')
$SQLConnection = New-Object ('Microsoft.SqlServer.Management.Smo.Server') "MySQLServerName"

Now, you can explore the TONS of information there is by doing this:
$SQLConnection | get-member

There's a lot to probe in this list. Every item with a "Property" membertype yields information. Usually, I like to capture lists like this so I can go through the properties one at a time to find useful information. I would do this like so:
$SQLConnection | Get-Member | where-object {$_.Membertype -like "Property"} | select-object Name | out-file c:\temp\SQL_Properties.txt

Let's break this down a bit:
  • $SQLConnection ***The connection object with the information you got from the SQL Server.
  • Get-Member ***List all properties and methods available on $SQLConnection.
  • where-object {$_.Membertype -like "Property"} ***Only give me results for items that have "Property" as their membertype.
  • select-object Name ***I only want the name returned. 
  • out-file c:\temp\SQL_Properties.txt ***Send the output to this file.
Now, let's look at just one of those properties: Databases

Running the following command will give you a lot of information regarding each database on the SQL Server:

$SQLConnection.databases

There's a lot to digest, especially if you have more than one database on there. Using the technique above, I'll break that down to include the pieces I want, like so:

$SQLConnection.databases | Get-Member | where-object {$_.Membertype -like "Property"} | select-object Name | out-file c:\temp\SQL_Database_Properties.txt

Let's just get a couple of interesting properties:
  • Name
  • Owner
  • RecoveryModel
  • Size
  • LastBackupDate
  • LastLogBackupDate
  • SpaceAvailable
  • CompatibilityLevel
$SQLConnection.databases | select Name, Owner, RecoveryModel, Size, LastBackupDate, LastLogBackupDate, SpaceAvailable, CompatibilityLevel

Now that you've got the hang of this, you can pick what you need to know and have it mailed to you via the send-mailmessage command or saved to a file with out-file/export-csv. Automate it, and you'll always have up-to-date documentation!
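
Here's roughly what that automated weekly run could look like, reusing the mail settings from my other scripts (the output path is an example):

#A sketch of a weekly documentation run - export the interesting properties and mail the CSV
$Report = "C:\Temp\SQL_Documentation.csv"
$SQLConnection.Databases |
    Select-Object Name, Owner, RecoveryModel, Size, LastBackupDate, LastLogBackupDate, SpaceAvailable, CompatibilityLevel |
    Export-Csv -Path $Report -NoTypeInformation
Send-MailMessage -To itreporting@contoso.com -From helpdesk@contoso.com -SmtpServer mailserver.contoso.com -Subject "PS Report - SQL Database Documentation" -Body "See Attached" -Attachments $Report
Remove-Item $Report -Force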


Monday, February 24, 2014

Backing Up GPOs, and Alerting for Changes

So the problems this script addresses are:
  1. Backup the group policies, on-site and off.
  2. Give me a report on which policies do what.
  3. Point out that a group policy has changed, is new, or has been deleted.
I should point out that I am using Windows Server 2008 R2, and Powershell v3 is installed on the computer that runs this script on a daily basis.

You can read about the Powershell grouppolicy module's commands at Technet.

Some prep:
  • Test that you can load the group policy module, which ships with the Group Policy Management Console/RSAT tools. To do this, in Powershell type import-module grouppolicy.
  • Wherever you put this, make sure you have a "C:\GPOReporting\Backup" folder structure
  • The script also uses C:\Temp to store XML output of the GPOs for comparisons. 
  • You should modify the section under "#Mirror GPO Backups to DR using robocopy" to point to your off-site DR location - also make sure the folder structure is there.
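
Since the whole point of these backups is being able to restore, here's what pulling a GPO back out of one of the dated backup folders would look like (the GPO name and folder are examples):

#Restore an existing GPO from a dated backup folder (names are examples)
Import-Module GroupPolicy
Restore-GPO -Name "Default Domain Policy" -Path "C:\GPOReporting\Backup\22814"

#If the GPO was deleted outright, import the backup into a new GPO instead
Import-GPO -BackupGpoName "Default Domain Policy" -TargetName "Default Domain Policy" -Path "C:\GPOReporting\Backup\22814" -CreateIfNeeded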

Here is the script. I went through and commented it to explain what it's doing:

#Get Date for today and yesterday
$DateToday = (get-date -Format Mdy).ToString()
$DateYesterday = (get-date).AddDays(-1)
$ReportDate = (Get-date -Format Mdy).ToString()
$OldReportDate = (Get-Date ((Get-date).AddDays(-1)) -Format Mdy).ToString()
$OldestReportDate = (Get-Date ((Get-date).AddDays(-2)) -Format Mdy).ToString()
$ChangesToGPODivider = "`r`n=================================================================`r`n"
$ChangesToGPOCounter = 0

#Import the Group Policy module for Powershell
Import-Module GroupPolicy

#Specify output Files and Paths for later use
$GPOReport = "c:\temp\GPOReport.html"
$CoreBackupLocation = "C:\GPOReporting\Backup"
$BackupLocation = "C:\GPOReporting\Backup\$DateToday"
$GPOOffsiteBackupLog = "C:\Temp\GPO_FPDR_BackupCopy.log"
$BackupReport = "C:\Temp\BackupReport_$DateToday.txt"

#Get the GPO Report
get-gporeport -all -ReportType HTML -path $GPOReport

#Get a list of all GPOs
$GPOList = get-gpo -all | select displayname | %{$_.Displayname}

#Make the new folder for backups based on the date
new-item -Path $BackupLocation -ItemType directory | out-null

#Back up the GPOs, redirecting output to the Backup Report
Foreach ($GPO in $GPOList){
Backup-GPO -Name $GPO -Path $BackupLocation >> $BackupReport
} #End Foreach

#Get Rid of GPO backups over 7 days old
$Folders = (Get-childitem $CoreBackupLocation | where {$_.PSIsContainer})
Foreach ($Folder in $Folders){
$FolderAge = (((Get-Date) - $Folder.creationtime).totalhours)
If ($FolderAge -gt 168)
{remove-item $Folder.FullName -recurse -Force}
} #End Foreach

#Mirror GPO Backups to DR using robocopy
start-process "c:\robocopy.exe" -ArgumentList 'C:\GPOReporting\Backup \\BackupServer\c$\GPOBackups /MIR /R:10 /W:5 /NP /LOG:C:\Temp\GPO_FPDR_BackupCopy.log /NS /NC /NFL /NDL' -wait

#This section Generates the Reports
Foreach ($GPO in $GPOList){
$PathToReport = "C:\Temp\" + "$GPO" + "_" + "$ReportDate" + ".xml"
Get-GPOReport -Name $GPO -ReportType XML -path $PathToReport
} #End Foreach

#This section creates variables for testing for created and deleted GPOs
$ReplacementString = "_$OldReportDate.xml"
$YesterdaysFiles = get-childitem c:\temp -Filter *$oldreportdate.xml | %{(($_.Name).replace("$ReplacementString",""))}
$SubjectAdditionCreated = ""
$SubjectAdditionDeleted = ""

#This section is for testing for created and deleted GPOs.
$GPODiff = compare-object -referenceObject $YesterdaysFiles -DifferenceObject $GPOList
$GPODiff | %{
If($_.SideIndicator -like "=>"){ #New GPO
$NewGPO = $_
$NewGPOLabel = ($NewGPO.InputObject)
$ChangesToGPO += $ChangesToGPODivider
$ChangesToGPO += "NEW GPO: $NewGPOLabel`r`n"
$ChangesToGPO += $ChangesToGPODivider
$ChangesToGPOCounter++
$SubjectAdditionCreated += " - New GPO Detected: $NewGPOLabel"
} #End if
ElseIf($_.SideIndicator -like "<="){ #Deleted GPO
$DeletedGPO = $_
$DeletedGPOLabel = ($DeletedGPO.InputObject)
$ChangesToGPO += $ChangesToGPODivider
$ChangesToGPO += "DELETED GPO: $DeletedGPOLabel`r`n"
$ChangesToGPO += $ChangesToGPODivider
$ChangesToGPOCounter++
$SubjectAdditionDeleted += " - GPO Deleted: $DeletedGPOLabel"
} #End If
Else{Break}
} #End Foreach

#This Section is for comparing GPO Settings. It takes the GPOReport XML file that was created yesterday and compares it to the GPOReport output we just created.
Foreach ($GPO in $GPOList){
$PathToReport = "C:\Temp\" + "$GPO" + "_" + "$ReportDate" + ".xml"
$PathToReportYesterday = "C:\Temp\" + "$GPO" + "_" + "$OldReportDate" + ".xml"
$PathToReportOldest = "C:\Temp\" + "$GPO" + "_" + "$OldestReportDate" + ".xml"
$Comparison = $null
#GPO Comparison Changes
$Comparison = (Compare-Object -ReferenceObject (Get-Content $PathToReportYesterday) -DifferenceObject (Get-Content $PathToReport) -CaseSensitive) | where {$_.InputObject -notlike "*<ReadTime>*"} | fl
If ($Comparison -eq $null){
$ChangesToGPO += "NO GPO CHANGES to $GPO`r`n"
} #End If
Else{
$ComparisonString = $Comparison | out-string
$ChangesToGPOCounter++
$ChangesToGPO += $ChangesToGPODivider
$ChangesToGPO += "$GPO HAS CHANGED: "
$ChangesToGPO += $ChangesToGPODivider
$ChangesToGPO += $ComparisonString
$ChangesToGPO += $ChangesToGPODivider
} #End Else
#Delete the Oldest Report, which we no longer need
Remove-Item $PathToReportOldest -EA SilentlyContinue
} #End Foreach
#For Testing: $ChangesToGPO = $null

#List Any Unlinked GPOs
$UnlinkedGPOs = Get-GPO -All | %{If ($_ | Get-GPOReport -ReportType XML | Select-String -NotMatch "<LinksTo>"){Write-Output $_.DisplayName;Write-Output "<BR>"}}


#E-mail Configuration
$SMTPServer = "mailserver.contoso.com"
$From = "helpdesk@contoso.com"
$Subject = "PS Report - GPO: Backup/Change Report - $ChangesToGPOCounter Changes Detected $SubjectAdditionCreated $SubjectAdditionDeleted"

$Body = $ChangesToGPO
$Body += "`r`n`r`n"
$Body += $ChangesToGPODivider
$Body += "UNLINKED GPOs: `r`n"
$Body += $UnlinkedGPOs
$Body = ($Body.replace("`r`n","<BR>"))

Send-MailMessage -from $From -to reporting@contoso.com -subject $Subject -smtpserver $SMTPServer -body $Body -BodyAsHTML -Attachments $GPOReport,$GPOOffsiteBackupLog,$BackupReport

#Delete Unneeded Output Files
remove-item $GPOReport
remove-item $BackupReport
remove-item $GPOOffsiteBackupLog

Friday, February 21, 2014

Where are my users mapping their My Documents folders?

We don't have folder redirection enabled in my domain. Long-term, we'd like to get this put in place, but at some point in time some users' folders were manually redirected. Maybe.

Furthermore, in my project of eradicating Windows XP from the network, I have run across instances where users were saving things to their "My Documents" folder (which isn't redirected) instead of to their mapped personal drive (which lives on the file server that gets backed up). It's a miracle that people haven't lost more data!

What I've done is create a Powershell login script (my first) that collects some key pieces of information and writes a file to a wide-open share.

#Purpose: Run via Group Policy and give IT data on who has folder redirection turned on, as well as how much data our users have stored locally in their "My Docs" folder.

#Get Computername
$ComputerName = $env:COMPUTERNAME

#Get "My Documents" folder path
$Path = [environment]::GetFolderPath([environment+SpecialFolder]::MyDocuments) 

#Define marker file, which prevents this script from running for that user if found
$MarkerFile = "$Path\DELETEME.TXT"
$MarkerText = "TEST"

#Destination Folder to accumulate these files
$Destination = "\\NASDevice\OpenShare\FolderRedir\"

#If the Marker file doesn't exist.....
If ((Test-Path $MarkerFile) -eq $False){
#Make a MarkerFile so this won't run again
$MarkerText | out-file $MarkerFile

#Get Size of My Docs folder, and convert output to GB
$TotalSize = Get-ChildItem -path $path -recurse -errorAction "SilentlyContinue" | Measure-Object -property length -sum
$SizeOutput = "{0:n6}" -f ($TotalSize.sum / 1GB) + " GigaBytes"

#Put the data together
$SubjectInit = "$ComputerName - $Path - $SizeOutput"
$SubjectInit = $SubjectInit.replace("\","-").replace(":",".")

#Create the file with the data, then move the file to the share
New-Item -name $SubjectInit -path $path -itemtype "file"
Move-Item "$path\$SubjectInit" $Destination

} #End If

What this process ultimately does is place a file on an open share that has a filename like this:
COMPUTERNAME - DocFolder - SizeOfFiles

The IF statement checks for the existence of the marker file. If the file doesn't exist, the process continues and creates the marker file; otherwise, nothing happens. This way, the login script only creates a file on the open share once per user.

The part that I had to research was how to run a Powershell script on logon. Thankfully, The Scripting Guy had a very helpful post. I even learned about WMI filters along the way!

I'm not going to regurgitate his steps to you. They are clear-cut enough.

Since I'm going to create this marker-file within users' MyDocs folders, when I've collected all the data I need and it's time to unlink this GPO, I'll need to alter the script to delete the marker-file if it exists.
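
The cleanup version only needs a few lines; something like this should do it:

#Cleanup version: remove the marker file if it exists
$Path = [environment]::GetFolderPath([environment+SpecialFolder]::MyDocuments)
$MarkerFile = "$Path\DELETEME.TXT"
If ((Test-Path $MarkerFile) -eq $True){
Remove-Item $MarkerFile -Force
} #End If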

Another tidbit: Powershell logon scripts only run on Windows 7, and not Windows XP logins. I don't know about Vista; we don't run it.

Thursday, February 20, 2014

MS SQL Server Backups - Going from Simple to Full Recovery Model

We used to just back up our databases nightly, but we .... ahem... had an issue... and lost some data, so we decided to start doing transaction log and differential backups on a few key databases.

After we identified which databases we needed to protect, we worked out the following process on a test SQL server using the almighty AdventureWorks database. Microsoft distributes this fully populated test database for SQL admins and devs to play with.

You can find the option to switch the recovery model by right-clicking on the database, selecting properties, and then choosing options.

Important: when switching from the Simple to the Full recovery model, you have to perform a full backup before transaction log backups can begin (the full backup is what starts the log chain). If you don't make this switch prior to setting up the subplans (see below), you will not be able to set up the transaction log backup subplan. Also, after switching to transaction log backups, you should have a good process in place to monitor free space wherever your backup files are stored. But you already have that, right?
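
I made the switch through the SSMS GUI, but if you'd rather script it, something along these lines should work using the SQL snapins from my earlier post (the server, database, and backup path are examples):

#Switch a database to the Full recovery model, then take the full backup that starts the log chain
Add-PSSnapin SqlServerCmdletSnapin100
Invoke-Sqlcmd -ServerInstance "SQLSERVER1" -Query "ALTER DATABASE [ImportantDatabase] SET RECOVERY FULL;"
Invoke-Sqlcmd -ServerInstance "SQLSERVER1" -Query "BACKUP DATABASE [ImportantDatabase] TO DISK = N'D:\DB_Backups\ImportantDatabase\ImportantDatabase_Full.bak';" -QueryTimeout 0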

The schedule that worked for us:
1. Full Backup nightly at 12am.
2. Differential backup at noon and again at 6pm.
3. Transaction Log Backup every 2 hours, starting at 1AM (so on all odd hours: 1,3,5,7, etc).

In SQL Server Management Studio (SSMS), I created a maintenance plan with four subplans:
Full Backup
Diff at noon
TLog every 2 hours
Diff at EOD

For each subplan, I chose the appropriate backup type (full, diff, tlog) and chose the databases. I selected the radio button to "Create a backup file for every database", and chose the D:\DB_Backups path. I then checked the "Verify backup integrity" option and set the drop-down to "Compress Backup".

Set the schedule as required, of course.

Now, we need to take a little detour to create a SQL Server Agent Job. This job has one step:
C:\Windows\System32\WindowsPowerShell\v1.0\PowerShell.exe -File C:\PS\DB_Backup_DR_Mirror.ps1

OMG A POWERSHELL SCRIPT - WHO WOULD HAVE THOUGHT?

Go back to your maintenance plan, and for each subplan you created, drag an "Execute SQL Server Agent Job Task" from the "Maintenance Plan Tasks" toolbar over into the process flow area. Connect the green arrow from your Backup database task to this box, and then set the box to run the job you created to run the Powershell script.

Now, you may be wondering what this Powershell script does!

Its job is to copy the backup files created off-SAN and off-site. My SQL server lives in a VMware environment on a Dell SAN. If the SAN died, I want access to my backups. If the building got hit by an airplane, I want off-site copies. Typical DR.

Here's the script:


#Initial Variables that you will modify on a per-case basis
$SQLServerName = "SQLSERVER1"
$DBName = "ImportantDatabase"

#Variables that are generated automatically
$LocalBackupPath = "D:\DB_Backups\$DBName\*.*"
$OffSANCopyLocation = "\\OnsiteNAS\SQLBackup\$SQLServerName\$DBName"
$OffSiteCopyLocation = "\\OffsiteNAS\SQLBackup\$SQLServerName\$DBName"

#Copy to Local NAS (Off-SAN)
Copy-Item -Path $LocalBackupPath -Destination $OffSANCopyLocation -Force

#Copy to Remote NAS (Off-Site)
Copy-Item -Path $LocalBackupPath -Destination $OffSiteCopyLocation -Force

#Remove copies on the SQL Server itself
Remove-Item -Path $LocalBackupPath -Force

#For off-site location, get the creationtime and delete anything older than 7 days
$Files = (Get-childitem $OffSiteCopyLocation)
Foreach ($file in $files){
$FileAge = ((get-date) - ($file.creationtime)).totaldays
If ($FileAge -gt 7){
remove-item $File.FullName -Force
} #End If
} #End Foreach

#For off-SAN location, get the creationtime and delete anything older than 7 days
$Files = (Get-childitem $OffSANCopyLocation)
Foreach ($file in $files){
$FileAge = ((get-date) - ($file.creationtime)).totaldays
If ($FileAge -gt 7){
remove-item $File.FullName -Force
} #End If
} #End Foreach



There at the end, I'm deleting copies older than 7 days. Since I'm writing to tape on a weekly basis, if someone needs an old copy of the database it's easy enough to pull.

Wednesday, February 19, 2014

PSA: Windows 2003 End-of-Life

I know everyone is in a furor over the impending Windows XP end-of-life, but I thought it might be beneficial to point out that Windows Server 2003 goes end-of-life next summer, on 7/14/15.

NOW is the time to start planning on getting rid of these servers, as there is quite a bit of coordination and testing that goes into relocating a server application.

Other EOL dates of interest include:
  • SQL Server 2005: 4/12/2016
  • SQL Server 2008: 7/9/2019
  • SQL Server 2008 R2: 7/9/2019
  • Windows Server 2008: 1/14/2020
  • Windows Server 2008 R2: 1/14/2020
  • Exchange Server 2010: 1/14/2020
  • Windows 7: 1/14/2020
  • Office 2010: 10/13/2020

You can check other products at the Microsoft Product Lifecycle Search engine.

Tuesday, February 18, 2014

Editing the "Current User" registry hive from inside of a non-admin account

It came to pass that I needed to edit the HKEY_CURRENT_USER Windows registry hive for a particular user the other day. Because this user was not a member of the local admins group, I could not edit the keys I needed to.

I ended up following this method:

1. As the user, log out
2. Log in with an admin account
3. Open Regedit
4. Click on HKEY_USERS
5. Click the File dropdown menu and select "Load Hive"
6. Navigate to C:\Users\<username> (This was a Windows 7 box)
7. In the file name field, enter ntuser.dat. You will not see this file, but it's there.
8. Click Open
9. Give the key a temporary name; I named mine "Coatl"
10. Drill down into Coatl and change what you need to.
11. IMPORTANT: When you are finished, you need to click on the temporary hive, then use the File menu dropdown to "Unload" the hive. Failure to complete this step will result in the user being logged into the machine with a temporary profile.
12. Log out and then back in as the user.
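
If you'd rather do this from the command line, reg.exe can handle the same load/edit/unload dance (the username, key, and value below are placeholders, not the actual change I made):

#Run from an elevated prompt while the user is logged off
reg load HKU\Coatl "C:\Users\someuser\ntuser.dat"

#Make whatever change you need; this key and value are just examples
reg add "HKU\Coatl\Software\ExampleVendor\ExampleApp" /v ExampleValue /t REG_DWORD /d 1 /f

#IMPORTANT: unload the hive when finished, or the user gets a temporary profile
reg unload HKU\Coatl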

Monday, February 17, 2014

Getting Your Email out of the Barracuda Message Archiver

We run a Barracuda Message Archiver 450. I really like the device, but we were looking at alternatives, and I needed a way to test possible solutions with real mail. The question then became, "How do we get our email out of the Barracuda?"

Basically, there is no out-of-the-box solution to this; Barracuda does not have a tool.

So, I wrote my own using Powershell. :)

I have to say that this code could stand to be cleaned up. I had to use some pretty circuitous methods to get it to work correctly.

Prior to running this:
1. You need to copy all of the files from the SMB share on the Barracuda Message Archiver somewhere else. I mapped this as U drive.
2. You need a working folder with gobs of space. I mapped this as my V drive.
3. Install 7zip

Basically, you have a bunch of .zip files, and you extract everything out of them. The extracted files have no extensions. What I got from support was that these files are either emails themselves or gzipped archives. So I use 7zip to try to decompress each one: if 7zip returns an exit code of 2, I know it isn't a valid archive and it gets an .eml extension appended; if it decompresses cleanly, the resulting file gets the .eml extension tacked onto the end instead.

I use a random number to create output folders to hold all of the many eml files. Some zips had upwards of 35,000 emails in them.

I KNOW I could have done a better job commenting this code. I'm almost embarrassed to put it out here, but I really wished someone had given me some direction, so here it is. If you have a need for this script, create a copy of your archives and work through the code a chunk at a time to see what's going on, so that you don't put your production archives at risk. Remember that you can open .eml files with Notepad. :)

I will use commented lines within the script for the remainder of this article.
The Script:

#Specify source and working folders, as well as report file variables
$Source = "U:\1"
$WorkingFolder = "V:\Extract"
$ReportFile = "C:\temp\MailArchReport.txt"
$ReportFileSpacer = "`r`n`r`n===========================================================================`r`n`r`n"

#Ask for starting file number and ending file number
[int]$StartingZip = Read-Host "Enter Number of First Zip File to Process"
[int]$EndingZip = Read-Host "Enter Number of Last Zip File to Process"

#Last chance to get out
$LastChanceAnswer = Read-Host "Are you sure you want to continue processing all files between $StartingZip.zip and $EndingZip.zip? (y or n)"
If ($LastChanceAnswer -ne "y"){
Break
} #End If

#Initialize the array to hold all expected zip file names
$ZipFileSet = @()

#Initialize Report File with the starting date and time
Get-Date | Add-Content $ReportFile

#Counter to populate the zip file names array
For ($i = $StartingZip; $i -le $EndingZip; $i++){
$StartingZipStr = "$i.zip"
$ZipFileSet = $ZipFileSet + $StartingZipStr
} #End For

#Add record for zip files processed to the report file
$ReportFileSpacer | Add-Content $ReportFile
"Files Processed:" | Add-Content $ReportFile
$ZipFileSet | %{Add-Content $ReportFile -Value $_}

#Go through the zip file names array and copy the files from the source to the working folder
Foreach ($file in $ZipFileSet){
copy-item "$Source\$file" -destination "$WorkingFolder"
} #End Foreach

#Create the first working folder
$WorkingFolderOneName = "$WorkingFolder\WorkingFolder1"
mkdir $WorkingFolderOneName | out-null

#Unzip all of the zip files in the array
Foreach ($file in $ZipFileSet){
$sourcefile = "$WorkingFolder\$file"
$targetfolder = "$WorkingFolderOneName"
$ZipCommandStringPartOne = 'C:\"Program Files"\7-zip\7z.exe'
$ZipCommandStringPartTwo = "x $sourcefile -o$targetfolder -r"
cmd.exe /C "$ZipCommandStringPartOne $ZipCommandStringPartTwo" | out-null
} #End Foreach

#Get a list of all files
$WeirdZipFiles = Get-ChildItem $TargetFolder -recurse | where {! $_.psiscontainer -and $_.fullname -notlike "*.???"} | select fullname, name, directory

#Add record for number of files
$ReportFileSpacer | Add-Content $ReportFile
"New zip files that don't have a zip extension:" | Add-Content $ReportFile
$WeirdZipFiles | measure-object | select count | %{$_.count | out-string} | Add-Content $ReportFile

#Initialize counters
$MovedCount = 0
$MovedRenamedCount = 0

#Create the folder for the emails
$RandomSeedForEMLFolder = Get-Random
$WorkingFolderTwoName = "$WorkingFolder\Emails_$RandomSeedForEMLFolder"
mkdir $WorkingFolderTwoName | out-null

#Each of those need to be unzipped.
Foreach ($file in $WeirdZipFiles){
$sourcefile = $File.Fullname
$targetfolder = $File.Directory.Fullname
$ZipCommandStringPartOne = 'C:\"Program Files"\7-zip\7z.exe'
$ZipCommandStringPartTwo = "x $sourcefile -o$targetfolder -r"
cmd.exe /C "$ZipCommandStringPartOne $ZipCommandStringPartTwo" | out-null
If ($LastExitCode -eq 2){ #If 7-Zip says the file wasn't an archive (exit code 2), give it an .eml extension and move it to working folder two
$RandomSeed = Get-Random
$FileName = $File.Name
$FilePath = $File.directory.fullname
$OldFileFullname = ($FilePath + "\" + $FileName)
$FileNameAddition = "$RandomSeed.eml"
$NewFileName = ($FileName + $FileNameAddition)
$NewFileFullname = ($FilePath + "\" + $NewFileName)
Rename-Item $OldFileFullname -NewName $NewFileName
Move-Item $NewFileFullname $WorkingFolderTwoName
$MovedRenamedCount++
} #End If
If ($LastExitCode -eq 0){ #Otherwise (exit code 0), rename the raw eml file, move it to working folder two, then remove the original archive
$FileNameSplit = $File.Name.split(".")
$ResultFileName = $FileNameSplit[0]
$FilePath = $File.directory.fullname
$OldFileFullname = ($FilePath + "\" + $ResultFileName)
$RandomSeed = Get-Random
$FileNameAddition = "$RandomSeed.eml"
$NewFileName = ($ResultFileName + $FileNameAddition)
$FilePath = $file.directory.fullname
$NewFileFullname = ($FilePath + "\" + $NewFileName)
Rename-Item $OldFileFullname -NewName $NewFileName
Move-Item $NewFileFullname $WorkingFolderTwoName
Remove-Item $SourceFile
$MovedCount++
} #End If
} #End Foreach

#Report Stuff
"`r`n Files that were renamed, then moved (.eml files): $MovedCount" | Add-Content $ReportFile
"`r`n Files that were extracted, then moved. $MovedCount" | Add-Content $ReportFile

#Remove working folder one
Remove-Item $WorkingFolderOneName -recurse -force

#Remove the zip files that were processed
Foreach ($file in $ZipFileSet){
$ZipFileSetPath = ($WorkingFolder + "\" + $file)
Remove-Item $ZipFileSetPath -recurse -force
} #End Foreach

#Add an ending timestamp to the report file
Get-Date | Add-Content $ReportFile

#Email the report file
Send-MailMessage `
-To me@contoso.com `
-From administrator@contoso.com `
-SMTPServer mail.contoso.com `
-Subject "Barracuda Zips Processed" `
-Body "See Attached Report" `
-Attachments $ReportFile

Remove-Item $ReportFile -force

Friday, February 14, 2014

Creating my Swiss-Army USB Thumbdrive

I've been messing around with security stuff lately, and finally got the motivation to create a bootable USB Thumbdrive.

I've wanted one for years, but never really got around to making one that fit all of my needs.

First, I used YUMI to make my drive bootable to Kali Linux, a security distribution of Linux that used to be called BackTrack.

Then, I put a ton of great utility apps onto the thumbdrive, and used this guide to create nice little shortcuts for everything in the root folder.

The final product is a bootable Linux Distro, which also has all of my normal tools on it that are usable from within Windows.

Here's a screenshot of my root folder:

Maybe next time I'll go for something bigger, like this 1TB flash drive from Kingston!

Thursday, February 13, 2014

Sharepoint 2013 filling up my Domain Controller's Security Logs

I just bought and implemented Solarwinds' Syslog server. Good stuff. Now I just need to find the time to look at the logs! :P

In the process of looking through my domain controllers' security logs (just the failure audits) I was inundated with failures from my Sharepoint server. It made the rest of the logs unreadable, so my goal was set: I needed to fix the Sharepoint server and make it stop doing this!

Here's what the errors look like:

2014-01-22 14:46:13 Kernel.Critical dc02.contoso.com Jan 22 14:46:13 dc02.contoso.com MSWinEventLog 2 Security 12451 Wed Jan 22 14:46:13 2014 4769 Microsoft-Windows-Security-Auditing N/A Audit Failure dc02.contoso.com 14337 A Kerberos service ticket was requested.


Account Information: 
Account Name: spservice@contoso.com 
Account Domain: contoso.com 
Logon GUID: {00000000-0000-0000-0000-000000000000} 

Service Information: 
Service Name: spservice 
Service ID: S-1-0-0 

Network Information: 
Client Address: ::ffff:192.168.1.53 
Client Port: 57013 

Additional Information: 
Ticket Options: 0x40810000 
Ticket Encryption Type: 0xffffffff 
Failure Code: 0x1b 
Transited Services: -

It's happening on multiple "client ports": 56591, 56594, 56605, 56607, 56624, 56643, etc.

Thankfully, I was able to track down a guide on configuring Sharepoint Kerberos authentication. Now my logs are cleared up and I can see the data that I care about!
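
I won't regurgitate that guide either, but a big piece of getting Kerberos working for SharePoint is making sure the service account has SPNs registered for the web application URLs, along these lines (a sketch; the host names are examples, not my actual configuration):

#Register HTTP SPNs for the SharePoint service account (host names are examples)
setspn -S HTTP/sharepoint CONTOSO\spservice
setspn -S HTTP/sharepoint.contoso.com CONTOSO\spservice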