Thursday, October 31, 2013

Storing Credentials for Powershell Scripts

The next script I want to show you requires storing your credentials in a file for later use, so I figured I'd better cover that first. Here goes.

These credentials are user specific (duh), but they are also computer specific. Let's say I'm going to run a scheduled task as the 'ScheduledTaskUser' user, and that this task will run from 'TaskServer'.

In order to make this work, this process must be run on 'TaskServer'!

1. Create the securepassword string from the computer that's going to use it:
read-host -assecurestring | convertfrom-securestring | `
out-file c:\temp\ScheduledTask_Credentials.txt

2. When you press Enter to execute the command above, you will be left on a blank line. Type the password and press Enter again.

3. The password is now saved on the computer as a secure string (not in plaintext) in the file that you specified.

4. In your script file, build the credential:

$password = get-content `
c:\temp\ScheduledTask_Credentials.txt | convertto-securestring

$credentials = new-object `
-typename System.Management.Automation.PSCredential `
-argumentlist "DomainName\ScheduledTaskUser",$password

5. Use the credential. You can pass $credentials anywhere you can use the -credential parameter, like so:

get-wmiobject win32_service `
-computer AnotherServer `
-credential $credentials
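
You can also use $credentials for PowerShell remoting. Here's a minimal, untested sketch (the computer name and script block are just placeholders):

#Run a command on a remote server using the stored credential
#(assumes PowerShell remoting is enabled on the target; names are placeholders)
invoke-command `
-computername AnotherServer `
-credential $credentials `
-scriptblock { get-service }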

Aside:
You'll notice above that I used the backtick character ` to continue the script on the next line. I'm trying to do this more and more for script readability.

Monday, October 28, 2013

My 100th Post! -- Troubleshooting Change Block Tracking in Veeam

I never thought I'd make it to 100 posts, but here I am! My initial goal was a post per day, but I soon realized that was unrealistic. I have a family, a job, and a life, after all. So I'm proud that I've created 100 nuggets of information that can help people. In the end, that's what IT is here to do: help people.

Change Block Tracking (CBT, and I don't mean Nuggets) is a VMware feature that Veeam uses extensively. It's what lets Veeam decide which blocks have changed since the last backup, so it can skip everything that HASN'T changed. Using it speeds up your backups a lot. The first time you back up a VM, Veeam has to go through every block of data, but thereafter it uses CBT to be selective.
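
As an aside: if you have PowerCLI handy, a quick way to see whether CBT is currently enabled on a VM is to peek at the VM's config. A minimal sketch (the VM name is a placeholder):

#Check whether CBT is enabled on a VM (VM name is a placeholder)
(get-vm "VMNumber1" | get-view).Config.ChangeTrackingEnabled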

Once in a while, one of my Veeam backups will report a warning, and I'll see error messages for a specific VM like these:

10/22/2013 11:27:26 PM :: Disk [VMFS-VOL3] VMNumber1/VMNumber1_1.vmdk size changed. Changed block tracking is disabled.

This was caused by expanding a drive on a Windows Server 2008 R2 VM. You can do this while the VM is running, which is really handy, but you should reset CBT on the VM before backups run. I'll outline that process below.

Another error you might see is this:
10/21/2013 11:16:08 PM :: Cannot use CBT: Soap fault. Error caused by file /vmfs/volumes/5122935d-291050d2-ee9c-ac162d75bf50/VMNumber1/VMNumber1.vmdkDetail: '', endpoint: ''

This one was caused by using svMotion to migrate a VM to another datastore, but it can be caused by other things too, like using vConverter on a VM. I've also seen CBT break for no reason at all, but those cases have been few and far between.

To fix this issue, perform the following steps to reset CBT. It will require two shutdowns, so get a maintenance window if you need one.

Short version/process outline:
1. Shut down the VM
2. Disable CBT
3. Delete the -CTK files from the filesystem
4. Power on the VM
5. Shut Down the VM
6. Enable CBT
7. Power on the VM

The step-by-step version:
1. Shut down the VM
2. In vCenter, edit the settings of the VM
3. Click on the "Options" tab
4. Click on "General"
5. Click "Configuration Parameters"
6. Click on the "Name" column header twice to sort alphabetically
7. Set ctkEnabled to FALSE
8. Scroll down and find the entry for each virtual hard disk. They look like this: scsix:x.ctkEnabled. Set these to FALSE as well. You must do this for ALL hard disks; just look for "ctk" in the name.
9. Click OK as many times as it takes to get back out.
10. Ensure the VM is selected and that you are viewing its "Summary" tab
11. On the right-hand side is a box called "Resources", and within it the VM's storage is listed.
12. Right-click on the listed storage (for example, VMFS-VOL17) and select "Browse Datastore"
13. Look for a folder with the same name as the VM
14. Go into that folder
15. For each virtual disk and snapshot disk there is a -ctk.vmdk file. For example:

vmname.vmdk
vmname-flat.vmdk
vmname-ctk.vmdk
vmname-000001.vmdk
vmname-000001-delta.vmdk
vmname-000001-ctk.vmdk

Delete ALL of the -CTK files (make sure you are choosing wisely!)

16. Now, boot the VM and wait until VMware Tools has loaded. The easiest way to check is to open the VM's console and look at the menu (VM->Power->Shut Down Guest); if that option is greyed out, VMware Tools is not running yet, so just try again in a few seconds.
17. Shut the VM down using that menu option
18. Now, we will turn CBT back on
19. In vCenter, edit the settings of the VM
20. Click on the "Options" tab
21. Click on "General"
22. Click "Configuration Parameters"
23. Click on the "Name" column header twice to sort alphabetically
24. Set ctkEnabled to TRUE
25. Scroll down and find the entry for each virtual hard disk (scsix:x.ctkEnabled) and set these to TRUE as well. Again, you must do this for ALL hard disks; just look for "ctk" in the name.
26. Click OK as many times as it takes to get back out.
27. Power the VM on, and you're done.
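
If you have a pile of VMs to fix, the settings part of this process can be scripted with PowerCLI. This is just a rough, untested sketch (the VM name is a placeholder), and it only flips the CBT flags on a powered-off VM -- deleting the -ctk files and the power-on/shut-down cycle above are still on you:

#Flip the VM-level CBT flag (VM must be powered off; name is a placeholder)
$vm = get-vm "VMNumber1"
new-advancedsetting -entity $vm -name "ctkEnabled" -value $false `
-confirm:$false -force

#Flip the per-disk flags (scsix:x.ctkEnabled)
get-advancedsetting -entity $vm -name "scsi*.ctkEnabled" | `
set-advancedsetting -value $false -confirm:$false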

Friday, October 25, 2013

VMware vCenter with a SQL Express back-end can get full!

If you have a smaller VMware infrastructure, you are probably running vCenter with its database on a SQL Express instance. This works fine, but eventually that database will fill up if you don't adjust the retention setting in vCenter, because SQL Express is limited to a maximum database size of 4GB.

This happened recently to a friend of mine, but in my haste to create a sensor in PRTG to monitor the database's size, I neglected to get a screenshot of what that moment looks like. For that, dear reader, I apologize.

What I CAN tell you is how to find out how big yours is, and how to keep it from growing too large. The size of the database can be found by looking at C:\Program Files\Microsoft SQL Server\MSSQL10_50.VIM_SQLEXP\MSSQL\DATA\VIM_VCDB.mdf. The path may differ based on whether you have a 32 or 64-bit SQL installation and what you named the file, but that's where mine is. That MDF file is your database (the LDF file with the same base name is your transaction log file, just FYI), so you don't want this file to get anywhere near 4GB.

I have a sensor in PRTG (my monitoring software) that keeps track of the file size, so I can look at trends and whatnot. It would also be fairly easy to throw something together in PowerShell to alert me if the file grows over a certain threshold.
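
If you want to go the PowerShell route, here's a minimal, untested sketch (the path, the 3GB threshold, and the email settings are all placeholders):

#Alert if the vCenter database file grows past a threshold
#(path, threshold, and email settings are placeholders)
$mdf = get-item `
"C:\Program Files\Microsoft SQL Server\MSSQL10_50.VIM_SQLEXP\MSSQL\DATA\VIM_VCDB.mdf"
$sizeGB = [math]::round($mdf.length / 1GB, 2)
If ($sizeGB -gt 3){
Send-Mailmessage -to "me@contoso.com" -From "helpdesk@contoso.com" `
-Subject "vCenter DB is $sizeGB GB!" `
-Body "VIM_VCDB.mdf is $sizeGB GB. Remember the 4GB SQL Express limit!" `
-smtpserver "smtpserver.contoso.com"
} # End If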

What I would also do, however, is go into vCenter, click Home --> vCenter Server Settings, and adjust the Database Retention Policy to something smaller. I believe the boxes are unchecked by default, so there is no retention policy, meaning that it saves everything forever.

[Screenshot: the Database Retention Policy settings in vCenter Server Settings]

Your needs might be different - I have VeeamONE monitoring my stuff, so I elected for 30 days.

Wednesday, October 23, 2013

SQL Data Drive Almost Full? Why is MSDB GIGANTIC!!!!????

I came in a few weeks ago to this potential catastrophe. One of my SQL servers had only a few gigs of space left on its data drive, so I opened a timeline graph to see if we were just running out of room because of normal data growth, or whether there was an issue. There was an issue. The size of my MSDB file was 15.5 GB! I hunted around on the net for answers as to how this could possibly happen, and in the end I fixed the issue without the server running out of space. It turned out that a reindexing maintenance plan had stalled out and was stuck writing to MSDB.

I also feel the need to say that I'm a noob accidental DBA. What that means is that I know the most about SQL in my workplace, so I'm automatically in charge of SQL. I've learned a lot, but I totally Google-Fu'd my way through this problem.

Here's what I ended up doing to fix the issue:

1. I found that the culprit was a backlog of messages in the sysxmitqueue by running this script:

SELECT object_name(i.object_id) AS objectName,
       i.[name] AS indexName,
       SUM(a.total_pages) AS totalPages,
       SUM(a.used_pages) AS usedPages,
       SUM(a.data_pages) AS dataPages,
       (SUM(a.total_pages) * 8) / 1024 AS totalSpaceMB,
       (SUM(a.used_pages) * 8) / 1024 AS usedSpaceMB,
       (SUM(a.data_pages) * 8) / 1024 AS dataSpaceMB
FROM sys.indexes i
INNER JOIN sys.partitions p
    ON i.object_id = p.object_id
    AND i.index_id = p.index_id
INNER JOIN sys.allocation_units a
    ON p.partition_id = a.container_id
GROUP BY i.object_id, i.index_id, i.[name]
ORDER BY SUM(a.total_pages) DESC, object_name(i.object_id)

GO

2. I stopped the maintenance job that was stuck.

3. I then ran
SELECT * from sys.transmission_queue

This gave me the conversation handle that was stuck (9A06F198-5008-E011-9526-0050569F69D3).

4. I then ran
End Conversation '9A06F198-5008-E011-9526-0050569F69D3' With Cleanup

This stopped the conversation and purged the table in MSDB. It took around 20 minutes to run.

5. I kicked everyone off the system. A quick way to see connections to your SQL server is to run sp_who2. This lists all of the connections.

6. I stopped the SQL Server Agent Service

7. I ran the following SQL command:
Alter Database msdb 
Set New_Broker With Rollback Immediate

8. I right-clicked on the MSDB database and selected Tasks, then Shrink -> Database

9. This reclaimed 15 of the 15.5 GB. Crisis averted!

Tuesday, October 22, 2013

What I've been up to for the past couple of months...

So if I have any regular readers, they may have noticed that I went through a two-month spell where I didn't post anything. I've been learning a ton of stuff, mainly revolving around security, and just went all out on that. I could have written lots of "What I learned today" posts, but I just haven't felt like writing, honestly.

It started when I went to GrrCon, a Security Conference in Grand Rapids, Michigan (USA) that really got my juices flowing. I came out of it wanting to learn so many things, and KNOWING that I needed to get more used to using Linux.

So, first order of business was to get Kali Linux, which used to be called BackTrack. I got an "old" laptop from work and loaded it up. I also got a wifi card that supports monitor mode for sniffing wireless traffic. I am running a WPA2-PSK network at home, so I went through some tutorials on how to crack wifi passwords, made a dummy password dictionary that contained my real password, and was able to crack my password.

Password cracking is something I have read a lot about lately, but haven't gotten around to trying; it's one thing on a very long list. I've read some really intriguing articles recently by Dan Goodin over at Ars Technica. Here is the latest, but if you're interested in seeing how passwords are becoming more and more useless every day, you should look into the older password-related articles on the site. There's some gold there!

The day before GrrCon, I attended a class on using Metasploit to gain access to a vulnerable system. Basically, they gave us a Kali Linux VM and a Windows 2000 VM, and taught us how to use Metasploit to root the Windows box. It was STUPID easy, and it gave me a whole new perspective on why those updates need to get out ASAP on Patch Tuesday every month. I was able to use MS08-067 to create a new administrator-level user on the Windows box in about 4 commands. Not exaggerating. That IE 0-day that was making headlines over the past week? There was a Metasploit module available 7 days before the fix. Metasploit does some really amazing things once you have root access to a machine. In a few keystrokes you can start logging keystrokes. You can dump the SAM hashes to a file and use another program to start cracking them to get actual passwords. And more. The Metasploit team even makes a free, intentionally vulnerable VM called Metasploitable that you can practice on!

There really is a LOT of stuff out there for the aspiring "Hacker":
SecurityTube.net is like YouTube, but focuses on security and has a lot of "How To" videos.
DarkReading.com is a great security news site that I've been reading daily.
Hack Forums is a really neat forum where you can get help, or just lurk to see what's possible.

I'm really excited about this stuff!

Monday, October 21, 2013

Documenting Scheduled Tasks for all of my Servers

Yeah, I'm getting the itch to write more regularly. Yesterday's post was my first in about two months, and it's time to get some stuff out there!

Today, I'm going to talk about a report that I run every week. It collects every scheduled task running on my servers, puts them all into an Excel file, and emails it to me.

Now, because it uses Excel, this is a task that needs to be run manually. Also, before you run it you should have a C:\Temp folder, and a list of servers in C:\lists\TaskSched-servers.txt -- or you can change those lines, which are below the functions.
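
If you don't have those in place yet, this quick setup will create them (the server names are placeholders):

#One-time setup: create the temp folder and the server list
#(server names are placeholders)
new-item -itemtype Directory -path C:\Temp, C:\lists -force | out-null
set-content C:\lists\TaskSched-servers.txt "Server1", "Server2"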

#First, here are the functions that deal with Excel:
Function Release-Ref ($ref) 
    {
        ([System.Runtime.InteropServices.Marshal]::ReleaseComObject(
        [System.__ComObject]$ref) -gt 0)
        [System.GC]::Collect()
        [System.GC]::WaitForPendingFinalizers() 
    }

Function ConvertCSV-ToExcel
{
<#   
  .SYNOPSIS  
    Converts one or more CSV files into an excel file.
     
  .DESCRIPTION  
    Converts one or more CSV files into an excel file. Each CSV file is imported into its own worksheet with the name of the
    file being the name of the worksheet.
       
  .PARAMETER inputfile
    Name of the CSV file being converted
  
  .PARAMETER output
    Name of the converted excel file
       
  .EXAMPLE  
  Get-ChildItem *.csv | ConvertCSV-ToExcel -output 'report.xlsx'
  
  .EXAMPLE  
  ConvertCSV-ToExcel -inputfile 'file.csv' -output 'report.xlsx'
    
  .EXAMPLE      
  ConvertCSV-ToExcel -inputfile @("test1.csv","test2.csv") -output 'report.xlsx'
  
  .NOTES
  Author: Boe Prox      
  Date Created: 01SEPT210      
  Last Modified:  
     
#>
     
#Requires -version 2.0  
[CmdletBinding(
    SupportsShouldProcess = $True,
    ConfirmImpact = 'low',
DefaultParameterSetName = 'file'
    )]
Param (    
    [Parameter(
     ValueFromPipeline=$True,
     Position=0,
     Mandatory=$True,
     HelpMessage="Name of CSV/s to import")]
     [ValidateNotNullOrEmpty()]
    [array]$inputfile,
    [Parameter(
     ValueFromPipeline=$False,
     Position=1,
     Mandatory=$True,
     HelpMessage="Name of excel file output")]
     [ValidateNotNullOrEmpty()]
    [string]$output    
    )

Begin {     
    #Configure regular expression to match full path of each file
    [regex]$regex = "^\w\:\\"
    
    #Find the number of CSVs being imported
    $count = ($inputfile.count -1)
   
    #Create Excel Com Object
    $excel = new-object -com excel.application
    
    #Disable alerts
    $excel.DisplayAlerts = $False

    #Show Excel application
    $excel.Visible = $False

    #Add workbook
    $workbook = $excel.workbooks.Add()

    #Remove other worksheets
    $workbook.worksheets.Item(2).delete()
    #After the first worksheet is removed,the next one takes its place
    $workbook.worksheets.Item(2).delete()   

    #Define initial worksheet number
    $i = 1
    }

Process {
    ForEach ($input in $inputfile) {
        #If more than one file, create another worksheet for each file
        If ($i -gt 1) {
            $workbook.worksheets.Add() | Out-Null
            }
        #Use the first worksheet in the workbook (also the newest created worksheet is always 1)
        $worksheet = $workbook.worksheets.Item(1)
        #Add name of CSV as worksheet name
        $worksheet.name = "$((GCI $input).basename)"

        #Open the CSV file in Excel; the file name must be converted into a complete path if not already done
        If ($regex.ismatch($input)) {
            $tempcsv = $excel.Workbooks.Open($input) 
            }
        ElseIf ($regex.ismatch("$($input.fullname)")) {
            $tempcsv = $excel.Workbooks.Open("$($input.fullname)") 
            }    
        Else {    
            $tempcsv = $excel.Workbooks.Open("$($pwd)\$input")      
            }
        $tempsheet = $tempcsv.Worksheets.Item(1)
        #Copy contents of the CSV file
        $tempSheet.UsedRange.Copy() | Out-Null
        #Paste contents of CSV into existing workbook
        $worksheet.Paste()

        #Close temp workbook
        $tempcsv.close()

        #Select all used cells
        $range = $worksheet.UsedRange

        #Autofit the columns
        $range.EntireColumn.Autofit() | out-null
        $i++
        } 
    }        

End {
    #Save spreadsheet
    $workbook.saveas("$output")

    Write-Host -Fore Green "File saved to $output"

    #Close Excel
    $excel.quit()  

    #Release processes for Excel
    $a = Release-Ref($range)
    }
}

#------------------------------------------------------------------------------------
#------------------------------------------------------------------------------------
#Now here's the meat and potatoes:
$servers = Get-Content \\server\c$\lists\TaskSched-servers.txt
Remove-Item "C:\Temp\Scheduled Tasks Documentation.csv" -ErrorAction SilentlyContinue
$TempFile = "C:\Temp\Scheduled Tasks Documentation.csv"
$Attachment = "C:\temp\Scheduled Tasks Documentation.xlsx"
$Tasks = @()

Foreach ($Computername in $Servers){
    $schtask = (schtasks.exe /query /s $ComputerName /V /FO CSV | ConvertFrom-Csv)
    $schtask = ($schtask | where {$_.Taskname -notlike "*\Microsoft*" -and $_.Taskname -notlike "Taskname"})
    $schtask = ($schtask | where {$_."Run as User" -notlike "Network Service"})
    if ($schtask){
        foreach ($sch in $schtask){
            $sch | Get-Member -MemberType Properties |
                ForEach -Begin {$hash=@{}} -Process {
                    #$WithSpace is never set above, so the Else branch runs
                    #and strips the spaces out of the property names
                    If ($WithSpace){
                        ($hash.($_.Name)) = $sch.($_.Name)
                    } #End If
                    Else {
                        ($hash.($($_.Name).replace(" ",""))) = $sch.($_.Name)
                    } #End Else
                } -End {$Tasks += (New-Object -TypeName PSObject -Property $hash)}
        } #End Foreach
    } #End If
} #End Foreach
$Tasks | select Hostname, TaskName, ScheduledTaskState, Status, LastResult, RunasUser, TasktoRun, Comment, NextRunTime, LastRunTime, ScheduleType, StartTime, Months, Days, Repeat:Every | export-csv $tempfile

#This Removes the first line of the file, which is just junk
$x = get-content $tempfile
$x[1..$x.count] | set-content $Tempfile

#Use the Functions above to import the CSV and output an Excel file
ConvertCSV-ToExcel -inputfile $Tempfile -output $Attachment

#Email me the file
$To = "me@contoso.com"
$From = "helpdesk@contoso.com"
$Subject = "PS Report - Scheduled Tasks - Documentation Purposes"
$Body = "This is a list of scheduled tasks on all servers, to be used for documentation purposes"
$SMTPServer = "SMTPServer.contoso.com"
Send-Mailmessage -to $To -Subject $subject -From $From -body $body -smtpserver $SMTPServer -attachments $Attachment

#Delete the Temp Files
remove-item $Attachment -force
Remove-Item $Tempfile

Sunday, October 20, 2013

Checking for Running SureBackup Labs

I created this script to notify me (remind me) that a SureBackup Lab was still running.

One key thing to have in place is a consistent naming convention. My Veeam SureBackup lab jobs all follow this one: "Server_LAB".

This runs at 4pm every weekday:

#Add Veeam Powershell Snap-In
Add-PSSnapin VeeamPSSnapin

#Email Variables
$To = "me@contoso.com"
$From = "helpdesk@contoso.com"
$SMTPServer = "smtpserver.contoso.com"

#Here I get all of the running lab jobs (using my naming convention)
$SBJobs = (Get-vsbjob | where {$_.name -like "*LAB*"})

#Foreach one, get the job's state, then email me if that state is "Working"
Foreach ($SBJob in $SBJobs){
    $SBJobState = ((Get-VSBJob -name ($SBJob.name)).GetLastState())
    If ($SBJobState -eq "Working"){
        $SBJobName = ($SBJob.Name)
        $Body = "SureBackup Lab $SBJobName is Running"
        $Subject = "SureBackup Lab $SBJobName is Running"
        Send-Mailmessage -to $To -Subject $subject -From $From -body $body -smtpserver $SMTPServer
    } # End If
} # End Foreach
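
To get it running at 4pm every weekday, I just use a scheduled task. Something like this would create one (the task name and script path are placeholders):

#Create the 4pm weekday task (task name and script path are placeholders)
schtasks.exe /create /tn "Check SureBackup Labs" `
/tr "powershell.exe -file C:\Scripts\Check-SureBackupLabs.ps1" `
/sc weekly /d MON,TUE,WED,THU,FRI /st 16:00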