
Wednesday, August 8, 2018

PowerShell List Local Administrators on all servers

I needed to list all Local Administrators on all servers at a company as part of a report.

I could not find a good PowerShell script which queried each server to see if it was online, then sent a WMI query to enumerate the local Administrators group.

Here is a copy of the script I put together:


$serverlist = Get-Content C:\Users\clint-b\serverlist.txt

foreach ($server in $serverlist)
{
    # Ping the computer to check it is online
    $pingStatus = Get-WmiObject -Class Win32_PingStatus -Filter "Address = '$server'"
    if ($pingStatus.StatusCode -eq 0)
    {
        Write-Host -ForegroundColor Green "Ping reply received from $server."
        $server | Out-File -NoClobber -Append C:\Users\clint-b\localadmins.txt
        # Enumerate group membership via WMI and keep only the local Administrators group
        $admins = Gwmi Win32_GroupUser -ComputerName $server
        $admins = $admins | ? { $_.GroupComponent -like '*"Administrators"' }
        $admins | fl *PartComponent* | Out-File -NoClobber -Append C:\Users\clint-b\localadmins.txt
    }
    else
    {
        Write-Host -ForegroundColor Red "No ping reply received from $server."
    }
}


In order to use this script you will need to put together a text file containing all the servers/workstations you want to query.  I used DSQUERY to make this text file, but you can use numerous tools.

The text file must have the hostname of each member server on a separate line like:

SERVER1
SERVER2
SERVER3
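If you prefer PowerShell over DSQUERY, a minimal sketch like the following would generate the same list (assuming the ActiveDirectory RSAT module is available - the OU path is an example you would replace with your own):

Import-Module ActiveDirectory

# Write the name of every computer account under the chosen OU, one per line
Get-ADComputer -Filter * -SearchBase "OU=Servers,DC=at,DC=local" |
    Select-Object -ExpandProperty Name |
    Out-File C:\Users\clint-b\serverlist.txt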

In the Out-File commands, specify the location where you want the results to be stored.

Sunday, June 18, 2017

For Each Line in Text File Do - Batch Script

Below is a simple batch script which takes each line of a text file and lets you use it in a command; an example is provided below.

I have needed FOR EACH, DO batch scripts numerous times over the years, and it's always hard to find a good one on the Internet.

@ECHO OFF
For /f %%i in (c:\computerlist.txt) do (
Echo ************************
Echo %%i
Echo ************************
psexec \\%%i -h -u domain\username -p password "\\domain\netlogon\mybatchscript.bat"
)
pause

Very handy for day-to-day sysadmin tasks!

Thursday, January 26, 2017

Displaying full values of Attributes in PowerShell

In Windows PowerShell, when you are running queries, PowerShell will often only show a limited value for objects which contain large attributes.  The attribute output is cut off with a "..." at the end.


To configure PowerShell to display the full output of a cmdlet, enter the following into the shell window:

$FormatEnumerationLimit=-1

The shell will now display the full output on screen for long attributes.
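As a quick illustration, here is a hypothetical before/after using a process's Modules property, which is normally truncated after the first few elements:

# By default only the first few elements of a multi-valued property are shown, ending in "..."
Get-Process -Id $PID | Format-List Modules

$FormatEnumerationLimit = -1    # -1 removes the limit entirely

# The same command now renders every element of the collection
Get-Process -Id $PID | Format-List Modules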


Hope this post was helpful.

For IT support in Perth, contact Avantgarde Technologies.

Monday, September 21, 2015

Remove all Printers Deployed from a specific Print Server with PowerShell

I had a customer who had deployed printers from a legacy print server utilising scripts.  They recently built a new 2012 R2 print server where they deployed the printers utilising the Print Management Console and Group Policy.

All printers were redeployed from the new print server.

The customer, however, had a number of printers set up on workstations still pointing to the legacy print server.  As such, they wanted to remove all printers deployed from the hostname of the legacy print server.

The following PowerShell script achieves this and can be easily deployed with Group Policy.  Simply replace "PRINTSERVER" with the name of your print server and then deploy the PowerShell script.

$PrintServer = "\\PRINTSERVER"
$Printers = Get-WmiObject -Class Win32_Printer
ForEach ($Printer in $Printers) {
    # Remove any printer connection that was deployed from the legacy print server
    If ($Printer.SystemName -like "$PrintServer") {
        (New-Object -ComObject WScript.Network).RemovePrinterConnection($($Printer.Name))
    }
}


Friday, March 27, 2015

PowerShell - Find All Files Beginning With

A customer of mine was hit with another one of those viruses which encrypt all data on shared drives mapping back to the file server.  The entire shared drive was encrypted and users were no longer able to access documents on the volume.

I restored all encrypted files from backup; however, I still had the HELP_DECRYPT ransomware files in every directory on the file server.


As a result I needed an easy way to find and delete each of these files.

PowerShell!

First, set the path you want to search; mine was H:\Shared.
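For example, using my path (substitute your own):

$Path = "H:\Shared"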

Next, run the following command to find any files whose names contain HELP_DECRYPT:

Get-ChildItem $Path -Recurse | Where { $_.Name -Match "HELP_DECRYPT" }


This went through and recursively listed all of the HELP_DECRYPT files in every directory of the file server.

After you have carefully gone through all the results and confirmed that no legitimate files were listed, you can pipe the output from the Get-ChildItem command into the Remove-Item cmdlet.
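A minimal sketch of that final command - run it only once you are satisfied with the listing (the -Force switch is my addition here, so that hidden or read-only copies are removed as well):

Get-ChildItem $Path -Recurse | Where { $_.Name -Match "HELP_DECRYPT" } | Remove-Item -Force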


After piping the output into Remove-Item, run the listing command again to ensure the files were all deleted correctly.  Getting no output means the files were removed successfully.

Monday, August 11, 2014

PowerShell - Find and Replace the Remote Desktop Services Profile

A customer of mine configured all roaming user profiles in a Remote Desktop Services environment through the user accounts in Active Directory instead of utilising the Group Policy setting "Set roaming profile path for all users logging onto this computer".  Group Policy is the Microsoft preferred method, as it ensures each profile folder matches the username and prevents inconsistencies which may occur as a result of an administrator incorrectly naming a folder.  It is recommended that all Remote Desktop Services environments utilise Group Policy, and not the Active Directory user accounts, as the means of setting roaming profiles.

I am in the process of implementing a new file server into the customer's environment and updating the Remote Desktop Services profiles to point to the new file server.  My customer has the Remote Desktop Services profile specified on each user account, and there are inconsistencies in how the profiles are named - the folder does not always match the username!  As a result we are not able to simply move to Group Policy roaming profile mapping going forward.

I needed a way of performing a find and replace to update the Remote Desktop Services profile path to match the new file server.  I achieved this by writing a PowerShell script, which I would like to share with you - here is a copy of my code:


$erroractionpreference = "stop"

$pDirValueOld = "\\oldserver\rdsprofiles"
$pDirValueNew = "\\newserver\rdsprofiles"

$profilepath = $null

# Find all user accounts beneath the OU specified in SearchRoot
$searcher = New-Object adsisearcher
$searcher.Filter = "(&(objectCategory=person)(objectClass=user))"
$searcher.SearchRoot = "LDAP://OU=Users,OU=Avantgarde Technologies,OU=Companies,OU=Active Directory,DC=at,DC=local"
$searcher.PageSize = 1000
$results = $searcher.FindAll()

foreach ($result in $results)
{
    $ADuser = [adsi]$result.Path
    foreach ($user in $ADuser)
    {
        echo $user.distinguishedName
        $profilepath = $null
        # Read the current profile path and swap the old server path for the new one
        $profilepath = $user.psbase.InvokeGet("TerminalServicesProfilePath") -replace [regex]::Escape($pDirValueOld), $pDirValueNew
        $user.psbase.InvokeSet("TerminalServicesProfilePath", $profilepath)
        $user.setinfo()
    }
}

To utilise this code you want to modify the following values:

$pDirValueOld = "\\oldfileserver\share"
$pDirValueNew = "\\newfileserver\share"
$searcher.SearchRoot = "LDAP://OU=Users,OU=Avantgarde Technologies,OU=Companies,OU=Active Directory,DC=at,DC=local"
  • $pDirValueOld is the value you want to search for and replace.
  • $pDirValueNew is the value you wish to set the profile to.
  • $searcher.SearchRoot is the LDAP path in Active Directory you wish to run this query recursively against.
Now I have a bunch of users in the Avantgarde Technologies --> Users OU which need to be updated, each with the Remote Desktop Services profile path populated on the user account.

I put the code into my PowerShell ISE development environment (as an alternative to saving it as a .ps1 script), then made sure the fields above were populated correctly.

Run the code and all users recursively will have the Remote Desktop Services profile updated to point to the new path through a find and replace!  Checking a user account afterwards confirmed the profile was successfully updated.

You can also modify my above code to perform a find and replace on other items in the user account!

Hope you have found this post helpful!

Monday, June 23, 2014

PowerShell - Nightly DFS-R Database Backups

Windows Server 2012 R2 provides a new feature allowing customers to export and import the DFS-R database located in "System Volume Information" for any volume with folders partaking in DFS-R replication.  For more information about this new feature, please see the following TechNet article, which I strongly recommend reading:

http://technet.microsoft.com/library/dn482443.aspx

The ability to Export and Import the DFS-R database provides the following advantages:
  • Significantly reduces the amount of time the initial sync process takes when pre-staging new file servers to partake in DFS-R replication, as the DFS-R database can be exported from an existing server and imported into the new server.  This process also requires the data to be copied with robocopy or a backup product (such as ntbackup or wbadmin) to ensure the data is exactly the same as on the source server.
  • Provides the ability to restore a corrupt DFS-R database, which can be caused by an incorrect shutdown of a Windows Server running DFS-R.  When a DFS-R database becomes corrupt, the server by default automatically kicks off self recovery, which involves cross-checking the file hashes of every file on the replication group's volume against the other DFS-R servers in order to repair the state of the database.  This process can take a long time - sometimes as long as the initial sync process - backlogging all new replication traffic.
Some DFS-R environments consist of a single hub server and up to 100 spoke servers with large volumes of data, sometimes exceeding 10TB and over 10 million files.  In this scenario, if the DFS-R database suffered corruption on the hub server, the entire DFS-R replication would backlog for weeks while the self recovery process rechecks all the files across the environment!

I have a customer with a DFS-R environment similar to the example provided above.  As a result, I put measures in place to recover the DFS-R database in the event corruption occurred.  A PowerShell script was created to automatically back up the database on a nightly basis using the new Export-DfsrClone cmdlet introduced in Windows Server 2012 R2, which runs on the hub server.  In the event corruption occurs, we can simply import the database using the new Import-DfsrClone cmdlet.

This PowerShell Script performs the following:
  • Creates backups under C:\DfsrDatabaseBackups
  • Each backup is placed in a folder labelled "YYYY-MM-DD HH-mm"
  • The script automatically deletes any database backups older than 14 days by default to ensure old backups are cleaned up.

# DFSR Database Backup Script - Created by Clint Boessen 15/04/2014
$basefolder = "C:\DfsrDatabaseBackups"
$datefolder = Get-Date -Format "yyyy-MM-dd HH-mm"
$backuplocation = $basefolder + "\" + $datefolder
New-Item -ItemType Directory $backuplocation
Export-DfsrClone -Volume E: -Path $backuplocation -Force

# Remove database backups older than 14 days
$Now = Get-Date
$Days = 14
$LastWrite = $Now.AddDays(-$Days)

# Change to the backup folder so the relative folder names resolve for Remove-Item
Set-Location $basefolder

$Folders = Get-ChildItem -Path $basefolder |
    Where-Object { $_.PsIsContainer -eq $true } |
    Where-Object { $_.LastWriteTime -le $LastWrite }

foreach ($Folder in $Folders)
{
    Write-Host "Deleting $Folder" -ForegroundColor "Red"
    Remove-Item $Folder -Recurse -Confirm:$false
}

Add the above script to a PowerShell "ps1" file and create a scheduled task on your DFS-R file server to run the script according to the schedule on which you want DFS-R database backups to occur.  Once configured, you will see DFS-R database backups occurring on a regular basis, with old backups automatically cleaned up!
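One way to create the scheduled task is with the ScheduledTasks cmdlets available in Server 2012 R2 - a sketch, where the script path, task name and schedule are assumptions to adjust for your environment:

# Run the backup script as SYSTEM at 5am on weekdays (script path is an example)
$action  = New-ScheduledTaskAction -Execute "powershell.exe" `
    -Argument "-ExecutionPolicy Bypass -File C:\Scripts\DfsrDatabaseBackup.ps1"
$trigger = New-ScheduledTaskTrigger -Weekly `
    -DaysOfWeek Monday,Tuesday,Wednesday,Thursday,Friday -At 5am
Register-ScheduledTask -TaskName "DFSR Database Backup" -Action $action -Trigger $trigger -User "SYSTEM" -RunLevel Highest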



I scheduled my script to run at 5am on weekdays.  Please note the backup process can take hours - in my environment, due to the large number of files, the export takes a total of 3 hours, finishing around 9am, as shown by the date modified timestamps on the backup folders.

It is important to note that DFS-R replication will not work while a database backup is occurring.  As a result, please ensure the backups are scheduled at a time when replication can be paused.  Replication automatically resumes after the export process is completed.

PowerShell - Locate Missing SYSTEM Permissions from Folder Structure

I am in the middle of a DFS-R project for a customer where I'm provisioning new Windows Server 2012 R2 file servers and migrating the data across to the new servers.  To perform the migration, I initially pre-synced the data with robocopy in backup mode ("/b"), then added the new servers to the DFS-R replication group/namespace.  Once the initial DFS-R sync had completed, which took a few days, I enabled the namespace for the new servers and disabled the old servers.

Upon cutting the users across, many users complained the data was approximately 7 days old - roughly the time of the initial robocopy.  After further investigation it appeared DFS-R was not keeping the data in sync and many directories had not been replicated.  The files and folders which were not replicated also did not appear in the backlog count in the DFS-R health reports which were run to verify replication status.

It turned out the cause of this issue was that the "SYSTEM" permissions were missing from many directories in the file server structure.  As the DFS-R service runs under the "SYSTEM" account, it must have access to the data in order to perform replication.  Robocopy, however, was able to move this data as it was running in backup mode, which uses backup semantics to bypass normal security checks.

This directory structure had permission inheritance blocked numerous times throughout the folder structure, and as a result finding directories which did not have the SYSTEM permission configured correctly was a challenging task.  I therefore wrote a PowerShell script which performs an audit against a directory structure and returns all folders missing the "SYSTEM" permission, so that an administrator can add the missing permission at each folder level where inheritance is broken.

This is a handy script and I posted it online for everyone, as I recommend running it against any directory structure on file servers to ensure the SYSTEM account has full control over all data - a recommended Microsoft best practice.

$OutFile = "C:\Permissions.csv"
$RootPath = "E:\PATHTOBESCANNED"

$Folders = dir $RootPath -Recurse | Where { $_.PsIsContainer -eq $true }
foreach ($Folder in $Folders)
{
    $ACLs = Get-Acl $Folder.FullName | ForEach-Object { $_.Access }
    $Found = $False
    foreach ($ACL in $ACLs)
    {
        # Look for an access control entry belonging to the SYSTEM account
        if ($ACL.IdentityReference -eq "NT AUTHORITY\SYSTEM")
        {
            $Found = $True
        }
    }
    # Record any folder with no SYSTEM entry
    if ($Found -ne $True)
    {
        $OutInfo = $Folder.FullName
        Add-Content -Value $OutInfo -Path $OutFile
    }
}
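If you would rather script the fix than add the permissions manually, a sketch like the following - using the built-in icacls tool against the output file produced above - could grant SYSTEM full control on each folder found:

# Grant SYSTEM full control, inherited by child files and folders, on each audited folder
foreach ($Folder in Get-Content "C:\Permissions.csv")
{
    icacls "$Folder" /grant "NT AUTHORITY\SYSTEM:(OI)(CI)F"
}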

I hope this PowerShell script helps other people out there!

Tuesday, April 15, 2014

Delete all folders under a sub folder older than X days

Below is a simple PowerShell script which deletes all folders under D:\test older than 2 days.  Have fun!

$Now = Get-Date
$Days = 2
$TargetFolder = "D:\test"
$LastWrite = $Now.AddDays(-$Days)

# Change to the target folder so the relative folder names resolve for Remove-Item
Set-Location $TargetFolder

$Folders = Get-ChildItem -Path $TargetFolder | Where { $_.PsIsContainer -eq $true } | Where { $_.LastWriteTime -le $LastWrite }

foreach ($Folder in $Folders)
{
    Write-Host "Deleting $Folder" -ForegroundColor "Red"
    Remove-Item $Folder -Recurse -Confirm:$false
}

Monday, August 26, 2013

Scripting with Sysinternals tools - Removing the Licensing Agreement

Mark Russinovich and Bryce Cogswell from the Microsoft Sysinternals team publish many great command line and GUI applications for advanced system management and diagnostic tasks.  When using any of their tools, you must first accept a licensing agreement, which can be annoying, especially when you want to use the software in something such as a logon script.  For example, the first run of their Disk Usage executable (du.exe), which I copied to C:\Windows\System32, pops up the license agreement.


Now, if you do not want this license agreement to pop up for every user, you must add the registry value that accepts the license agreement to each user's profile before the script is launched.  If you are scripting in batch this can be done with:

reg.exe ADD "HKCU\Software\Sysinternals\du" /v EulaAccepted /t REG_DWORD /d 1 /f

All Sysinternals utilities are configured the same way - just replace \du with the name of the tool, such as \psexec.
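If you are scripting in PowerShell instead, the equivalent is a sketch like this (the tool names in the list are just examples):

# Accept the Sysinternals EULA for a few example tools in the current user's hive
foreach ($tool in "du", "psexec", "pslist")
{
    New-Item -Path "HKCU:\Software\Sysinternals\$tool" -Force | Out-Null
    Set-ItemProperty -Path "HKCU:\Software\Sysinternals\$tool" -Name EulaAccepted -Value 1 -Type DWord
}

Many of the command line tools also accept an -accepteula switch which sets this value for you.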

I also found registry entries for a bunch of other Sysinternals applications on Peter Hahndorf's blog.  This can be found at the following URL:

http://peter.hahndorf.eu/blog/post/2010/03/07/WorkAroundSysinternalsLicensePopups

Batch File - Output Command to Variable

I am by no means an expert in batch scripting - if you follow my posts you will have noticed I'm more of a VB scripting man.  However, I was doing some batch scripting for a customer of mine today in which I needed to capture the output of a command line tool into a variable.

This can be done as follows:

FOR /F "delims=" %i IN ('date /t') DO set today=%i
echo %today%


Now, one thing which caught me out: if you put this code into a batch script "as is", the script will error out.  This is because FOR loop variables in a batch script must be referenced with two % signs instead of one.

For example, "set today=%i" needs to be "set today=%%i".  To put the code in a batch script and make it run, you would use:

FOR /F "delims=" %%i IN ('date /t') DO set today=%%i
echo %today%

I hope this small post has been helpful - thank you for reading.

Sunday, August 25, 2013

VBS - Access is Denied with .Size Option with Vista/2008 or later Operating Systems

I needed to write a VB Script to audit the size of a folder on a bunch of workstations.  This is very simple in VBS, with only a few lines of code, using the .Size property of the folder object you declare under the Scripting.FileSystemObject API.  The code required to grab the folder size is demonstrated below.

Set objFSO = CreateObject("Scripting.FileSystemObject")
Set objFolder = objFSO.GetFolder("C:\Users\boessenc")
Wscript.Echo objFolder.Size


Running this code on ANY folder on a Vista/2008 or higher workstation will give you a Permission denied error:

Microsoft VBScript runtime error: Permission denied


Running this on any older version of Windows, such as Windows XP or 2003, works without problems.  In the test above I ran the script against my own user profile - a folder which my account of course has access to - demonstrating the issue.

I also tested querying the WMI database using the FileSize property of the Win32_Directory class.  That doesn't work either.

As a result, you need to utilise another method for grabbing the directory size.
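One alternative, sketched below in PowerShell rather than VBS (assuming PowerShell 3.0 or later for the -File switch), is to sum the file lengths yourself - -ErrorAction SilentlyContinue simply skips anything the account cannot read:

# Total the size of all files beneath the folder, ignoring inaccessible items
$size = (Get-ChildItem "C:\Users\boessenc" -Recurse -File -ErrorAction SilentlyContinue |
    Measure-Object -Property Length -Sum).Sum
"{0:N1} MB" -f ($size / 1MB)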

Wednesday, November 28, 2012

Changing Password on Administrator Accounts - Performing an Audit

This article addresses a common task which many administrators have to face during their career as an IT professional - changing the password on a core administrator account.

Scenario

It is well known that administrators should always create dedicated service accounts, with appropriate access, to be used by network applications on a Microsoft network.  However, there is always the case of a lazy administrator in the past who could not be bothered creating dedicated service accounts and instead used the default domain admin account "Domain\Administrator" for applications and services.  So what happens when there are applications and services across the network using the default domain account, everyone - previous employees, current employees and end users - knows the password to this account, and you don't know exactly which applications are using it?  This article addresses exactly this situation.

Solution

The only way to identify all applications using an account for authentication is to revert to the audit logs on domain controllers and identify the IP addresses from which the authentication attempts have been initiated.  Once you have the IP addresses, as an administrator you're able to dig down into each server's configuration, identify what applications are installed and figure out what is making the authentication attempts with the account.  No tool will be able to tell you exactly which program is performing the authentication request, because all applications are different.  For example, some applications may store the domain administrator credentials in a text configuration file, others might store the credentials in some type of database table, and others might simply store them in a service or scheduled task.  No audit application understands the inner workings of every single application ever made; at best they can only look where applications "usually" store credentials and return results based on that.

Another thing to note is that each domain controller stores audit logs only for authentication requests made against that individual DC.  There is no place where you can look at all authentication requests against domain controllers at a domain-wide level without using additional software.  To gain insight into what authentication requests are being made on your network, I recommend a product such as Snare Server.  Snare is seen by many as the industry standard for capturing and filtering audit and event log data.  Snare Server will pull audit logs from all domain controllers in your organisation and allow you to quickly identify exactly which servers are using a specific account.

After ringing them for pricing, I found they are very cost effective compared to other audit collection tools on the market.  The product is priced based on geographical region, so you will need to contact them to get pricing for your country.

Check out the following video which goes through Snare Server in detail:

http://www.intersectalliance.com/Contact.html?Video=SnareServer

Discovery VBS Script

As mentioned above, the only way to perform a thorough audit of which applications in your environment are using a specific account is to revert to audit logs.  However, below I will show you a handy VBS script which is able to scan through all computer accounts in your domain and check if they are using an administrator account for a service or scheduled task.  This script can be downloaded from the following location; just rename it to *.vbs.

https://sites.google.com/site/cbblogspotfiles/ScanForUserID.txt

This script requires you to modify two fields:
  • The strSearchFor field is the account which you want to find.  For example, Administrator.
  • The strExclude field lists computers or locations within Active Directory which you want to exclude.  If you want to exclude nothing, you can leave this field as "".
strSearchFor = "Administrator"
strExclude = "dx-iren,dx-iren2,OU=Computers"


When you run the script it will display output in a webpage.  It pings each machine before performing the scan to ensure it is online, so ensure ICMP is allowed through your Windows Firewall - this can be done with Group Policy.


Ensure you launch it using cscript from a command prompt running as Administrator ("Run as Administrator") to get around User Account Control (UAC) restrictions.
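For example, assuming you saved the download as ScanForUserID.vbs:

cscript //nologo ScanForUserID.vbs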

The audit results are pushed out to a CSV file under C:\results.csv, which can be opened in Excel.


Hope this post has been helpful - good luck!