First of all I would like to point out that I am not an expert on debugging applications - in fact far from it. I only know the basic steps required to analyze a dump file so you can find out what is causing your system to crash. Analyzing dump files is a very technical area, and by reading this I hope to make it a little easier for you!
I'm going to show you two ways to analyze dump files: an old method I used years ago and the current method using the Debugging Tools for Windows.
The old method using dumpchk and pstat
I have only used this method on Windows 2000 and it was many years ago. I do not know if it will work on newer operating systems such as Server 2008 or Windows 7!
Before we proceed you can get a copy of dumpchk.exe from the Windows 2003 Support Tools which can be downloaded from here:
http://www.microsoft.com/downloads/details.aspx?FamilyId=6EC50B78-8BE1-4E81-B3BE-4E7AC4F0912D&displaylang=en
pstat.exe can be downloaded from the Windows 2000 Resource kit found here:
http://support.microsoft.com/kb/927229
First run dumpchk.exe against your dump file. It can be handy piping the output to a text file so it is easier to read.
dumpchk.exe C:\WINDOWS\Minidump\Mini102609-01.dmp > c:\dumpchkresults.txt
From the data gathered in the text file you're looking specifically for the "ExceptionAddress".
MachineImageType i386
NumberProcessors 1
BugCheckCode 0xc000021a
BugCheckParameter1 0xe1270188
BugCheckParameter2 0x00000001
BugCheckParameter3 0x00000000
BugCheckParameter4 0x00000000
ExceptionCode 0x80000003
ExceptionFlags 0x00000001
ExceptionAddress 0x8014fb84
In this instance the ExceptionAddress is 0x8014fb84.
The next step is to boot the server up and run pstat.exe against it. Again it is good to pipe the results to a text file. What pstat shows us is all the drivers and the memory ranges they are using.
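For example (assuming pstat.exe is in your path or the current directory):
pstat > c:\pstatresults.txt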
MODULENAME Load Addr Code Data Paged LinkDate
----------------------------------------------------------------------
Ntoskrnl.exe 80100000 270272 40064 434816 Sun May 11 00:10:39 1997
Hal.dll 80010000 20384 2720 9344 Mon Mar 10 16:39:20 1997
Aic78xx.sys 80001000 20512 2272 0 Sat Apr 05 21:16:21 1997
Scsiport.sys 801d7000 9824 32 15552 Mon Mar 10 16:42:27 1997
Disk.sys 80008000 3328 0 7072 Thu Apr 24 22:27:46 1997
Class2.sys 8000c000 7040 0 1632 Thu Apr 24 22:23:43 1997
Ino_flpy.sys 801df000 9152 1472 2080 Tue May 26 18:21:40 1998
Ntfs.sys 801e3000 68160 5408 269632 Thu Apr 17 22:02:31 1997
Floppy.sys f7290000 1088 672 7968 Wed Jul 17 00:31:09 1996
Cdrom.sys f72a0000 12608 32 3072 Wed Jul 17 00:31:29 1996
Cdaudio.sys f72b8000 960 0 14912 Mon Mar 17 18:21:15 1997
Null.sys f75c9000 0 0 288 Wed Jul 17 00:31:21 1996
Ksecdd.sys f7464000 1280 224 3456 Wed Jul 17 20:34:19 1996
Beep.sys f75ca000 1184 0 0 Wed Apr 23 15:19:43 1997
Cs32ba11.sys fcd1a000 52384 45344 14592 Wed Mar 12 17:22:33 1997
Msi8042.sys f7000000 20192 1536 0 Mon Mar 23 22:46:22 1998
Mouclass.sys f7470000 1984 0 0 Mon Mar 10 16:43:11 1997
Kbdclass.sys f7478000 1952 0 0 Wed Jul 17 00:31:16 1996
Videoprt.sys f72d8000 2080 128 11296 Mon Mar 10 16:41:37 1997
Ati.sys f7010000 960 9824 48768 Fri Dec 12 15:20:37 1997
Vga.sys f7488000 128 32 10784 Wed Jul 17 00:30:37 1996
Msfs.sys f7308000 864 32 15328 Mon Mar 10 16:45:01 1997
Npfs.sys f7020000 6560 192 22624 Mon Mar 10 16:44:48 1997
Ndis.sys fccda000 11744 704 96768 Thu Apr 17 22:19:45 1997
Win32k.sys a0000000 1162624 40064 0 Fri Apr 25 21:17:32 1997
Ati.dll fccba000 106176 17024 0 Fri Dec 12 15:20:08 1997
Cdfs.sys f7050000 5088 608 45984 Mon Mar 10 16:57:04 1997
Ino_fltr.sys fc42f000 29120 38176 1888 Tue Jun 02 16:33:05 1998
Tdi.sys fc4a2000 4480 96 288 Wed Jul 17 00:39:08 1996
Tcpip.sys fc40b000 108128 7008 10176 Fri May 09 17:02:39 1997
Netbt.sys fc3ee000 79808 1216 23872 Sat Apr 26 21:00:42 1997
El90x.sys f7320000 24576 1536 0 Wed Jun 26 20:04:31 1996
Afd.sys f70d0000 1696 928 48672 Thu Apr 10 15:09:17 1997
Netbios.sys f7280000 13280 224 10720 Mon Mar 10 16:56:01 1997
Parport.sys f7460000 3424 32 0 Wed Jul 17 00:31:23 1996
Parallel.sys f746c000 7904 32 0 Wed Jul 17 00:31:23 1996
Parvdm.sys f7552000 1312 32 0 Wed Jul 17 00:31:25 1996
Serial.sys f7120000 2560 0 18784 Mon Mar 10 16:44:11 1997
Rdr.sys fc385000 13472 1984 219104 Wed Mar 26 14:22:36 1997
Mup.sys fc374000 2208 6752 48864 Mon Mar 10 16:57:09 1997
Srv.sys fc24a000 42848 7488 163680 Fri Apr 25 13:59:31 1997
Pscript.dll f9ec3000 0 0 0
Fastfat.sys f9e00000 6720 672 114368 Mon Apr 21 16:50:22 1997
Ntdll.dll 77f60000 237568 20480 0 Fri Apr 11 16:38:50 1997
---------------------------------------------------------------------
Total 2377632 255040 1696384
With this information you can take the ExceptionAddress and find out which driver's memory range it falls into using the data provided by pstat.exe. This is just an example, but in this case the crash was caused by Ntoskrnl.exe. Microsoft wrote a KB article documenting this procedure:
http://support.microsoft.com/kb/192463
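To make the lookup concrete using the sample pstat output above: the ExceptionAddress 0x8014fb84 is higher than the Ntoskrnl.exe load address of 0x80100000 but lower than the next module up, Scsiport.sys at 0x801d7000, so the faulting address falls within the range owned by Ntoskrnl.exe.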
The new method using Debugging Tools for Windows
The recommended way of analyzing dump files is with the Debugging Tools for Windows. This is not a single tool, it is a toolkit containing a wide variety of diagnostic tools. There are 32-bit and 64-bit versions of the product; install the correct one depending on what platform your system is running.
As of this writing the latest version is 6.11.1.404.
Download the 32bit version from here:
http://www.microsoft.com/whdc/devtools/debugging/installx86.Mspx
Download the 64bit version from here:
http://www.microsoft.com/WHDC/DEVTOOLS/DEBUGGING/INSTALL64BIT.MSPX
Again, the Debugging Tools are very complicated - below I am only going to show you the basic steps for finding out what caused your system to crash. Let's begin.
Before you can analyse a dump file you first need symbol files. Symbol files contain symbolic information such as function names and data variable names and are created when an application is built; they have a .dbg or .pdb extension. These files are used by various debuggers from different vendors, including the Debugging Tools for Windows which we are going to use below. Without them, the call stacks that show how functions are called would be inaccurate or incomplete, causing function names to be omitted from the call stack. Only Microsoft can provide symbol files for core components such as kernel32.dll, ntdll.dll, user32.dll and other core Windows components, as Microsoft developed these. Microsoft also provides symbol files for many other third party applications and drivers.
For more about symbol files see KB311503 - this is more of a developer topic.
If you have some idea of what is causing the blue screen you can download the symbol files for just the few files you want to analyse. However in this case we have no idea what is causing the blue screen, so I want to download symbol files for every driver and Windows application file in C:\Windows\System32.
To download all the symbol files for c:\windows\system32 use the symchk.exe tool which comes as part of the Debugging Tools. Use the /r switch, which means perform a recursive query. In my example I am placing the symbol files in C:\symbols.
Run the following command:
symchk /r c:\windows\system32 /s SRV*c:\symbols\*http://msdl.microsoft.com/download/symbols
It will give you output similar to this:
Note it says FAILED because it cannot find the symbol file in c:\symbols, which is normal - if it cannot find the file locally, it goes and downloads it. This will take a while; for my server it took just over an hour to download all the symbol files, ending up at 558 MB of data.
You will now have symbol files for all the different drivers and application libraries in your system32 directory.
You also need a copy of the i386 directory from the windows CD to analyze the dump. Ensure this is the same service pack as the system you are running. I'm using Windows Server 2003 SP2 so I had to track down a Windows Server 2003 SP2 CD.
Now that you have the symbol files and the i386 directory it is possible to analyse the dump.
Use the following command from Microsoft KB 315263:
windbg -y SymbolPath -i ImagePath -z DumpFilePath
In my case I used:
windbg -y c:\symbols -i D:\i386 -z C:\WINDOWS\Minidump\Mini123009-01.dmp
This comes up and tells me what file triggered the crash:
Now that I know bxnd52x.sys caused the crash I need to link it to a driver. As I described in my article "Permanently Remove Driver", all drivers are referenced in OEM*.INF files in c:\windows\inf. To find out which driver caused the crash, search all files in C:\Windows\inf for any that contain the text string "bxnd52x.sys". You can do this using the following command:
find /c "bxnd52x.sys" c:\windows\inf\*.inf | find ":" | find /v ": 0"
This came back with two INF files containing this string.
If we open these files up we can see it is the Broadcom network driver!
Note I have two because I have already updated the driver to see if that would fix the problem, which it didn't. See how one is version 2.8.13 and the new one is version 5.0.13 - the old driver stays in place for the "roll back driver" functionality in Device Manager. The fact that upgrading the driver did not fix the problem means the problem is with the Broadcom network adapter itself. This network adapter is onboard, so it looks like I'm going to have to contact HP and arrange for a new mainboard.
One more thing I would like to point out: in windbg.exe you can enter additional commands to find more information, as described in KB315263.
The !analyze -v command displays verbose output.
This comes back and shows you a few pages of information on how it determined that bxnd52x.sys caused the issue. Most of this is beyond me, but it's good to have as Microsoft or a vendor could request this information.
There are lots of other commands for finding more information. For example the command "lm N T" can give you all the drivers and modules running on the system at the time of the crash.
I hope you have learnt something out of this. As always I'm looking forward to feedback, so please leave me a comment or shoot me an email to clint@kbomb.com.au.
Wednesday, December 30, 2009
The Low-Down on Password Policies
Below we will be looking at password policies in detail, as this can get confusing in some circumstances. I will cover generic domain password policies as well as the new Server 2008 granular Password Settings Objects (PSOs). There are many articles on the Internet about password policies for Windows networks. The reason for writing this is that I have found many of those articles are missing some key points that are very important! These missing points will be my main focus in this post.
Where do I link the password policy?
Many of the articles on the Internet discussing password policies describe how to configure the password policy settings but not where to link them and why. Password policies are applied to computer objects, not user objects, in Active Directory. There can only be one password policy "per account database".
Microsoft says:
There can be only a single password policy for each account database. An Active Directory domain is considered a single account database, as is the local account database on stand alone computers.
Taken from:
http://technet.microsoft.com/en-us/library/cc875814.aspx
If you were to link your password policy to the "Domain Controllers" OU, the policy would apply to Active Directory and therefore to all user accounts in Active Directory.
If you were to link your password policy at the root domain level, it would hit all computer objects in the Active Directory database including domain controllers (assuming you do not have block policy inheritance set on the Domain Controllers OU). The password policy will also hit the local SAM database on every member workstation in the domain, meaning that not only will Active Directory accounts use the password policy, but machine-local accounts on member workstations will also need to adhere to it.
Microsoft recommends always linking your password policy at the root domain level to ensure the policy covers local accounts as well as Active Directory accounts, for obvious reasons. It is silly having a strict password policy for your domain user accounts but not for your local user accounts - if someone cracked a local user account on a member workstation they could still rootkit the PC and use it as an access point to attack the Active Directory domain.
Here is a quick structure from Microsoft on where to link your policies:
Default Domain Security Policy Settings:
- Password Policy
- Domain Account Lockout Policy
- Domain Kerberos Policy
Default Domain Controller Security Policy Settings:
- User Rights Assignment Policy
- Audit Policy
Taken from:
http://technet.microsoft.com/en-us/library/cc773164.aspx
Do I create a new GPO or use the existing default group policy objects?
Microsoft recommends never modifying the "Default Domain Policy" and "Default Domain Controllers Policy". Create a new group policy object called something like "Corporate Password Policy" and link it at the root domain level. Ensure you assign it a higher priority than the Default Domain Policy so its settings override.
Taken from Microsoft:
It is a best practice to avoid modifying these built-in GPOs, if you need to apply password policy settings that diverge from the default settings, you should instead create a new GPO and link it to the root container for the domain or to the Domain Controllers OU and assign it a higher priority than the built-in GPO: If two GPOs that have conflicting settings are linked to the same container, the one with higher priority takes precedence.
Reference:
http://technet.microsoft.com/en-us/library/cc875814.aspx
The reason behind this is that it makes it much easier to recover from serious problems with security settings. If the new security settings create problems, you can temporarily disable the new Group Policy Object until you isolate the settings that caused the problems.
How do I see who is logging in and out of my domain?
This is not part of password policies - this is actually an audit policy. Audit policies need to be configured on the Domain Controllers OU, as I mentioned above. The Microsoft article linked above describes which policies belong at the domain root level and which belong on the Domain Controllers OU.
Two audit policies I want to draw attention to are "Audit logon events" and "Audit account logon events" as they cause much confusion.
"Audit account logon events" are when user accounts try to authenticate against a domain controller. It can log success or failure depending on how you configure it. This basically checks the users credentials are correct (right username and password).
"Audit logon events" generates events for the creation and destruction of logon sessions. For example things that trigger these events are things like accessing a share. Your already "authenticated" on the network, your just using your kerberos key to access network resources.
For information on the various audit policies see this post:
http://www.enterprisenetworkingplanet.com/netos/article.php/624921
I recommend only configuring the audit policies you need. Do not audit everything or your event logs will be flooded. Personally I only audit "Account logon events", as I only want to see when people logged in and out of the network, and any failed logon attempts.
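If you want to double check what is actually being audited on a Server 2008 (or later) domain controller, auditpol.exe can show the effective settings - this is just a verification step, the configuration itself still comes from Group Policy:
auditpol /get /category:"Account Logon"
auditpol /get /category:"Logon/Logoff"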
So what are these Password Settings Objects in Server 2008?
As I mentioned above, password policies apply at the database level, meaning that every account in the Active Directory database needs to adhere to the same password policy. In Windows 2000/2003, if you wanted a different password policy for different departments or user groups, the only way to achieve this was to create another domain within the same forest.
Windows Server 2008 gets around this with the new Password Settings Objects, or PSOs for short. PSOs can be configured to affect specific users and groups in an Active Directory domain.
PSOs are sometimes referred to as "Granular Password Settings" or "Fine-Grained Password Policy". One thing I would like to point out is that a PSO is not a policy. A PSO is an object in the Active Directory database, much like a user account or computer account. There are two AD classes that make this work:
- Password Settings Container
- Password Setting Objects
To use PSOs your domain functional level must be Windows Server 2008. PSOs do not replace your domain password policy - you still need to configure that. It remains as a fallback in case a PSO does not apply to a particular user.
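If you have the Active Directory PowerShell module from Server 2008 R2 (or the RSAT tools) available, a PSO can also be created and applied from PowerShell. This is only a sketch - the policy name, the values and the "IT Admins" group are all made up for illustration; on plain Server 2008 you would use ADSI Edit or ldifde as shown in the guides linked below.
Import-Module ActiveDirectory
# Create the PSO - lower Precedence numbers win when multiple PSOs apply to the same user
New-ADFineGrainedPasswordPolicy -Name "IT-Admins-PSO" -Precedence 10 -ComplexityEnabled $true -MinPasswordLength 12 -PasswordHistoryCount 24 -MaxPasswordAge "42.00:00:00" -MinPasswordAge "1.00:00:00" -LockoutThreshold 10 -LockoutDuration "0.00:30:00" -LockoutObservationWindow "0.00:30:00" -ReversibleEncryptionEnabled $false
# Apply the PSO to a global security group
Add-ADFineGrainedPasswordPolicySubject -Identity "IT-Admins-PSO" -Subjects "IT Admins"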
If you want a step by step guide on how to configure Password Settings Objects please read these two articles by Jakob H. Heidelberg from windowsecurity.com:
http://www.windowsecurity.com/articles/Configuring-Granular-Password-Settings-Windows-Server-2008-Part-1.html
http://www.windowsecurity.com/articles/Configuring-Granular-Password-Settings-Windows-Server-2008-Part2.html
Wednesday, December 23, 2009
Outlook MAPI Access Exchange 2007 vs 2010
One major difference administrators need to plan for when deploying Exchange 2010 is the change to the Client Access server role. In Exchange 2007, Outlook clients talked MAPI directly to the Exchange 2007 Mailbox server role. In Exchange 2010, the only server that talks to the Mailbox servers is the Client Access server. The Exchange 2010 Client Access server proxies the MAPI requests on to the Mailbox server and applies additional logic. This stops the Mailbox server being hammered with multiple TCP connections from different sources like it was in Exchange 2003 and 2007. This also contributes to how Microsoft achieved the 70% reduction in disk I/O with the 2010 Mailbox server, see:
http://clintboessen.blogspot.com/2009/11/microsoft-recommends-sata-for-exchange.html
In Exchange 2007 the Client Access servers did not use many resources. In Exchange 2010, ensure you take into account that the Client Access servers will be under a heavier load dealing with all the MAPI requests from end clients.
There is a new service on the Exchange 2010 Client Access server called the "Microsoft Exchange RPC Client Access Service" that is responsible for talking MAPI to the Outlook clients on the internal network. The RPC Client Access service also talks to Active Directory on behalf of Outlook clients, something that Outlook used to do directly! Outlook connects to an NSPI endpoint on the Client Access server, and NSPI then talks to Active Directory via the Active Directory driver. The NSPI endpoint replaces the DSProxy component as we know it from Exchange 2007.
This not only improves consistency when applying business logic to clients, but also provides a much better client experience during switchovers and failovers when you have deployed a highly available solution that makes use of the new Database Availability Group (DAG) HA feature.
Monday, December 21, 2009
Failed to change domain affiliation, hr=800704f1
I was performing an Active Directory migration from one Windows Server 2008 FFL (forest functional level) forest to another. User accounts migrated fine, and so did computer accounts. However, when the ADMT agent went to update the domain membership on the member computers in one domain I received the following error in the ADMT agent logs:
2009-12-22 14:21:15 ERR3:7075 Failed to change domain affiliation, hr=800704f1 The system detected a possible attempt to compromise security. Please ensure that you can contact the server that authenticated you.
To get around this, ensure that you have "Allow cryptography algorithms compatible with Windows NT 4.0" enabled in the Default Domain Controllers Policy in both the source and destination domains.
To do this follow Microsoft KB Article 942564:
http://support.microsoft.com/kb/942564
Thursday, December 17, 2009
View System Hardware Information in Linux
You want to be able to view information about a system from a Linux shell - what disks it has, what processor, the system model, the vendor, and so on.
There is a program in Linux called dmidecode which pulls this information straight from the system's BIOS.
You can run dmidecode by simply typing "dmidecode" in a Linux shell. However it spits out a lot of information, so I recommend piping it through "more" or "less".
Run:
dmidecode | more
or
dmidecode | less
You can then scroll through and find out all kinds of cool information about the system, such as what model it is!
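As a side note, most versions of dmidecode also accept a -t (type) switch so you can limit the output to just the section you care about, for example:
dmidecode -t system
dmidecode -t processor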
hpacucli Create Hot Spare
I have a ProLiant DL360 G5 running Red Hat 4.1.2-14.
This server has a Smart Array 6400 with 13 300GB SCSI disks allocated in two arrays and 1 disk unallocated.
This server also has a Smart Array P400i with 4 disks allocated in 2 arrays and 1 disk unallocated.
My goal is to add the unallocated disk as a spare to both arrays on each controller. This server does not have a GUI so I must use the HP Array Configuration Utility CLI called "hpacucli".
To run the HP Array Configuration Utility CLI simply type "hpacucli" in the shell.
Next lets look at all our arrays using the following command:
ctrl all show config
We can see that Smart Array 6400 is in Slot 2 and Smart Array P400i is in Slot 0. We can also see the unassigned disks are physicaldrive 1:8 and physicaldrive 2I:1:5.
To assign the drives as hot spares run the following commands:
ctrl slot=2 array all add spares=1:8
ctrl slot=0 array all add spares=2I:1:5
We specify "array all" as we want to make the disk available to all disks on the controller. We could go "array A" if we wanted to make the spare available to just onje array... I don't know why you would though!
Now if we run the "ctrl all show config" command again we see it's added it in as a spare:
If you want more information about performing other tasks using hpacucli the following link is the best documentation I could find:
http://people.freebsd.org/~jcagle/hpacucli-readme
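One extra tip: on the versions of hpacucli I have used you can also pass the command straight on the command line instead of entering the interactive shell, which is handy for scripting - for example:
hpacucli ctrl all show config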
Monday, December 14, 2009
0x800700005 when creating scheduled task
When creating a scheduled task the following error was experienced:
Error: 0x800700005 access denied when creating a new task
This is because, for some reason, administrators do not have permission to the Task Scheduler folder. To fix this perform the following:
1. Open a command prompt
2. Run “Net Use T: \\%computername%\C$\Windows\Tasks /Persistent:No”
3. Open Explorer, right-click Tasks (T:), go to Properties and click the [Security] tab. In the security permissions grant Administrators and System Full Control if they do not already have it, and then click [Apply]. Then click [Advanced], check [x] Replace permission entries on all child objects, and click [Ok]. Restart the "Task Scheduler" service and give it another try.
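If you would rather script the permission change than click through Explorer, something like the following should achieve a similar result (a rough equivalent only - cacls edits the ACL on the underlying %windir%\Tasks folder and its children, so test it first):
cacls %windir%\Tasks /T /E /G Administrators:F
cacls %windir%\Tasks /T /E /G SYSTEM:F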
Wednesday, December 9, 2009
Sysprep 2008 Server
You have a virtual environment and you have created a 2008 template server. You need to sysprep it to ensure each system cloned from it is unique.
Sysprep is located under c:\Windows\System32\sysprep\sysprep.exe on every 2008 server install.
Tick the box "Generalize", this ensures the server has a new SID. Change the shutdown options to shutdown.
When the virtual server shuts down begin cloning it.
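If you prefer the command line, the same options can be passed as switches - a quick sketch, assuming the default "Enter System Out-of-Box Experience (OOBE)" action, which maps to /oobe:
c:\Windows\System32\sysprep\sysprep.exe /generalize /oobe /shutdown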
Wednesday, December 2, 2009
Exchange 2010 Deploy Assist
If you are looking to migrate to Exchange 2010 from 2003 or 2007, or perhaps perform a new Exchange 2010 installation, there is a cool tool Microsoft has just released called Deploy Assist.
The Deployment Assist tool asks you a series of questions about your environment. Based on your answers it provides step by step instructions on how to perform the migration or fresh installation of Exchange 2010. This is very handy as it covers all kinds of network environments.
To use this tool go to the following link:
http://technet.microsoft.com/exdeploy2010
Tuesday, December 1, 2009
Software Inventory Powershell Script
I wrote a PowerShell script for inventorying software on all PCs in a Windows domain. Below I will show you how to use it and provide you with a copy of the code.
Preparation
For this to work you will need a computer running Microsoft PowerShell. Windows 7 and 2008 R2 come with PowerShell built in. For Windows XP, 2003 and 2008 you need to manually download and install the PowerShell component. This PC must be a member of the domain!
Get powershell from here:
http://www.microsoft.com/windowsserver2003/technologies/management/powershell/download.mspx
Next you need to configure PowerShell to allow unsigned scripts to run. Out of the box PowerShell will not run unsigned local scripts (the default execution policy is Restricted), and to digitally sign scripts you would need a code-signing certificate for the code you want to sign. Open PowerShell with administrator rights to the local system - you may have to use "Run as Administrator" due to UAC!
Close that window and open another PowerShell window that has domain admin rights. This means you need to run the PowerShell window as another user. To perform "Run As" on Windows 7 or Vista please see:
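From that elevated PowerShell window, the execution policy is changed with Set-ExecutionPolicy - RemoteSigned is enough to let a locally created, unsigned script run while still requiring downloaded scripts to be signed:
Set-ExecutionPolicy RemoteSigned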
http://clintboessen.blogspot.com/2009/11/use-old-run-as-windows-7-and-windows.html
Running the Script
This PowerShell script will go through every computer account in Active Directory, ping the computer name and, if it replies, perform the WMI query. This avoids the script failing on computers that are not turned on.
From the PowerShell session running as the domain admin account, navigate to the folder containing the script and run it.
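For example, assuming you saved the script as softwareaudit.ps1 in c:\scripts (both names are just placeholders - use wherever you saved it):
cd c:\scripts
.\softwareaudit.ps1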
You can see it will go through and skip any PCs that are turned off. The data gets pushed into a CSV file which is placed in your user profile. Remember, if you're running as a domain admin the CSV file will be located in the domain admin user's profile, not yours!
Reviewing the Results
Note: do not open this file while the script is still running. Excel takes a lock on the file, which prevents the script from being able to write to it.
Open the CSV file up in Excel and format it so it is easier to read. All the software will be listed along with which computer each application is installed on. Any PCs that have problems with WMI will not be able to be audited; however, in a healthy Windows domain all PCs should be able to respond to WMI queries.
Script Requirement
To be able to query a PC's software you need the Management and Monitoring Tools --> WMI Windows Installer Provider component installed from Add/Remove Windows Components. This feature is built into Windows Vista, 2008 and Windows 7. If it is not installed the script will return the following error:
You will also receive errors when you perform manual WMI queries related to software against machines that do not have this component:
Don't worry, I wrote up a blog post on this problem and have developed a way to mass deploy this Windows component out to all the PCs you're trying to audit. Find it here:
http://clintboessen.blogspot.com/2009/11/wmi-error-invalid-class-0x80041010-fix.html
Copy of the Script
Here is the full copy of my script:
$datetime = Get-Date -Format "yyyyMMddhhmmss";
$strCategory = "computer";
# Create a Domain object. With no params will tie to computer domain
$objDomain = New-Object System.DirectoryServices.DirectoryEntry;
$objSearcher = New-Object System.DirectoryServices.DirectorySearcher; # AD Searcher object
$objSearcher.SearchRoot = $objDomain; # Set Search root to our domain
$objSearcher.Filter = ("(objectCategory=$strCategory)"); # Search filter
$colProplist = "name";
foreach ($i in $colPropList)
{
$objSearcher.PropertiesToLoad.Add($i);
}
$colResults = $objSearcher.FindAll();
# Add column headers
Add-Content "$Env:USERPROFILE\softwareaudit $datetime.csv" "Computer,Caption,Description,Identifying Number,Installation Date,Installation Date 2,Installation Location,Installation State,Name,Package Cache,SKU Number,Vendor,Version";
foreach ($objResult in $colResults)
{
$objComputer = $objResult.Properties;
$computer = $objComputer.name;
# Ping the computer so we can skip machines that are offline
$pingStatus = Get-WmiObject -Class Win32_PingStatus -Filter "Address = '$computer'";
$ipAddress = $pingStatus.ProtocolAddress;
if($pingStatus.StatusCode -eq 0)
{
Write-Host -ForegroundColor Green "Ping Reply received from $computer.";
write-host "Connecting to $computer..."
$colItems = get-wmiobject -class "Win32_Product" -namespace "root\CIMV2" `
-computername $computer
write-host "#############################"
write-host "Computer: " $computer
write-host "#############################"
write-host
write-host
foreach ($objItem in $colItems)
{
write-host "Caption: " $objItem.Caption
$caption = $objItem.Caption
write-host "Description: " $objItem.Description
$description = $objItem.Description
write-host "Identifying Number: " $objItem.IdentifyingNumber
$identifier = $objItem.IdentifyingNumber
write-host "Installation Date: " $objItem.InstallDate
$installdate = $objItem.InstallDate
write-host "Installation Date 2: " $objItem.InstallDate2
$installdate2 = $objItem.InstallDate2
write-host "Installation Location: " $objItem.InstallLocation
$installlocation = $objItem.InstallLocation
write-host "Installation State: " $objItem.InstallState
$installstate = $objItem.InstallState
write-host "Name: " $objItem.Name
$name = $objItem.Name
write-host "Package Cache: " $objItem.PackageCache
$packagecache = $objItem.PackageCache
write-host "SKU Number: " $objItem.SKUNumber
$skunumber = $objItem.SKUNumber
write-host "Vendor: " $objItem.Vendor
$vendor = $objItem.Vendor
write-host "Version: " $objItem.Version
$version = $objItem.Version
write-host
# Need to wrap each value in quotation marks (") as some of the values from the WMI query have commas in them that would mess up the csv file
$sc = [char]34
Add-Content "$Env:USERPROFILE\softwareaudit $datetime.csv" "$sc$computer$sc,$sc$caption$sc,$sc$description$sc,$sc$identifier$sc,$sc$installdate$sc,$sc$installdate2$sc,$sc$installlocation$sc,$sc$installstate$sc,$sc$name$sc,$sc$packagecache$sc,$sc$skunumber$sc,$sc$vendor$sc,$sc$version$sc"
}
}
else
{
Write-Host -ForegroundColor Red "No Ping Reply received from $computer.";
}
}
The script automatically pulls the domain information from the local computer's domain membership.
Deleting old Computer Accounts
If you find there are hundreds of computer accounts that are no longer in use, and the script is sitting there pinging computer account after computer account, taking ages to move through the list, you may want to clean up old computer accounts in Active Directory.
To find all computers that have been inactive for the last four weeks and remove them run the following command on a domain controller:
dsquery computer -inactive 4 | dsrm
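It is worth running the dsquery half on its own first so you can review what is about to be deleted - the -limit 0 switch removes the default cap of 100 results, and dsrm will prompt for each object unless you add -noprompt:
dsquery computer -inactive 4 -limit 0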
Monday, November 30, 2009
WMI Error Invalid Class 0x80041010 fix
I went and wrote a PowerShell script that performs a software audit of a Microsoft network. However, half the machines on the network returned an error saying Invalid Class 0x80041010. When I ran a WMI query manually against one of the failing computers, sure enough it failed again!
wmic /Failfast:on /node:"ausdc01" product GET /all
Node - AUSDC01
ERROR:
Code = 0x80041010
Description = Invalid class
Facility = WMI
When powershell queried it I also got an error:
Get-WmiObject : Invalid Class
When performing these queries against the server the following event logs were generated under the Application logs:
Event Type: Error
Event Source: WinMgmt
Event Category: None
Event ID: 10
Date: 1/12/2009
Time: 9:57:43 AM
User: N/A
Computer: AUSDC01
Description:
Event filter with query "select * from MSMCAEvent_MemoryError where (type = 3221553223) and (LogToEventlog <> 0)" could not be (re)activated in namespace "//./root/WMI" because of error 0x80041010. Events may not be delivered through this filter until the problem is corrected.
Event Type: Error
Event Source: WinMgmt
Event Category: None
Event ID: 10
Date: 1/12/2009
Time: 9:57:43 AM
User: N/A
Computer: AUSDC01
Description:
Event filter with query "select * from MSMCAEvent_PCIBusError where (type = 2147811416) and (LogToEventlog <> 0)" could not be (re)activated in namespace "//./root/WMI" because of error 0x80041010. Events may not be delivered through this filter until the problem is corrected.
Event Type: Error
Event Source: WinMgmt
Event Category: None
Event ID: 10
Date: 1/12/2009
Time: 9:57:43 AM
User: N/A
Computer: AUSDC01
Description:
Event filter with query "select * from MSMCAEvent_SMBIOSError where (type = 3221553253) and (LogToEventlog <> 0)" could not be (re)activated in namespace "//./root/WMI" because of error 0x80041010. Events may not be delivered through this filter until the problem is corrected.
Event Type: Error
Event Source: WinMgmt
Event Category: None
Event ID: 10
Date: 1/12/2009
Time: 9:57:43 AM
User: N/A
Computer: AUSDC01
Description:
Event filter with query "select * from MSMCAEvent_CPUError where (type = 2147811392) and (LogToEventlog <> 0)" could not be (re)activated in namespace "//./root/WMI" because of error 0x80041010. Events may not be delivered through this filter until the problem is corrected.
Event Type: Error
Event Source: WinMgmt
Event Category: None
Event ID: 10
Date: 1/12/2009
Time: 9:57:43 AM
User: N/A
Computer: AUSDC01
Description:
Event filter with query "select * from MSMCAEvent_PlatformSpecificError where (type = 3221553255) and (LogToEventlog <> 0)" could not be (re)activated in namespace "//./root/WMI" because of error 0x80041010. Events may not be delivered through this filter until the problem is corrected.
I installed WMITools on both a server that was working correctly and a server that was not working. Download WMITools from here:
http://www.microsoft.com/downloads/details.aspx?familyid=6430F853-1120-48DB-8CC5-F2ABDC3ED314&displaylang=en
In WMI CIM Studio I saw that the CIM_Products\Win32_Product WMI class existed on one server but not on the other.
Server that fails with the WMI error 0x80041010:
Server that worked:
After some more research I found out that the Win32_Product class gets added when the Management and Monitoring Tools --> WMI Windows Installer Provider component is installed.
Add this component and it will resolve the problem:
Remotely Pushout WMI Windows Installer Provider
In my case I want to use the WMI Windows Installer Provider to perform a software inventory of my network. I cannot do this if the component isn't installed on every computer throughout my domain. It is installed by default on Vista, 2008 and Windows 7, but not on Windows XP or 2003.
To do this first we must place the i386 folder from a Windows 2003 CD on a network share as WMI Windows Installer Provider requires a few files from it.
Next we need to create a custom .reg file to change where our computers look for the i386 directory for Windows component files. This data is located under:
HKLM\Software\Microsoft\Windows\CurrentVersion\Setup
The two main values we need to change are CDInstall, which should be 0 as we are not installing from a CD, and SourcePath. Whatever you put as SourcePath, the Add/Remove Windows Components utility will append \i386 to the end of it. I shared my i386 directory out as "i386", so to navigate to it I type \\ausdc01\i386. This means for my SourcePath I enter \\ausdc01.
The default value for SourcePath is D:
Next we need to export the changes to a .reg file. Right click on Setup and click Export. Make sure the export range is set to Selected branch and not all.
Once exported, open the .reg file in Notepad. There are many registry subkeys under the Setup key that will have been exported along with our data. These are not required and need to be removed - please click the image below to enlarge; everything south of the red square should be removed. Also remove BootDir in case we actually do have a server that doesn't have C:\ as its boot partition!
Copy the registry file you have created to a network share. In my environment I just used the netlogon directory, in which I created a folder called wmichange: \\domain\netlogon\wmichange.
You could use PsTools or a startup script to make this change on servers now by simply running:
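For reference, the trimmed file should end up looking roughly like this (a sketch for my example share \\ausdc01 - note that backslashes are doubled in .reg string values, and any other values you left under Setup can simply stay in the file):
Windows Registry Editor Version 5.00

[HKEY_LOCAL_MACHINE\SOFTWARE\Microsoft\Windows\CurrentVersion\Setup]
"CDInstall"=dword:00000000
"SourcePath"="\\\\ausdc01"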
regedit /s \\domain\netlogon\wmichange\installsource.reg
However, we are going to encompass this as part of the same script.
Next we need to make an unattended setup file, much like we do for large-scale desktop or server deployments. Create the file under the same directory as above - I called mine answer.txt.
\\domain\netlogon\wmichange\answer.txt
In the answer file enter:
[Components]
WbemMSI = On
This is what is required to install the Management and Monitoring Tools --> WMI Windows Installer Provider component.
You can specify any of the Add/Remove Windows Components in an answer file. Here are some good links for future reference:
http://itk.samfundet.no/dok-filer/ms-xp-sp2-deployment-ref/u_components.htm
http://forums.techarena.in/server-scripting/738510.htm
The command to kick off this unattended install is:
sysocmgr.exe /i:%windir%\inf\sysoc.inf /u:\\domain\netlogon\wmichange\answer.txt
This will go through and add the WMI Windows Installer Provider component for us.
This should install the component without even prompting for any user interaction. It will pull the files off the network share as configured above.
Now, finally, let's create a batch script called run.bat under our \\domain\netlogon\wmichange\ directory.
Put both commands in the bat file:
regedit /s \\domain\netlogon\wmichange\installsource.reg
sysocmgr.exe /i:%windir%\inf\sysoc.inf /u:\\domain\netlogon\wmichange\answer.txt
Now you have two methods to deploy this: you can either run it remotely using PsExec or you can use startup scripts. Below I will only show how to use PsExec.
PsExec is a program for remotely executing commands and is part of the PsTools pack... get it from:
http://technet.microsoft.com/en-us/sysinternals/bb896649.aspx
PsExec has the capability of running a command against every computer in a domain, or against a list of computers from a text file. Remember, only 2003 and XP don't have the WMI Windows Installer Provider installed by default, so we only want to target those machines.
Below I will run the command against a single computer CANHQDC01 which did not have the WMI Windows Installer Provider installed.
psexec \\CANHQDC01 "\\domain\netlogon\wmichange\run.bat"
Below shows you the output of the command and how it carried out all the tasks:
Error code 0 is good, it means there were no errors. Now I can perform software audits on CANHQDC01 using WMI where before I couldn't!
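If you need to hit a list of machines rather than one at a time, PsExec will also accept a text file of computer names, one per line - computers.txt here is just a name I made up:
psexec @c:\computers.txt "\\domain\netlogon\wmichange\run.bat"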
Test Exchange Connectivity
I found an awesome tool for testing Exchange connectivity for things like Exchange ActiveSync, ActiveSync AutoDiscover, synchronization, notifications, availability, OOF and inbound SMTP.
To use it simply go to:
https://www.testexchangeconnectivity.com/
Sunday, November 29, 2009
Converting Active Directory Integer8 values
Active Directory stores dates as Integer8 (64-bit number) values. These cannot be read easily by the human eye.
There are many ways to convert these values... I'll show you a few tricks to do this.
Let's export an Integer8 value:
dsquery * "CN=computerobject,OU=Servers,DC=Domain,DC=internal" -attr lastLogon
lastLogon
129040375603051932
One way you can accomplish this is by using PowerShell:
$(get-date 1/1/1601).adddays(($(&dsquery * "CN=computerobject,OU=Servers,DC=Domain,DC=internal" -attr lastLogon)[1].Trim())/(60*10000000)/1440)
If you don't have PowerShell you can simply use the w32tm utility:
w32tm /ntte 129040375603051932
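Another option, not covered above but using standard .NET from PowerShell, is to let the DateTime class convert the raw file-time value directly (the value below is the same sample lastLogon as above):
[DateTime]::FromFileTimeUtc(129040375603051932)
FromFileTimeUtc returns the timestamp in UTC; use FromFileTime instead if you want it converted to local time.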
Thursday, November 26, 2009
Recovering Windows Encrypted Data for EFS and Bitlocker
In this post I'm going to go over two encryption methods built natively into Windows, EFS (Encrypting File System) and BitLocker, and how to recover the data should the encryption keys become lost or corrupt.
BitLocker
BitLocker requires a Trusted Platform Module (TPM) v1.2 or higher. The TPM is a security chip built into the computer's hardware - the machine either has it or it doesn't. (Strictly speaking, BitLocker can also be configured through Group Policy to use a USB startup key instead of a TPM, but the TPM scenario is what is covered here.)
The TPM contains information such as:
- the recovery password
- TPM owner password
- information required to identify which computers and volumes the recovery information applies to.
On a Microsoft network, if you deploy BitLocker to all your Vista / Windows 7 workstations or 2008 member servers, you can deploy a Group Policy to store this TPM backup data in Active Directory. This ensures that data can always be recovered by authorized users even if the physical computer containing the BitLocker-protected hard drive fails. You cannot save recovery information in Active Directory if the domain controller is running a version of Windows Server earlier than Windows Server 2003 with SP1.
There are 5 files needed to achieve TPM password backups:
- Add-TPMSelfWriteACE.vbs
- BitLockerTPMSchemaExtension.ldf
- List-ACEs.vbs
- Get-TPMOwnerInfo.vbs
- Get-BitLockerRecoveryInfo.vbs
Download these files from here:
http://www.microsoft.com/downloads/details.aspx?FamilyID=3a207915-dfc3-4579-90cd-86ac666f61d4&DisplayLang=en
If you are at 2003 functional level you have to perform a schema extension to create the attributes in Active Directory required to store the TPM recovery information. If you are at 2008 functional level this is not required, as the schema already includes these attributes.
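For reference, the schema extension is normally imported with ldifde. A hedged sketch, assuming your domain's distinguished name is DC=domain,DC=internal (substitute your own):
ldifde -i -v -f BitLockerTPMSchemaExtension.ldf -c "DC=X" "DC=domain,DC=internal" -k -j .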
If you are serious about backing up TPM Recovery Information to Active Directory, please carefully read the following technet article:
http://technet.microsoft.com/en-us/library/cc766015(WS.10).aspx
I do not recommend implementing BitLocker in an Active Directory organisation without backing up the TPM recovery information from all BitLocker-protected machines on the network.
Encrypted File System
The Encrypting File System uses Data Recovery Agents (DRAs) to back up encryption keys. You can have one or more DRAs for different departments under different organisational units, or everyone under a single DRA.
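As a side note not covered in the article linked below, if you do not want to issue the recovery agent certificate from a CA, a recovery agent key pair can also be generated with the built-in cipher utility and the resulting .cer file added as a DRA through Group Policy. A rough sketch (the file name is just an example):
cipher /r:EFSRecoveryAgent
This creates EFSRecoveryAgent.cer and EFSRecoveryAgent.pfx in the current directory; keep the .pfx somewhere very safe.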
When you set up EFS you create a version 2 EFS user certificate from the EFS certificate template and roll the certificates out to all users using auto-enrollment (requires 2003 functional level).
If you are looking to implement EFS in your organisation, make sure you completely understand Data Recovery Agents by reading the following article:
http://technet.microsoft.com/en-us/library/bb457020.aspx
Wednesday, November 25, 2009
5.1.0 500-'Firewall Error'
When trying to send outbound external email we were intermittently getting the following error for particular messages:
[203.10.1.143] #<[203.10.1.143] #5.0.0 smtp; 5.1.0 - Unknown address error 500-'Firewall Error' (delivery attempts: 0)> #SMTP#;
CAUSE: The Cisco firewall has a configuration entry like the following (it may have additional parameters specified on the line in addition to esmtp):
ip inspect name esmtp
This problem occurs because of incompatibilities or restrictions caused by the Cisco firewall configuration. It is more likely to occur if you are sending an email to multiple recipients or using a distribution list in Exchange.
FIX: Disable this entry in the Cisco firewall configuration by inserting the word "no" at the beginning of the line as shown, so that it now reads something like:
no ip inspect name esmtp
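If you want to confirm which inspection rules are configured before and after making the change (assuming an IOS firewall using CBAC inspection, as the syntax above suggests), something like the following should list them:
show ip inspect config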
Other people have seen this problem as well:
http://www.solutions.pro/email-server-generates-500-firewall-error-error-8/
http://nzschooltech.blogspot.com/2009/09/exchange-management-console-issue-on.html
http://mattlog.net/2008/12/31/exchange-2007-500-firewall-error/
Tuesday, November 24, 2009
Email Dossier - A great way to test email addresses
An awesome website for testing email addresses - it checks the MX records and runs a test SMTP conversation for them - is:
http://centralops.net/co/EmailDossier.aspx
Here is a screenshot of the type of information it produces:
Saturday, November 21, 2009
SMTP Site Links in Active Directory - When To Use
I'm sure most of you Microsoft administrators out there have seen SMTP site links before. What are they, and how are they different from the standard IP site links used by the DRA (Directory Replication Agent - the component that compresses replication data for inter-site replication)?
If you have an extremely unreliable, high-latency connection, SMTP site links are the better choice over IP site links. Satellite connections are the classic example - synchronous RPC-over-IP replication copes very poorly with that kind of latency, so SMTP is used for replication instead.
One limitation of SMTP site links is that they cannot replicate the domain naming context or SYSVOL; they only replicate the Schema and Configuration naming contexts and Global Catalog data. Because of this, SMTP site links can only be used between domain controllers from different domains in the same Active Directory forest (remember, Active Directory Sites and Services describes the physical network topology for all logical domains in a forest - not just for one domain as we usually see).
If you have a company with a remote site connected over a satellite link, you need a domain controller on site, as satellite latency is way too high to authenticate over the WAN. Your only option is then to create a child domain for that remote site. This is not a bad thing - remember, child domains have transitive trusts and can access all resources in the parent domain if granted permissions - so don't be scared of creating additional domains; it is very easy. Most companies try to get away with just a single Active Directory domain, but in some cases you are required to have additional domains!
For more information please read:
http://technet.microsoft.com/en-us/library/bb742427.aspx
http://support.microsoft.com/kb/244368
Microsoft Recommends SATA for Exchange 2010
Exchange 2010 has a new I/O pattern that results in 70 percent lower I/O requirements than Exchange 2007 (and Exchange 2007 had 70 percent lower I/O requirements than Exchange 2003!). This reduced I/O pattern, thanks to optimizations that mean writes no longer come in bursts, combined with advancements in SATA drives, means SATA is now a realistic storage platform for Exchange 2010, where previously it was really only used in desktop systems.
Microsoft recommends that when using SATA storage you have your mailboxes replicated to at least 3 other servers in your organisation using DAGs (Database Availability Groups) - the latest and greatest in Exchange mailbox high availability... see:
http://clintboessen.blogspot.com/2009/08/exchange-2010-database-mobility.html
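As a rough illustration of what this looks like in practice (the database and server names below are made up), adding a replicated copy of a mailbox database to another DAG member is a one-liner in the Exchange Management Shell:
Add-MailboxDatabaseCopy -Identity "DB01" -MailboxServer MBX02 -ActivationPreference 2
Repeat this for each additional server you want to hold a copy.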
Mailbox databases are replicated and made highly available with load balancing across your organisation, so no single server becomes mission critical. When deployed correctly, if a single mailbox server were to fail, no one would care... whereas with previous versions people would be panicking. The server can simply be repaired whenever the administrator gets a chance... 3 months down the line if need be.
Also, as long as you have your mailboxes replicated to at least 3 servers, Microsoft says you no longer need to worry about backing up to expensive tape devices and storage mediums. This also comes about with the new Dumpster 2.0 replacing the old Dumpster 1.0 that was used in Exchange 2003 and 2007. If you need all mail to be kept for 10 years, even after a user has deleted it, you can configure this using the dumpster, allowing administrators to retrieve deleted items from the DAG on any physical server. Quoted from Microsoft: "Your organization can rely on the Exchange 2010 high availability infrastructure—which can provide up to sixteen replicated database copies—rather than tape backups to recover from failures, which helps you to reduce operating costs."
For more about dumpster 2.0 read:
http://clintboessen.blogspot.com/2009/10/exchange-2010-dumpster-20.html
Because SATA is a cheap storage solution where you can buy 2TB drives for next to nothing, these disks will be locally attached to the servers. My recommendation is not to virtualize the mailbox servers, as they usually have high load even in a load-balanced solution. This is one of very few servers in a Windows environment I would not virtualize - nearly everything else should be virtualized. Microsoft does not recommend virtualizing servers with high utilization!
Links:
http://www.microsoft.com/exchange/2010/en/my/storage.aspx
http://www.microsoft.com/exchange/2010/en/us/mailbox-resiliency.aspx
Friday, November 20, 2009
Windows 2008 Versions - Hardware Support
Below I will list the different versions of Server 2008 and what each supports in terms of hardware (maximum memory and processor sockets).
Server 2008 Web Server x86 supports 4GB Memory and 4 processors
Server 2008 Web Server x64 supports 4GB Memory and 4 processors
Server 2008 Standard x86 supports 4GB Memory and 4 processors
Server 2008 Standard x64 supports 32GB Memory and 4 processors
Windows 2008 Enterprise x86 supports 64GB Memory and 8 processors
Windows 2008 Enterprise x64 supports 2TB Memory and 8 processors
Windows 2008 Datacenter x86 supports 64GB of Memory and 32 processors
Windows 2008 Datacenter x64 supports 2TB of Memory and 64 processors
Windows 2008 Datacenter can only be purchased OEM.
Thursday, November 19, 2009
Citrix RDP Logon Error
The following error was experienced when logging on to a Citrix server over RDP with a standard user account. Administrator accounts could still log on with RDP.
The desktop you are trying to open is currently available only to administrators. Contact your administrator to confirm that the correct settings are in place for your client connection.
Also the error message "To log on to this remote computer, you must have Terminal Server User Access permissions on this computer."
The second error usually appears when you are trying to log on to a terminal server and you do not have the "Allow log on through Terminal Services" user rights assignment in place. In most cases you nest the user account or Domain Users inside "Remote Desktop Users", a group that is always granted "Allow log on through Terminal Services".
However, with Citrix installed, even though these permissions are correct it will still not log in. To resolve this, run tscc.msc to open Terminal Services Configuration, go to the properties of the RDP-Tcp connection, go to the Citrix Settings tab, and untick "Non-administrators only launch published applications".
Wednesday, November 18, 2009
Use Old Run As Windows 7 and Windows Vista
In Windows XP/2003 you had a feature called "Run As" to run a program as another user. In Windows Vista and Windows 7, when you right-click there is only "Run as administrator", which runs the program under an available administrator account on the local computer or domain.
For Windows 7 this menu still exists, it's just hidden. To be able to run a program as a different user, press CTRL + SHIFT and then right-click on the application.
For Windows Vista there is no way to do this with the core Windows build apart from opening a command prompt and using the runas.exe command-line tool.
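For example, to launch a tool under a different account from the command line (the account name here is just a placeholder):
runas /user:DOMAIN\adminuser notepad.exe
You will be prompted for that account's password and the program then runs under its credentials.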
However, Sysinternals has come to the rescue here with a small applet called ShellRunas. Grab it from here:
http://technet.microsoft.com/en-us/sysinternals/cc300361.aspx
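Once downloaded, ShellRunas adds a "Run as different user" entry to the context menu; per the Sysinternals page this is registered with:
shellrunas /reg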
Connect to Printer Error Windows Vista and Windows 7
The following error was experienced when trying to add a printer from a 2003 print server to a Windows 7 machine.
Windows cannot connect to the printer. No printers were found.
To get around this, instead of adding the printer as a network printer (like you would normally do), add it as a local printer and create a local printer port. Perform the following steps to do this:
Select Add a Local Printer.
Select Create a New port... in the port box select Local Port.
Under "Enter a port name", enter the UNC path of the printer on the print server (for example, \\printserver\printersharename).
Click Have disk to manually select the driver.
Download the driver from the manufacturer's website. Windows Vista and Windows 7 drivers are very similar; if the manufacturer only has an x86 or x64 Vista driver it will most likely work on Windows 7. In my case I downloaded the driver from the Ricoh website as I have a Ricoh Aficio MP C4500 printer. Select the driver in the wizard.
Sometimes drivers have support for multiple printers. Select the right printer that matches your model.
Provide the printer with a name
Wait for the driver to install.
Print a test page to verify that the procedure worked. Click finish.
This error is pretty generic and can have multiple causes, but generally it occurs when the print server does not have the correct print driver for your version of Windows. On a Windows 2003 print server you cannot see whether it has Windows Vista or Windows 7 driver support.
To add in support for Windows Vista and Windows 7 you need to use the Print Management console from a Windows 2008, Vista or Windows 7 PC. The Print Management console comes as part of RSAT (the Remote Server Administration Tools).
To download RSAT for Windows Vista go to:
http://www.microsoft.com/downloads/details.aspx?FamilyId=9FF6E897-23CE-4A36-B7FC-D52065DE9960&displaylang=en
To download RSAT for Windows 7 go to:
http://www.microsoft.com/downloads/details.aspx?FamilyID=7D2F6AD7-656B-4313-A005-4E344E43997D&displaylang=en
Then simply add the driver to the print server for Windows Vista or Windows 7.