How to perform an offline audit of your Active Directory NTLM hashes

It’s read-only Friday so I decided to perform an offline audit of our Active Directory passwords.

I found this great tool: https://gitlab.com/chelmzy/five-minute-password-audit which in turn is a fork of this tool: https://github.com/DGG-IT/Match-ADHashes

What I’m going to write here is mostly a repeat of these two Git repos, with a few tweaks and corrections.

To perform this procedure you will need to be able to log in to a Domain Controller. You’re also going to want a secure location to perform all of this work so the dumped list of usernames and hashes doesn’t escape your control.

The secure location should be a workstation or server running the same or a newer version of Windows than your Domain Controllers. For example, if you’re running AD on 2012 R2 you can’t complete this on a 2008 R2 box. Your secure workstation or server will also need to be running PowerShell 5.0 or newer.
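
If you’re not sure what version of PowerShell the secure machine is running, a quick check is:

# Shows the PowerShell version of the current session; you want 5.0 or newer
$PSVersionTable.PSVersion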

Step 1 – Export NTDS.dit and the SYSTEM hive

  1. Login to a domain controller
  2. Open a Command Prompt window
  3. Type “ntdsutil”
  4. Click ‘Yes’ if the UAC prompts you
  5. Run the following commands:
    activate instance ntds
    ifm
    
    # Replace <DOMAINNAME> with your domain’s name
    create full c:\temp\<DOMAINNAME>-audit
    
    # Wait for command to complete
    quit
    quit
  6. Transfer “C:\Temp\<DOMAINNAME>-audit” to the secure location where you’ll work on it. I do not recommend performing the rest of these steps on your Domain Controllers.
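
As an aside, ntdsutil will also accept the whole sequence from step 5 as arguments on a single command line. Something like this should be equivalent (untested here, and <DOMAINNAME> is still a placeholder):

ntdsutil "activate instance ntds" "ifm" "create full c:\temp\<DOMAINNAME>-audit" quit quit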

Step 2 – Download the latest Have I Been Pwned Offline NTLM password list

  1. Go to https://haveibeenpwned.com/Passwords
  2. Scroll to the bottom and download the “ordered by prevalence” NTLM link
  3. Once downloaded, transfer the password list to your secure location in the audit directory and extract it
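
The NTLM list typically comes as a .7z archive, so you’ll need something like 7-Zip to extract it. Assuming 7-Zip is installed in its default location and you grabbed the v5 list, extraction from PowerShell could look like this (adjust the paths and file name to whatever you actually downloaded):

# Hypothetical 7-Zip path and archive name - adjust for your environment
& "C:\Program Files\7-Zip\7z.exe" x .\pwned-passwords-ntlm-ordered-by-count-v5.7z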

Step 3 – Convert the hashes in the NTDS.dit file to Hashcat format

  1. On your secure workstation/server launch PowerShell as an administrator (right click, run as administrator on the PowerShell shortcut)
  2. Install the DSInternals tools by running
    Install-Module -Name DSInternals -Force
  3. Go into the audit directory
    cd c:\temp\<DOMAINNAME>-audit
  4. Convert the hashes
    $key = Get-BootKey -SystemHivePath .\registry\SYSTEM
    
    # Change <DOMAINNAME> to your domain’s name
    Get-ADDBAccount -All -DBPath '.\Active Directory\ntds.dit' -BootKey $key | Format-Custom -View HashcatNT | Out-File <DOMAINNAME>-hashes.txt -Encoding ASCII
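
Before moving on it’s worth a quick sanity check that the export looks right. Each line should be a username:hash pair:

# Peek at the first few lines of the export
Get-Content .\<DOMAINNAME>-hashes.txt -TotalCount 5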

Step 4 – Compare your hashes to HIBP

The code in the Git repos I linked at the beginning of the article is written as a function. For myself, I just wanted a script I could execute with the appropriate parameters instead of futzing around with importing the function.

I also tweaked the original script for formatting (I like a bit more white space personally), added CSV headers, removed the spaces after the commas in the output, had the script append its execution time to the end of the CSV file, and allowed for relative filenames as parameters instead of requiring absolute paths.

Here is my version of the script:

<#
This is a slightly altered version of https://gitlab.com/chelmzy/five-minute-password-audit/blob/master/Match-ADHashes.ps1 which is in turn a slightly altered version of https://github.com/DGG-IT/Match-ADHashes/ for no nonsense output. All credit to them.
.NAME
    Match-ADHashes
.SYNOPSIS
    Matches AD NTLM Hashes against other list of hashes
.DESCRIPTION
    Builds a hashmap of AD NTLM hashes/usernames and iterates through a second list of hashes checking for the existence of each entry in the AD NTLM hashmap
        -Outputs results as object including username, hash, and frequency in database
        -Frequency is included in output to provide additional context on the password. A high frequency (> 5) may indicate the password is commonly used and not necessarily linked to a specific user's password re-use.
.PARAMETER ADNTHashes
    File Path to 'Hashcat' formatted .txt file (username:hash)
.PARAMETER HashDictionary
    File Path to 'Troy Hunt Pwned Passwords' formatted .txt file (HASH:frequencycount)
.PARAMETER Verbose
    Provide run-time of function in Verbose output
.EXAMPLE
    $results = Match-ADHashes -ADNTHashes C:\temp\adnthashes.txt -HashDictionary C:\temp\Hashlist.txt
.OUTPUTS
    Array of HashTables with properties "User", "Frequency", "Hash"
    User                            Frequency Hash                            
    ----                            --------- ----                            
    {TestUser2, TestUser3}             20129     H1H1H1H1H1H1H1H1H1H1H1H1H1H1H1H1
    {TestUser1}                     1         H2H2H2H2H2H2H2H2H2H2H2H2H2H2H2H2
.NOTES
    If you are seeing results for User truncated as {user1, user2, user3...} consider modifying the Preference variable $FormatEnumerationLimit (set to -1 for unlimited)
    
    =INSPIRATION / SOURCES / RELATED WORK
        -DSInternals Project https://www.dsinternals.com
        -Checkpot Project https://github.com/ryhanson/checkpot/
    =FUTURE WORK
        -Performance Testing, optimization
        -Other Languages (golang?)
.LINK
    https://github.com/DGG-IT/Match-ADHashes/
#>

param(
    [Parameter(Mandatory = $true)]
    [System.IO.FileInfo] $ADNTHashes,
    [Parameter(Mandatory = $true)]
    [System.IO.FileInfo] $HashDictionary
)

process {
    $stopwatch = [System.Diagnostics.Stopwatch]::StartNew()

    # Set the current location so .NET will be nice and accept relative paths
    [Environment]::CurrentDirectory = Get-Location

    # Declare and fill a new hashtable with the AD NT hashes. Hashes are converted to upper case so they match the format of the HIBP list
    $htADNTHashes = @{}
    Import-Csv -Delimiter ":" -Path $ADNTHashes -Header "User","Hash" | % {$htADNTHashes[$_.Hash.toUpper()] += @($_.User)}
       
    # Create Filestream reader
    $fsHashDictionary = New-Object IO.Filestream $HashDictionary,'Open','Read','Read'
    $frHashDictionary = New-Object System.IO.StreamReader($fsHashDictionary)

    # Output CSV headers
    Write-Output "Username,Frequency,Hash"

    #Iterate through HashDictionary checking each hash against ADNTHashes
    while ($null -ne ($lineHashDictionary = $frHashDictionary.ReadLine())) {
        if ($htADNTHashes.ContainsKey($lineHashDictionary.Split(":")[0].ToUpper())) {
            $user = $htADNTHashes[$lineHashDictionary.Split(":")[0].ToUpper()]
            $frequency = $lineHashDictionary.Split(":")[1]
            $hash = $lineHashDictionary.Split(":")[0].ToUpper()
            Write-Output "$user,$frequency,$hash"
        }
    }

    $stopwatch.Stop()

    Write-Output "Function Match-ADHashes completed in $($stopwatch.Elapsed.TotalSeconds) Seconds"
}
    
end {
}

 

To execute it, copy/paste it into Notepad and save it as ‘myAudit.ps1’ or whatever file name you’d like.

Now perform your audit:

# Replace <DOMAINNAME> with your domain name
.\myAudit.ps1 -ADNTHashes <DOMAINNAME>-hashes.txt -HashDictionary <HIBP TEXT FILE> | Out-File <DOMAINNAME>-PasswordAudit.csv

# Example
.\myAudit.ps1 -ADNTHashes myDomain-hashes.txt -HashDictionary pwned-passwords-ntlm-ordered-by-count-v5.txt | Out-File myDomain-PasswordAudit.csv

The final result will be a CSV file you can dig through.
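
If the CSV is large, a quick way to surface the worst offenders is to sort it by frequency. A minimal sketch (note the script appends a timing line to the end of the file; that row will just sort to the bottom, or you can delete it first):

# Show the 20 hashes found in your AD with the highest HIBP frequency
Import-Csv .\<DOMAINNAME>-PasswordAudit.csv |
    Sort-Object { [int]$_.Frequency } -Descending |
    Select-Object -First 20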

Step 5 – Clean it all up

The output may or may not surprise you, but whatever the outcome, when you’re done you want to get rid of the <DOMAINNAME>-hashes.txt file and the exported NTDS.dit directory as soon as possible. If someone snags a copy of those you’ll likely get in some serious trouble.

Head on over to Sysinternals and grab SDelete:

.\sdelete.exe -p 7 -r -s <DIRECTORY OR FILE>
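
For this audit that means the exported audit directory plus the hashes file (and the CSV, if you don’t want to keep it). Assuming the paths used earlier, the cleanup might look like this; -accepteula just suppresses the first-run license prompt:

# Securely wipe the NTDS.dit export and the extracted hashes (7 overwrite passes)
.\sdelete.exe -accepteula -p 7 -r -s c:\temp\<DOMAINNAME>-audit
.\sdelete.exe -accepteula -p 7 <DOMAINNAME>-hashes.txt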

 

Script to sync Domain Controller SSL Certificates to a specific host

We have an application that uses LDAP over SSL to authenticate users via Active Directory. The server running the application is a member of the domain and has the domain’s Root CA installed in its local certificate store.

Technically the Root CA should be good enough for the server and any applications on it to trust the SSL certificates on our domain controllers because they are signed by that Root CA. Not the case for this application.

We have four Domain Controllers each with a different SSL certificate that expires yearly and each with a different expiry date. Exporting and importing these certificates manually is going to be a huge annoyance.

I wrote a PowerShell script to handle doing it automatically for us. This script is being run against 2012 R2 Domain Controllers, which is why I use the PowerShell module for exporting the certificates, and the target is 2008 R2, which is why the import is handled via ‘certutil’. You could easily swap these out in the script to suit your needs.

# Domain Controller SSL Certificate syncing script
#
# Created by: Eric Schewe
# Created on: 2018-10-11
#
# Permission Requirements
# -----------------------------------------------------
# "Remote Management Users" in the domain that holds the DCs you're pulling the certificates from
# Local Admin on the destination server
# 'Log on as a batch job' on whatever server you're running the script from, unless you're running it as a local admin on the destination server
#
# Other requirements
# -----------------------------------------------------
# A directory called C:\Temp must exist on all servers involved
#

# The domain we want to get DC certs from
$domainInfo = Get-ADDomain <FQDN OF YOUR DOMAIN>

# Server we're installing the certificates on
$destinationServer = "<FQDN OF THE DESTINATION SERVER>"

# For each Domain Controller in the domain do the following
foreach ($dc in $domainInfo.ReplicaDirectoryServers) {

    # Open a remote session to the DC
    Write-Host "Connecting to $($dc)"
    $sessionCertServer = New-PSSession -ComputerName $dc -EnableNetworkAccess

    # Find the certificates thumbprint that matches the DCs FQDN
    Write-Host "Getting cert list for $($dc)"
    $thumbPrint = Invoke-Command -Session $sessionCertServer -ScriptBlock {(Get-ChildItem -Path Cert:\LocalMachine\My | Where-Object {$_.Subject -match "$($dc)"}).Thumbprint}

    # Debugging
    Write-Host "Dumping thumb prints from $($dc)"
    Write-Host $thumbPrint

    # Export the certificate on the remote DC, based on its thumbprint
    # The thumbprint is passed in via -ArgumentList so it's available inside the remote session
    Write-Host "Exporting $($thumbPrint) on $($dc)"
    Invoke-Command -Session $sessionCertServer -ScriptBlock {Get-ChildItem -Path Cert:\LocalMachine\My\$($args[1]) | Export-Certificate -FilePath "C:\Temp\$($args[0])-159969.crt" -Type CERT} -ArgumentList $dc, $thumbPrint

    # Copy the exported certificate from the remote server to the local server
    Write-Host "Copying C:\Temp\$($dc)-159969.crt from $dc to C:\Temp locally"
    Copy-Item -FromSession $sessionCertServer -Path "C:\Temp\$($dc)-159969.crt" -Destination "C:\temp\"

    # Remove the exported certificate file on the remote server and close the session
    Write-Host "Cleaning up remote files and sessions"
    Invoke-Command -Session $sessionCertServer -ScriptBlock {Remove-Item -path "C:\Temp\$($args[0])-159969.crt"} -argumentlist $dc
    Remove-PSSession -Session $sessionCertServer

}

# Open a session to the server we're putting the certs on
$sessionDestinationServer = New-PSSession -ComputerName $destinationServer -EnableNetworkAccess

# Copy each exported certificate to the destination server and import it
foreach ($dc in $domainInfo.ReplicaDirectoryServers) {

    # Import the exported certificate file on the local server and then delete it
    Write-Host "Moving C:\Temp\$($dc)-159969.crt from local server to destination server"
    Copy-Item -ToSession $sessionDestinationServer -Path "C:\Temp\$($dc)-159969.crt" -Destination "C:\temp\"

    Write-Host "Importing certificate for $($dc) into destination server"
    Invoke-Command -Session $sessionDestinationServer -ScriptBlock {certutil -enterprise -f -v -AddStore "Root" "C:\Temp\$($args[0])-159969.crt"} -argumentlist $dc

    Write-Host "Deleting certificate files for $($dc)"
    Remove-Item -path "C:\Temp\$($dc)-159969.crt"
    Invoke-Command -Session $sessionDestinationServer -ScriptBlock {Remove-Item -path "C:\Temp\$($args[0])-159969.crt"} -argumentlist $dc

}

Remove-PSSession -Session $sessionDestinationServer
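
Since the certificates roll over every year we don’t want to run this by hand. A minimal sketch of registering the script as a scheduled task, assuming it’s saved as C:\Scripts\Sync-DCCerts.ps1 and runs under a service account with the permissions listed at the top of the script (both the path and the account name here are made up):

# Hypothetical script path and service account - adjust for your environment
$action  = New-ScheduledTaskAction -Execute "powershell.exe" -Argument "-NoProfile -ExecutionPolicy Bypass -File C:\Scripts\Sync-DCCerts.ps1"
$trigger = New-ScheduledTaskTrigger -Weekly -DaysOfWeek Monday -At 3am
Register-ScheduledTask -TaskName "Sync DC SSL Certificates" -Action $action -Trigger $trigger -User "MYDOMAIN\svc-certsync" -Password "<PASSWORD>"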

 

DFS not working properly over VPN for personal computers

We recently switched to a new VPN server after macOS dropped support for PPTP, and because we were way overdue to do it anyway. Since then, personal computers have been unable to access network shares via DFS.

They could go directly to the file server and that would work.

Users who connected to VPN with an organization-owned laptop were able to use DFS.

After some digging it turned out the issue was that our old VPN allowed for WINS (yes, yes, I know) and our new VPN has WINS disabled (by design, see… we’re trying).

The proper solution to this problem is to re-configure DFS to use DNS only: https://support.microsoft.com/en-us/help/244380/how-to-configure-dfs-to-use-fully-qualified-domain-names-in-referrals

Unfortunately I didn’t have the time to implement this.

What we ended up doing was re-configuring the DHCP scope to set VPN users’ DNS suffix to ‘vpn.mydomain.com’.

I then added aliases for all of our file servers and DFS servers under ‘vpn.mydomain.com’. Example:

  • fileserver1.vpn.mydomain.com, CNAME, fileserver1.it.mydomain.com
  • dfsserver1.vpn.mydomain.com, CNAME, dfsserver1.it.mydomain.com
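
If your DNS runs on Windows and the DnsServer module is available, those aliases can be scripted instead of clicked through. A minimal sketch using the made-up names from the example above, assuming vpn.mydomain.com is its own zone (if it’s just a subdomain inside the parent zone, adjust -ZoneName and -Name accordingly):

# Create CNAMEs in the vpn.mydomain.com zone pointing at the real servers
Add-DnsServerResourceRecordCName -ZoneName "vpn.mydomain.com" -Name "fileserver1" -HostNameAlias "fileserver1.it.mydomain.com"
Add-DnsServerResourceRecordCName -ZoneName "vpn.mydomain.com" -Name "dfsserver1" -HostNameAlias "dfsserver1.it.mydomain.com"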

This is a crappy, hacky workaround that isn’t really sustainable, but it will work for now until we can sit down and plan changing our DFS over to use DNS only.

Domain computers worked fine because we use Group Policy to push out multiple DNS search suffixes. DHCP doesn’t allow you to do this with Windows PCs, so when they tried to look up ‘fileserver1’ they would try to hit WINS if implemented, then append their DNS suffix (vpn.mydomain.com), and then fail to find the file server, resulting in “Network Path Not Found” errors.

Exchange users unable to share calendars post AD/Exchange migration

We just recently went through an AD forest migration AND an Exchange 2010 -> 2016 migration across forests at the same time. Good times.

One of the many issues that came up after the migration was the majority of our users being unable to share their calendars with other users.

When trying to share via manually editing the calendar permissions users would get the error “One or more users cannot be added to the folder access list. Non-local users cannot be given rights on this server.”

 

If users tried to go the invite route by right clicking their calendar, choosing ‘Share’ and ‘Share Calendar’ they would get “Calendar sharing is not available with the following entries because of permission settings on your network:”

 

If you took a look at our GAL you’d see all of the users you couldn’t share with had a circle with a line through their entry:

 

I ended up stumbling across a solution by accident when trying to fix this on my own account. It turned out my account was a ‘Shared’ mailbox and not a ‘User’ mailbox. I converted it with the below PowerShell and then my account started working again:

Set-Mailbox -Identity <USERNAME> -Type Regular -DomainController <DC FQDN>

This worked great for me but my situation was unique. Other users with the issue were already ‘User’ mailboxes. I took another problematic account and ran the above command on it and got this:

Set-Mailbox -Identity <USERNAME2> -Type Regular -DomainController <DC FQDN>
WARNING: Couldn't convert the mailbox because the mailbox "USERNAME2" is already of the type "Regular".

Despite that warning message, this user’s mailbox was now fixed after the user closed and re-opened Outlook.

I re-ran the command against their mailbox and the output was this:

Set-Mailbox -Identity <USERNAME2> -Type Regular -DomainController <DC FQDN>

WARNING: Couldn't convert the mailbox because the mailbox "USERNAME2" is already of the type "Regular".
WARNING: The command completed successfully but no settings of '<DOMAIN FQDN>/User Accounts/Staff/USERNAME2' have been
 modified.

Why didn’t I get that second warning about not making any changes the first time I ran it? Simple. It’s because something was changed and Microsoft doesn’t think I need to know that.

Digging into the account attributes I figured out what changed. It’s called ‘msExchRecipientDisplayType’ and was introduced in Exchange 2007. This attribute determines what kind of recipient the mailbox is in the Address Book.

Pre-AD Migration, msExchRecipientDisplayType was set to 1073741824, which is an “ACL able Mailbox User”.

Post-AD Migration msExchRecipientDisplayType was set to 0 which is a “Mailbox User”.

It makes sense now why you can’t apply permissions (an ACL) to a “Mailbox User” when an “ACL able Mailbox User” type exists.

We used Microsoft’s own tools (ADMT, Exchange 2016) to migrate our users from one forest and Exchange organization to another. Somewhere in that migration the attribute was wiped out and not transferred on 2941 out of 3123 mailboxes.

Here is how you can identify all users in your environment with this attribute set to “0”:

Get-AdUser -Filter * -Properties Name,msExchRecipientDisplayType -Server <DC FQDN> | Where-Object { $_.msExchRecipientDisplayType -eq "0" } | Select Name,msExchRecipientDisplayType

Our environment is a mix of Shared, User, Resource and Equipment mailboxes. There were affected accounts in all four categories. If we did a simple script that looked for “msExchRecipientDisplayType=0” and changed it to “1073741824” we might end up with the wrong value for a mailbox depending on what type it’s supposed to be. Based on my reading, msExchRecipientDisplayType should be 1073741824 for Shared and User mailboxes, 7 for a Room Mailbox and 8 for an Equipment Mailbox.
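
For reference, here is that mapping as a PowerShell hashtable, using only the values discussed above (handy if you want to spot-check individual accounts yourself):

# msExchRecipientDisplayType values by mailbox type, per the reading above
$displayTypeByMailboxType = @{
    'SharedMailbox'    = 1073741824
    'UserMailbox'      = 1073741824
    'RoomMailbox'      = 7
    'EquipmentMailbox' = 8
}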

We decided the best way to fix this was simply re-applying the user type that a mailbox already was. This made the PowerShell much simpler. Here’s what we ran:

Get-Recipient -ResultSize Unlimited | where {$_.RecipientTypeDetails -eq "SharedMailbox"} | Set-Mailbox -Type Shared -DomainController <DC FQDN>
Get-Recipient -ResultSize Unlimited | where {$_.RecipientTypeDetails -eq "UserMailbox"} | Set-Mailbox -Type Regular -DomainController <DC FQDN>
Get-Recipient -ResultSize Unlimited | where {$_.RecipientTypeDetails -eq "RoomMailbox"} | Set-Mailbox -Type Room -DomainController <DC FQDN>
Get-Recipient -ResultSize Unlimited | where {$_.RecipientTypeDetails -eq "EquipmentMailbox"} | Set-Mailbox -Type Equipment -DomainController <DC FQDN>

These commands together fixed 2890 of 2941 broken mailboxes.

This will generate a simple report of which mailboxes weren’t converted and what type they are:

$domainController = "<DC FQDN>"
$brokenUsers = Get-AdUser -Filter * -Properties Name,msExchRecipientDisplayType -Server $domainController | Where-Object { $_.msExchRecipientDisplayType -eq "0" }
$user = ""

foreach ($user in $brokenUsers) {
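    # Output each remaining broken mailbox along with its RecipientTypeDetails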

    Get-User $user.Name -DomainController $domainController | Group RecipientTypeDetails

}

Clear-Variable domainController
Clear-Variable brokenUsers
Clear-Variable user

I took one of the accounts that wasn’t fixed and ran this command:

Set-Mailbox -Identity <USERNAME> -Type Regular -DomainController <DC FQDN>

This corrected the account. No idea why the batch command didn’t. I ran this command for all of the regular mailboxes that weren’t fixed in the batch and it worked fine. That left me with a bunch of shared mailboxes that were still broken and one user account that just would not fix.

Running this on the shared mailboxes did not help:

Set-Mailbox -Identity <USERNAME> -Type Shared -DomainController <DC FQDN>

I checked one of the problematic accounts in ADSIEdit.msc and it had the correct msExchRecipientDisplayType value of 1073741824 despite PowerShell telling me it was set to 0.

Since there were only 23 accounts left that were problematic I used ADSIEdit to verify and fix the remainders.

We ran into a few users who still had problems using the e-mail invite method to share their calendar. This was fixed by having them clear their Outlook auto-complete via these steps:

  1. On the File tab, choose Options > Mail.
  2. Under Send messages, choose Empty Auto-Complete List
  3. Choose Yes to confirm you want to empty the list.
  4. Close/Re-open Outlook
  5. Try again

 


Previous version of the Active Directory Replication Status Tool

Who liked using the Active Directory Replication Status Tool? I did.

Who thought it was a great, simple, straightforward tool that was far easier than interpreting the output of some command line tools, and didn’t feel it needed to become a cloud service with a less intuitive interface? I did.

Digging through a few of my servers I found the old installer for the Active Directory Replication Status Tool: the version you can install on your own servers, which doesn’t appear to give an error about the installer being expired.

Download it here: https://dl.dropboxusercontent.com/u/22772464/adreplstatusInstaller.zip

I believe this is version 1.1 even though the splash page when launching it says it’s 1.0. It was the latest version you could download before Microsoft expired it and told everyone to start using the cloud-based SCOM solution they offer.

Please share it/mirror it for all to enjoy.

 

Update – 2016-04-05

Looks like Microsoft is trying to kill this tool. When you try to use the version I’ve linked above, it now says it has also expired.

I’ve figured out a workaround. Download this handy tool and then do this:

(Screenshot: RunAsDate settings)

If anyone knows assembler and wants to try dissecting repl.exe to remove the date check, I’ll happily link to or host their modified repl.exe.

I’ve seen a few comments on other sites about the safety of getting this tool from a non-Microsoft source, and this is the best I can provide to prove I have not altered the MSI file. If you check the hashes against the two download links below you’ll see they match files uploaded well before this blog post.

MSI File
MD5: d63ceaa4131f8dc64800d33ac3b242c7
SHA1: 1a117510e42d284199743c53722dd51690a93d59

http://www.herdprotect.com/adreplstatusinstaller.msi-1a117510e42d284199743c53722dd51690a93d59.aspx
https://malwr.com/analysis/Y2JlNDQ0YmM0MjM3NGY4MWJlOGJhOTkyMDNkZGQxZGI/

You can try the workaround I’ve mentioned above on the official download as well: https://www.microsoft.com/en-ca/download/details.aspx?id=30005

 

Update 2016-04-20

Good news everyone! Microsoft has brought back the stand-alone tool thanks to everyone who provided them feedback demanding it.

See: https://feedback.azure.com/forums/267889-operational-insights/suggestions/12293631-bring-back-the-on-prem-ad-replication-status-tool

The new, non-expiring download, can be found here: https://www.microsoft.com/en-us/download/details.aspx?id=30005

 

Update 2017-07-12

Looks like the tool is expiring again.

I was able to re-download and re-install it from here: https://www.microsoft.com/en-us/download/details.aspx?id=30005

The tool worked again but now gives me a 24-day countdown.

 

Update 2020-01-13

Microsoft might have finally killed this app. The latest version says the license is expired and the RunAsDate trick isn’t working for me anymore, at least not on Server 2019.

Looks like Microsoft posted a new version of the tool here: https://www.microsoft.com/en-us/download/details.aspx?id=30005 but sadly its license is still expired.