The script that I reference in this post can be downloaded here:
GoDaddyDNSUpdatePublic.ps1.txt
I love the concept of using Let's Encrypt for free SSL/TLS certificates. However, the short 90-day lifetime of the certificates is designed for automated renewal. In this blog post I'm going to show the steps required to script the use of GoDaddy for DNS verification.
For the basic steps on how to get a SAN certificate by using Let's Encrypt and DNS verification by using Windows PowerShell, please see my previous blog post:
Using Let's Encrypt Certificates for Exchange Server
Let's Encrypt requires you to create an identifier for each DNS name that you want to include on a certificate. You need to validate each identifier to prove ownership of the domain. When you are using DNS validation, you need to create a TXT record in DNS for each identifier.
Unfortunately (from an ease of use perspective), the validation for an identifier is only valid for 30 days. This means that when you go to renew a certificate, you also need to validate your identifiers again. Practically, this means you create new identifiers with the same DNS name but a different alias, and validate them before generating the certificate.
Since DNS validation requires you to create a TXT record in DNS, you need a way to automate this. Fortunately, many DNS providers have a web API that you can use to programmatically access and create DNS records. However, be aware that there is no widespread standard for this API. GoDaddy has created DomainConnect and submitted it as a standard, but I've not seen wide acceptance of it.
For this blog, I'll be showing the use of GoDaddy's API mostly because it's the DNS provider that I use most often.
Authentication to create and manage DNS is done by using an API key and a secret. Both of these are included in the request headers when you perform tasks. You get the API key and secret from the GoDaddy Developer Portal (developer.godaddy.com).
On the GoDaddy Developer portal:
- Sign in by using your GoDaddy account.
- Go to the Keys tab and click Create Key.
- Give your key a name.
- Copy the key and the secret to a file and click OK.
This creates a test key which you cannot use for updating your DNS. However, you're now at a screen where you can create a production key. Save the details of that production key to a file for use in the script.
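Before wiring the key into the script, you can sanity check it with a quick read-only call to the API. This isn't part of my script, just a simple example (with placeholder values) that lists the existing DNS records for a domain to confirm the key and secret work:
#Optional check: list existing DNS records to confirm the key and secret are valid
#Replace the placeholder key, secret, and domain with your own values
$key = 'YourBigLongKeyHere'
$secret = 'YourBigLongPasswordHere'
$headers = @{ "Authorization" = "sso-key ${key}:${secret}" }
Invoke-RestMethod -Uri "https://api.godaddy.com/v1/domains/yourdomain.com/records" -Headers $headers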
At this point, it is assumed that you've already created a vault and registered by using the ACMESharp cmdlets. The remaining steps are purely to automate the process.
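If you haven't done that setup yet, it looks roughly like the following sketch (the contact address is a placeholder; see the previous post for the full walkthrough):
#One-time setup (sketch): install ACMESharp, create a vault, and register with Let's Encrypt
#Replace the contact address with your own
Install-Module ACMESharp
Import-Module ACMESharp
Initialize-ACMEVault
New-ACMERegistration -Contacts mailto:admin@yourdomain.com -AcceptTos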
#define domain that records are being created in
#script only supports a single domain
$domain = 'yourdomain.com'
#For Pfx file
$pfxPass = "PasswordOfYourChoiceToSecurePfxFile"
$pfxPath = "C:\Scripts\cert.pfx"
#header used for accessing GoDaddy API
$key = 'YourBigLongKeyHere'
$secret = 'YourBigLongPasswordHere'
$headers = @{}
$headers["Authorization"] = 'sso-key ' + $key + ':' + $secret
#First identity will be the subject, all others in SAN
$identities = "mail.yourdomain.com","autodiscover.yourdomain.com"
$allAlias = @()
I started my script by defining some variables used later on:
- $domain is your domain in GoDaddy where the DNS records are being created.
- $pfxPass and $pfxPath are used when the certificate is exported to a PFX file before being imported into Exchange Server.
- $key and $secret are provided by GoDaddy when you obtain your production key for the API.
- $headers is included as the authentication information later on when the call is made to the GoDaddy web API to create the TXT record.
- $identities contains the DNS names that will be included in the certificate. My example has two names, but more names can be added.
- $allAlias is defined as an array so that later functionality adding aliases can be processed.
Foreach ($ident in $identities) {
[string]$unique = (Get-Date).Ticks
$alias = ($ident.Replace(".",""))+$unique
$allAlias = $allAlias + $alias
$id=New-ACMEIdentifier -Dns $ident -alias $alias
If ($id.Status -eq "valid") {Continue}
$chResults = Complete-ACMEChallenge $alias -ChallengeType dns-01 -Handler manual
$chData = $chResults.Challenges | Where-Object {$_.Type -eq "dns-01"}
$value=$chData.Challenge.RecordValue
$name=$chData.Challenge.RecordName
#remove domain name from $name
$recordname = $name.Replace(".$domain","")
I use a Foreach loop to create each identifier and verify it by using DNS. I'm going to go through this Foreach loop in chunks.
Since I'm going to need to create multiple identities over time, I wanted a unique identifier ($unique) to ensure there wouldn't be conflicts in naming. I chose to use the ticks value from time. This has the added advantage that you could sort them based on when they were created.
Each identifier is referred to by its alias. I defined the alias for each identifier as the DNS name with the dots removed and $unique appended. After each alias is generated, it's added to the $allAlias array.
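As an illustration (the values below are made up), one pass of the loop produces an alias that looks like this:
#Illustrative values only
$ident = "mail.yourdomain.com"
$unique = "636656789012345678"                 #whatever (Get-Date).Ticks returns at run time
$alias = ($ident.Replace(".","")) + $unique    #mailyourdomaincom636656789012345678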
When the identifier is created for Let's Encrypt, it's placed in the $id variable. The $id variable is then used to verify the status of the identifier. If an identifier with the same DNS name has already been created and verified, then the status is valid. If it's valid, we don't need to do any of the other work in the loop, and Continue tells the script to carry on with the next identifier. If the status is not valid (which is expected for new identifiers), then we process the rest of the loop.
The results of Complete-ACMEChallenge are placed in $chResults. The Challenges property of those results is filtered for the dns-01 challenge type and placed in $chData, where we can get the RecordValue and RecordName properties:
- $name contains the name of the TXT record required for validation
- $value contains the text string that needs to be included in the TXT record for validation
When the TXT record is created by using the GoDaddy API, the name used in the request does not contain the domain name. The $name variable is processed to remove the domain name (the domain name is replaced with nothing) and the result is placed in $recordname, which contains the record name that will be submitted to the GoDaddy API.
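For a dns-01 challenge, the record name is _acme-challenge prepended to the DNS name, so the transformation looks like this (illustrative values):
#Illustrative values only
$domain = "yourdomain.com"
$name = "_acme-challenge.mail.yourdomain.com"
$recordname = $name.Replace(".$domain","")     #result: _acme-challenge.mail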
#create DNS record
$json = ConvertTo-Json @{data=$value}
Invoke-WebRequest https://api.godaddy.com/v1/domains/$domain/records/TXT/$recordname -method put -headers $headers -Body $json -ContentType "application/json"
UPDATE: July 2, 2018
A reader named Jason has reported that the JSON format used by GoDaddy has changed and that the above code snippet needs to be updated. I have not verified this, but Jason says updating the middle line to the following fixes the problem:
- $json = ConvertTo-Json @(@{data=$value})
After all the processing of data is done, creating the TXT record is fairly straightforward. The data for the request is put into a hash table that is converted to JSON. The hash table only requires the data, but you can include other information like the TTL.
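For example, a body that also sets the TTL would look something like this (a sketch; ttl is the field name the GoDaddy records API uses, and I believe it enforces a minimum of 600 seconds):
#Sketch: include a TTL along with the record data
$json = ConvertTo-Json @{data=$value; ttl=600}
#Produces something like: {"data":"<challenge value>","ttl":600}
#Jason's updated version wraps the hash table in an array, so it serializes as
#[{"data":"<challenge value>","ttl":600}] instead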
Invoke-WebRequest accesses the GoDaddy web API with a PUT method to send the data. The URL being accessed needs to contain your domain name and the type of record being created. In this case, I hard-coded TXT as the record type in the URL, but $domain is used to insert the domain name. The $recordname variable is included in the URL because we only want to create that specific record. If the URL ends at TXT, then the API assumes that we're providing the complete set of TXT records, and any other existing TXT records are wiped out. The $headers variable (defined earlier) provides the authentication information for the request.
#Submit the challenge to verify the DNS record
#30 second wait is to ensure that DNS record is available via query
Do {
    Write-Host "Waiting 30 seconds before testing DNS record"
    Start-Sleep -Seconds 30
    $dnslookup = Resolve-DnsName -Type TXT -Name $name
    $dnsSuccess = $false
    If ($dnslookup.strings -eq $value) {
        Write-Host "DNS query was successful"
        Submit-ACMEChallenge $alias -ChallengeType dns-01
        $dnsSuccess = $true
    } Else {
        Write-Host "DNS query for $name failed"
        Write-Host "The value is not $value"
        Write-Host "We will try again"
    }
} Until ($dnsSuccess -eq $true)
I ran into an issue when verifying the TXT records. After creating the TXT record, there can be a delay (a few seconds to a few minutes) until the record is accessible via the DNS servers. If Let's Encrypt tries to verify before it is accessible, then the validation fails and isn't recoverable; you need to create a new identifier. So, I added this code to verify the DNS record is accessible before telling Let's Encrypt to verify.
The Resolve-DnsName cmdlet looks for the TXT record we just created. If the lookup contains the expected value, then Submit-ACMEChallenge is used to tell Let's Encrypt to verify it. The variable $dnsSuccess is also set to $true, and the loop ends. If it's not successful, we try again at 30 second intervals until it resolves successfully. Since adding this code to the script, I haven't had any failures from Let's Encrypt. I think there may be some caching of failed lookups on the client running the script, which results in a two minute delay, but better that than the Let's Encrypt lookup failing.
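If that caching of failed lookups becomes a problem, one option I haven't built into the script (treat it as a sketch) is to query one of the domain's authoritative name servers directly instead of the default resolver:
#Sketch: query the domain's authoritative name server directly to avoid
#negative caching on the local resolver
$ns = (Resolve-DnsName -Type NS -Name $domain | Where-Object {$_.Type -eq "NS"}).NameHost | Select-Object -First 1
$dnslookup = Resolve-DnsName -Type TXT -Name $name -Server $ns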
#Verify that the dns record was checked and valid
$chStatus = $null
Do {
    Write-Host "Waiting 10 seconds before testing DNS validation"
    Start-Sleep -Seconds 10
    $chStatus = (Update-ACMEIdentifier $alias).Status
    #$chStatus=((Update-ACMEIdentifier $alias -ChallengeType dns-01).Challenges | Where-Object {$_.Type -eq "dns-01"}).Status
    If ($chStatus -eq "valid") {Write-Host "$ident is verified"}
    Else {Write-Host "DNS record not valid yet: $chStatus"}
} Until ($chStatus -eq "valid")
After the DNS lookup for the TXT record is successful, the script uses Update-ACMEIdentifier to retrieve the status of the verification. There is a 10 second pause to allow Let's Encrypt to perform the verification. If the verification is still pending, then the loop repeats at 10 second intervals.
I have two methods for checking the status ($chStatus). The more complex version is one I saw in an example someone else provided. However, I found that the simpler version seems to work fine. That said, I did see one person indicating that when the challenge type is not specified, a pending request is not properly retained in the local vault and fails. With my delay of 10 seconds, I'm not sure that's ever happened. Both versions do seem to work.
This is the end of the Foreach loop that processes the identifiers. Each identifier has now been verified, and the $allAlias variable contains the alias used for each identifier. Next up is creating the certificate.
#Create Certificate
New-ACMECertificate $allAlias[0] -generate -AlternativeIdentifierRefs $allAlias -Alias $allAlias[0]
Submit-ACMECertificate $allAlias[0]
Write-Host "Waiting 10 seconds for certificate to be generated"
Start-Sleep -Seconds 10
Update-ACMECertificate $allAlias[0]
Get-ACMECertificate $allAlias[0] -ExportPkcs12 $pfxPath -CertificatePassword $pfxPass -Overwrite
The New-ACMECertificate cmdlet creates a certificate request. The first identifier ($allAlias[0]) becomes the subject of the certificate. Then the entire $allAlias array is submitted as alternative identifier references, which become the Subject Alternative Names attribute in the certificate. When you list the alternative identifiers, you can skip the identifier used for the subject because it's automatically added to the SAN attribute; however, it works fine when that identifier is included as well. The alias for the certificate is set to be the same as the alias for the identifier used as the subject.
The certificate request is submitted and the script waits for 10 seconds to ensure that Let's Encrypt has time to generate the certificate. Update-ACMECertificate retrieves the certificate information from Let's Encrypt and puts it in the local vault.
Get-ACMECertificate with the -ExportPkcs12 parameter is used to export the certificate to a PFX file that can be imported into Exchange Server. While a password is not required for the export, the import of a PFX file without a password won't work properly. Everything will appear to be good, but the certificate will behave as if it has no private key. The -Overwrite parameter is specified because it's assumed that this script will be automated, and this allows the file to be overwritten each time.
#Assign certificate using EMS commands
#If run as a scheduled task must be run at EMS prompt
$pfxSecurePass = ConvertTo-SecureString -String $pfxPass -AsPlainText -Force
$cert = Import-ExchangeCertificate -FileName $pfxPath -Password $pfxSecurePass -FriendlyName $allAlias[0] -PrivateKeyExportable $true
Enable-ExchangeCertificate $cert.Thumbprint -Services IIS,SMTP -Force
Finally, we import the certificate into Exchange Server. The Import-ExchangeCertificate cmdlet won't accept a plain text password. So, the password is converted to a secure string that can be used and placed into $pfxSecurePass.
To make it easier to identify the certificate on the Exchange server, the alias is used as the friendly name. The private key is also marked as exportable because by default it is not.
After the certificate is imported, then Enable-ExchangeCertificate is used to specify that it should be applied to the IIS and SMTP services. The -Force parameter enables the cmdlet to complete without requiring user input. However, it will replace the default SMTP certificate with this option.
Because the Exchange cmdlets are used, you need to schedule this script so that it runs in the Exchange Management Shell and not a regular PowerShell prompt.
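One way to set that up (a sketch; the V15 path is for Exchange 2013/2016, so adjust it and the script path for your environment) is to have the scheduled task launch PowerShell, dot-source the Exchange management scripts, and then run this script:
#Sketch of a scheduled task action: load the Exchange Management Shell, then run the script
powershell.exe -Command ". 'C:\Program Files\Microsoft\Exchange Server\V15\bin\RemoteExchange.ps1'; Connect-ExchangeServer -Auto; & 'C:\Scripts\GoDaddyDNSUpdatePublic.ps1'"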
If you are going to use this certificate on multiple Exchange servers, then you should update this script to import and enable the certificate on all of the Exchange servers in the site. The current configuration assumes one Exchange server and it applies only on the local Exchange server where the script is run.
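A rough sketch of that change is below. Note that when you use the -Server parameter, the PFX path generally needs to be a UNC path that each server can reach, so adjust $pfxPath accordingly:
#Sketch: import and enable the certificate on every Exchange server in the organization
#Assumes $pfxPath is a UNC path (for example \\fileserver\share\cert.pfx)
foreach ($server in Get-ExchangeServer) {
    $cert = Import-ExchangeCertificate -Server $server.Name -FileName $pfxPath -Password $pfxSecurePass -FriendlyName $allAlias[0] -PrivateKeyExportable $true
    Enable-ExchangeCertificate -Server $server.Name -Thumbprint $cert.Thumbprint -Services IIS,SMTP -Force
}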
Is it worth it?
In reality, the complexity of keeping this up and running is probably not worth it for Exchange Server. Exchange Server is a mission-critical part of your organization. Instead, I'd go for a low-cost certificate provider and get a cert that lasts for 3 years. A SAN certificate from NameCheap.com costs about $35 US per year. That's far less than the value of your time getting this up and going.
That said, this was a fun exercise for me and I'll probably use this for test environments. Now that I have the script it's pretty easy for me to use it.
Alternative for IIS
As part of developing this, I worked out a method to supply the certificate only to IIS. I don't think that it's very useful for Exchange Server since we also want it to apply to SMTP to allow secured SMTP communication. However, I'm including it in case anyone is interested.
#Import certificate from pfx to server
#script must be running as administrator to perform this action
$cert=Import-PfxCertificate -FilePath $pfxPath -CertStoreLocation Cert:\LocalMachine\My -Password $pfxSecurePass -Exportable
#Assign certificate to default web site
Import-Module WebAdministration #required for IIS: PSDrive
Set-Location IIS:\SSLBindings
Get-ChildItem | Where-Object { $_.sites.Value -eq "Default Web Site" -and $_.Port -eq 443 -and $_.IPAddress.IPAddressToString -eq "0.0.0.0" } | Remove-Item
$cert | New-Item .\0.0.0.0:443
Importing the certificate is done with Import-PfxCertificate. Again, we specify a path and password, and mark the certificate as exportable.
We manually import the WebAdministration module to get access to the IIS: PSDrive. In Windows Server 2012 and later, modules autoload when you use their cmdlets, but not when you use their PSDrives.
SSL bindings are located in IIS:\SSLBindings. So, the location is set to there. In that location, Get-ChildItem gets a list of the SSL bindings. The binding for all IP addresses on port 443 of Default Web Site is deleted.
The certificate information is piped to New-Item with .\0.0.0.0:443. This creates a new binding in the current directory for that certificate to 0.0.0.0 on port 443.
References: