
Create a http to https URL redirect in IIS with Powershell

If you are hosting a website on IIS and would like your visitors to connect securely via https, whether they specify that in their browser or not, then there are a few steps you need to take.

First off, you need to install your SSL certificate into IIS.

Then install the URL Rewrite IIS module/extension, which can be obtained from Microsoft.

To ensure that a secure connection is used, we can create a http to https redirect rule. This means that when someone types in the http URL, the web server will automatically redirect them to the https equivalent.

This rule can be created manually, but to help save time and ensure consistency, the following PowerShell can be used.

$webname= 'dbaland'
$rulename = $webname + ' http to https'
$domain = ''
$inbound = '(.*)'
$outbound = 'https://{HTTP_HOST}{REQUEST_URI}'
$site = 'IIS:\Sites\' + $webname + $domain
$root = 'system.webServer/rewrite/rules'
$filter = "{0}/rule[@name='{1}']" -f $root, $rulename

#Match URL
#stopProcessing is not applicable to redirect actions, although with a rewrite action it stops further rules from running
#patternSyntax 'ECMAScript' is the regular expression syntax shown as "Regular Expressions" in the IIS UI
Add-WebConfigurationProperty -PSPath $site -filter $root -name '.' -value @{name=$rulename; patternSyntax='ECMAScript'; stopProcessing='True'}
Set-WebConfigurationProperty -PSPath $site -filter "$filter/match" -name 'url' -value $inbound
#Conditions -> Logical Grouping
Set-WebConfigurationProperty -PSPath $site -filter "$filter/conditions" -name '.' -value @{input='{HTTPS}'; matchType='0'; pattern='^OFF$'; ignoreCase='True'; negate='False'}
Set-WebConfigurationProperty -PSPath $site -filter "$filter/action" -name 'type' -value 'Redirect'
Set-WebConfigurationProperty -PSPath $site -filter "$filter/action" -name 'url' -value $outbound

In the above code, specify $webname – the first part of the URL, e.g. dbaland – and then $domain – the remaining part of the URL.

When this is executed on the web server, the rule is automatically created and can be seen by double-clicking the URL Rewrite icon in IIS.

It is also necessary to ensure that “Require SSL” is not enabled; check this by double-clicking the SSL Settings icon.

This can be tested by browsing to the http:// address and observing that, after the website has loaded, the URL now begins with https://.

The newly created rule is written into the website's web.config file. This can be seen by browsing to the website folder on the web server and opening web.config, where the following should be visible:

<rule name="dbaland http to https" stopProcessing="true">
  <match url="(.*)" />
  <conditions>
    <add input="{HTTPS}" pattern="^OFF$" />
  </conditions>
  <action type="Redirect" url="https://{HTTP_HOST}{REQUEST_URI}" />
</rule>



My DevOps Journey, The Start

I plan to document steps in my new role as a DevOps Engineer, from the principles, practices and techniques, through to researching and learning new tools and how we implement all this in the company I work for.

My career in IT started as an Analyst Programmer working in VB6; several further development positions followed, creating applications in VB.Net and then C# along with MS SQL Server. I then started working with servers as a Senior Server Analyst, which was followed by some years as a DBA before joining Tribal as a Development Engineer. Four years later I was promoted to Engineering Team Lead, and a further two years down the line sees me making another change.

My recent roles have been orientated around databases and servers covering Oracle and MS SQL Server on Windows and RHEL platforms. The work has been varied and has included:

  • server administration
  • estate management
  • physical and virtual servers
  • TFS for source control and builds
  • facilitating test automation with PowerShell and Test Execute
  • developing REST APIs with dotnet Core and Entity Framework

Much of what I have done has given me a decent foundation on which to build, and I'm sure the experience I have gained will be invaluable.

There will be plenty for me to learn, which is just what I love, so I'll be hoping to share some of that learning journey here as I go.

Posts will cover various aspects in parallel; for example, I'm likely to write a high-level piece about what DevOps is alongside a technical document on Git – the concepts and how it is used.

I’m not going to get into the nuances of DevOps and how it is not a role or a team, that is something for another time. For the purposes here I shall be using the term DevOps to describe my role, the team I work in and how we use it to further the success of the company.

I’m excited to be starting this new chapter – working on a new product and collaborating with some talented, motivated and personable developers and engineers.

Git repo and Visual Studio 2017

When using Git as the source control system within Visual Studio, if you create your local repos in a different location from the default, it can be a pain to amend it each time. This post will show how the default repo location can be updated.

The default location for Git repos in Visual Studio is:

C:\users\<user name>\Source\Repos

If you want to change this then follow these steps:

  • Open Team Explorer
  • Click the Home button
  • Click Settings, then Global Settings
  • Set the Default Repository Location to a folder of your choice
  • Click Update

Hope that helps save some time and removes one tedious task when using Git with Visual Studio.

If you have any useful suggestions for Visual Studio, Git or Azure DevOps then I'd love to hear from you, so leave a comment below.

Ansible – Part 1

Ansible is one of several tools that can be used for configuration management. This post provides some notes on the various roles that Ansible can perform, as well as how it works. For an introduction to Puppet, take a look at my post – “Puppet – Introduction“.

So, what is Ansible – what does it do? It can take care of:

  • Change Management
  • Provisioning
  • Automation
  • Orchestration

Change Management

Define the system state, i.e. what the system is meant to look like.

Ansible can be used to enforce the system state. For example, a web server may have the following definition:

  • Apache web installed
  • Apache web at version x.x.xx
  • Apache web started
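In Ansible, a definition like this is typically expressed as playbook tasks. Here's a minimal sketch, assuming a Debian-style host and the standard ansible.builtin modules; the group name, package version and pinning syntax are illustrative:

```yaml
# Hypothetical playbook enforcing the web server definition above
- name: Enforce web server state
  hosts: webservers
  become: true
  tasks:
    - name: Apache web installed at version x.x.xx
      ansible.builtin.apt:
        name: apache2=2.4.*     # pin to the required version
        state: present

    - name: Apache web started
      ansible.builtin.service:
        name: apache2
        state: started
```

Each task describes the desired state rather than the steps to get there, which is what lets Ansible detect and correct drift.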

If Ansible detects that the system has changed, then a “change event” is triggered to:

  • put the system back (to the defined state)
  • mark the system as changed

The next step is to determine why the system has changed.

Ansible employs idempotence – an operation is idempotent if the system state after repeated applications is the same as it was after a single application.
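A familiar shell illustration of this property: mkdir -p "ensures the directory exists", so applying it twice leaves the system exactly as it did after one application (the /tmp path is just for illustration):

```shell
# mkdir -p creates the directory if missing and is a silent no-op otherwise
mkdir -p /tmp/idem_demo
first=$(ls -ld /tmp/idem_demo)

mkdir -p /tmp/idem_demo   # repeated application: no error, no change
second=$(ls -ld /tmp/idem_demo)

[ "$first" = "$second" ] && echo "state unchanged"

rm -rf /tmp/idem_demo     # clean up
```

Contrast this with a plain `mkdir`, which would fail on the second run – that version of the operation is not idempotent.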

Provisioning

Systems can be prepared and made ready for use, taking them from one state to another.

This process is different from cloning a virtual machine as Ansible installs and configures fresh each time. The following steps may be typical in taking a fresh server from just having the OS installed to being a functional web server:

  • Install web server software
  • Copy configuration files
  • Copy website files
  • Install security updates
  • Start the web service

Automation

Firstly, define the tasks to be executed automatically; they should be ordered, and the tasks should make decisions. The tasks could be ad-hoc but they may still be suitable for automation.

It should be possible to set and forget the tasks once they are configured for automation.

Orchestration

Automation applies to just one system, but orchestration co-ordinates automation across multiple systems such as:

  • Firewalls
  • Web servers
  • Middleware
  • Load balancers
  • Database servers

This is handled by the Ansible Control Server.


Part 2 will touch on reasons to use Ansible and a few of its characteristics, with subsequent posts covering the architecture and how to create a test Ansible environment.

*nix – check disk space

If you’ve tried to use ls -lh to get the size of a directory and its contents, you’ll have found that it doesn’t give you what you were hoping for.

One method to get the size of a directory, including files, sub-directories and their files is to use du (disk usage).

du -sch

The switches in the above example are:

-s displays only a total for each argument.

-c prints a grand total of all arguments after they have been processed, e.g. the total size used by the directories and files.

-h prints sizes in an easily readable format such as 12M and 2.0G.

Here we can see the result of using du at the same directory level as the ls example:
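A quick way to try this on any machine is with a small throwaway directory (the paths under /tmp and the file sizes are just for illustration):

```shell
# Build a small directory tree to measure
mkdir -p /tmp/du_demo/sub
dd if=/dev/zero of=/tmp/du_demo/file1 bs=1024 count=512 2>/dev/null
dd if=/dev/zero of=/tmp/du_demo/sub/file2 bs=1024 count=256 2>/dev/null

# -s: one summary line per argument, -c: grand total, -h: human-readable
du -sch /tmp/du_demo

rm -rf /tmp/du_demo   # clean up
```

The last line of the output is the grand total; with a single argument it matches the summary line above it.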

You can also get a breakdown of directory sizes by omitting the -s switch:

There are many other switches for du, take a look at man du for a list along with an explanation.

If you have any views on using du or maybe you have an alternative preferred method, please let me know in the comments section below.

More user admin in *nix

Following on from a much earlier post about user administration in Solaris, I have found a few other tasks that may be fairly common requirements.

To change a user's login name
usermod -l <new username> <old username>
e.g. usermod -l john.knight john.knoght

“-l” tells usermod that we want to amend the login name.

This is useful if, like me, you made a typo when creating an account 🙂

Add user to supplementary group
usermod -a -G dba <username>
e.g. usermod -a -G dba john.knight

“-a” append the user to the supplementary group(s); only used with “-G”
“-G” a list of groups separated by commas, or a single group. The switch must be uppercase.

List the group membership for a user
There are a couple of ways you can check this;

groups <username>
e.g. groups john.knight
would return something like…
john.knight : oinstall dba

id <username>
e.g. id john.knight
would return something like…
uid=35364(john.knight) gid=35364(oinstall) groups=35364(oinstall),35365(dba)
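Both commands also work for the current user, which is handy as a quick self-check; a small sketch:

```shell
# Look up the current user's login name, then list their group membership
me=$(id -un)

groups "$me"    # short form: just the group names
id "$me"        # long form: uid, gid and all groups with their ids

# id can also print only the group names
id -Gn "$me"
```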

Delete a group
If you have created a group which is no longer required, it is easy enough to delete it.

groupdel <group name>
e.g. groupdel oldgroup

Rename logical & physical MSSQL files

This post will provide guidance on how to amend the logical and physical file names of a MSSQL database.

When a copy of a database is restored as a new database, the logical file names will remain the same as the source database.

Firstly, check the current logical and physical file names:

USE master
SELECT name          AS [Logical_name],
       physical_name AS [File_Path],
       type_desc     AS [File_Type],
       state_desc    AS [State]
FROM sys.master_files
WHERE database_id = DB_ID(N'Database_name')

Running this query against a database called ‘SSMATEST’ on one of my database servers brings back the following:

Logical_name  File_Path              File_Type  State
DIRUT         D:\Data\DIRUT.mdf      ROWS       ONLINE
DIRUT_log     D:\Logs\DIRUT_log.ldf  LOG        ONLINE

As can be seen, the physical names and logical names don't match the name of the database.

Let’s start with the logical names…


ALTER DATABASE SSMATEST MODIFY FILE (NAME = DIRUT, NEWNAME = SSMATEST);
ALTER DATABASE SSMATEST MODIFY FILE (NAME = DIRUT_log, NEWNAME = SSMATEST_log);

We pass the current name of the logical file – NAME – and then the name that we wish to use as the new name – NEWNAME.

The changes can be verified by running the query at the beginning of the post, the results will show:

Logical_name  File_Path              File_Type  State
SSMATEST      D:\Data\DIRUT.mdf      ROWS       ONLINE
SSMATEST_log  D:\Logs\DIRUT_log.ldf  LOG        ONLINE

So, that’s starting to look better, let’s move on to the physical file names.

First, take the database offline. Thanks to Perry Whittle for suggesting the use of one ALTER DATABASE statement to achieve the same result as two!

It should be pointed out that you will need to carry this out during a maintenance window if the database is part of a live/production system.

ALTER DATABASE SSMATEST SET OFFLINE WITH ROLLBACK IMMEDIATE;

Now rename the files from DIRUT.mdf and DIRUT_log.ldf to SSMATEST.mdf and SSMATEST_log.ldf in the file system via File Explorer or DOS. Once that is done, return to SSMS.

Update the records in the system catalog.

ALTER DATABASE SSMATEST MODIFY FILE (NAME = SSMATEST, FILENAME = 'D:\Data\SSMATEST.mdf');
ALTER DATABASE SSMATEST MODIFY FILE (NAME = SSMATEST_log, FILENAME = 'D:\Logs\SSMATEST_log.ldf');

Check the message to ensure that there were no problems.

The file "SSMATEST" has been modified in the system catalog. The new path will be used the next time the database is started.
The file "SSMATEST_log" has been modified in the system catalog. The new path will be used the next time the database is started.

Bring the database back online.

ALTER DATABASE SSMATEST SET ONLINE;

Again, use the query at the top of the post to verify the changes are all good.

Logical_name  File_Path                 File_Type  State
SSMATEST      D:\Data\SSMATEST.mdf      ROWS       ONLINE
SSMATEST_log  D:\Logs\SSMATEST_log.ldf  LOG        ONLINE

There we have it!

Both the logical and physical file names have been updated to reflect the name of our database.

If you are new to T-SQL then I recommend checking out a book from the “Sams Teach Yourself” series.