Ed Vaizey Is A Fucking Idiot


He says: “We have got to continue to encourage the market to innovate and experiment with different business models and ways of providing consumers with what they want.

“This could include the evolution of a two-sided market where consumers and content providers could choose to pay for differing levels of quality of service.”

He also suggests that content makers could be charged for the first time for the use of the ISP’s networks – provided they too were clear about what they were getting.

How many times do we have to say this? Slashdot (for example) already pays its ISP for the bandwidth to host its content. It is not, therefore, OK for my ISP to then charge Slashdot again for me to access that content, just as it’s not OK for Slashdot’s ISP to charge me to access that content, because I already pay my ISP for that. This is what peering agreements are for, and if ISPs don’t feel they’re getting a fair deal then they need to take that up with the other service providers, not the people at the end of the pipe.

This is on top of the fact that this “market” idea for QoS will inevitably end up with ISPs acting like TV companies. “Get our basic package to access the internet very slowly at low priority, only £9.99/month. Want to be able to use the iPlayer during waking hours? Get our BBC pack for only £4.99/month extra. Sorry, but due to a dispute with Google over pricing, we’re unable to offer our Search Engine pack this month, so you won’t be able to find anything on the internet”. And so on.

Exchange 2007/2010 Interop

As someone in the middle of a transition between Exchange 2007 SP3 and Exchange 2010 SP1, I have to ask: MS Exchange Team, what were you smoking when you came up with the interoperability options here?

2003 to 2007 was pretty decent: you could deploy your 2007 CAS servers more-or-less directly in place of your 2003 OWA front-end servers, and they would happily serve 2003 mailboxes up to people. Management tool interop was a bit lacking, but that was excusable as 2007 was, at least, a totally new set of tools based on .NET/PowerShell, and they generally wouldn’t let you do anything to a 2003 mailbox that would break it horribly.

With 2007 to 2010, though, you can’t do any of that. You have to keep your 2007 CAS and HT servers around because the 2010 ones won’t play nice with 2007 mailbox servers, but then the 2010 CAS servers can’t connect to 2007 mailboxes. Except they can, if you copy the 2007 binaries onto them. Except that only works if there isn’t a 2007 CAS server in the same AD site as the 2010 CAS server. So, short of creating new AD sites for all your 2010 mailbox servers, you have to have at least one AD site with both 2010 and 2007 CAS servers. That works OK internally, but if you’re publishing to the internets (and who isn’t these days?) then you start to run into issues. There are various articles about this (like this one), but basically you have to: create a new internet-facing DNS entry for the 2007 CAS; move the 2010 CAS to the old internet-facing DNS name; change the ExternalURL value on the 2007 CAS to match the new DNS name; change the internal DNS so that there’s an entry for the new name (because Exchange treats other AD sites as “External” as well); buy a new trusted SSL certificate for the new name (or wildcard it); and pray to God that you’re only publishing 2007 CAS servers from a single AD site, otherwise it gets so complicated that your brain will try and kill itself to escape.

If you think that sounds bad, then wait; it gets better! You can’t use the 2010 management tools (which are now 64-bit only – unless you enjoy PowerShell Remoting) to manage 2007 mailboxes. Well, that’s not entirely true: you can do some things, just not everything, and it’s the same when using the 2007 tools on 2010 mailboxes. It’s all documented on the MSExchangeTeam blog and it’s a huge mess – if your helpdesk staff (or whoever else manages your mailboxes day-to-day) aren’t really on the ball, there’s a good chance you’ll run into problems while the 2007->2010 transition is in progress.

All this is in addition to the subtle syntax changes they’ve made to a lot of the management cmdlets, so that things which were valid commands in 2007 don’t quite work properly in 2010. I’m looking at you, Import-ExchangeCertificate – let’s go from importing a CA-supplied .cer file to importing the binary contents of said file as a Byte array supplied as a command-line argument, because that’s much easier.
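To illustrate the difference (the file paths here are made up), the two invocations look roughly like this:

```powershell
# Exchange 2007: just point the cmdlet at the CA-supplied .cer file
Import-ExchangeCertificate -Path C:\certs\mail.example.com.cer

# Exchange 2010: read the file into a byte array yourself and pass the raw contents
Import-ExchangeCertificate -FileData ([Byte[]]$(Get-Content -Path C:\certs\mail.example.com.cer -Encoding Byte -ReadCount 0))
```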

I really like Exchange – 2007 SP1 onwards was top-notch – and 2010 has some great new functionality, but they’ve really dropped the ball when it comes to installation, upgrading, migrating and transitioning.

Exchange 2010 Gotchas: Multiple Domains

If you install Exchange 2010 into a forest with multiple domains then you may come across this issue.

When you are trying to enumerate or move system mailboxes within your Exchange 2010 organisation, you will typically not get any results returned unless you use the -DomainController switch and specify a DC from the root domain, as this is where Exchange creates most of the system mailboxes. Without the -DomainController switch, the cmdlets use a DC in your current domain, which won’t be able to see any accounts in its parent domain(s).

i.e. Get-MailboxDatabase | Get-Mailbox -DomainController <dcname.root.domain>

This will typically come up when trying to remove the first (i.e. default) mailbox database created when you installed your 2010 mailbox servers.
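As a sketch of the workaround (the database and DC names here are made up), you can list the system (arbitration) mailboxes against a root-domain DC, move them to a database you’re keeping, and then the removal should go through:

```powershell
# Find the arbitration mailboxes homed in the default database,
# querying a root-domain DC so they actually show up
Get-Mailbox -Arbitration -Database "Mailbox Database 0123456789" -DomainController dc01.root.domain

# Move them to a database you intend to keep...
Get-Mailbox -Arbitration -Database "Mailbox Database 0123456789" -DomainController dc01.root.domain |
	New-MoveRequest -TargetDatabase "DB01"

# ...and once the moves complete, the default database can be removed
Remove-MailboxDatabase "Mailbox Database 0123456789"
```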

Get failed sshd logons from Windows Eventlog Part 2

In answer to my question about a more efficient means of filtering by timestamp with FilterXPath, I’ve worked out how to do it “properly”; it’s not as fast as I was hoping, but it’s still taken the execution time down from 20 minutes to 2, so I can’t complain too much. I’ve also modded the script to deal with cases where password authentication is disabled, so you don’t get “Failed password” events logged, just “Invalid user” or “No supported authentication methods available” instead.

The time value for comparison is in milliseconds, so 604800000 (7 × 24 × 60 × 60 × 1000) for 7 days, 86400000 for 24 hours, etc.

#sshd failed logon attempt finder
#Adam Beardwood 12/02/2010
#v1.0 - Initial Release
#v1.1 - Dramatically improved event gathering speed and added handling for non-password authentication failures

#Get all SSH events from the last 7 days from the Application eventlog (this may take some time)
$events = Get-WinEvent -LogName Application -FilterXPath "*[System[Provider[@Name='sshd'] and TimeCreated[timediff(@SystemTime) <= 604800000]]]"
#Create array to store IPs
$ips = @()
foreach($event in $events){
	#Convert the event to XML so we can access the EventData (otherwise there's no way to access the event message contents with "unregistered" event IDs)
	$event = [xml]$event.ToXml()
	#Thin the herd a little and only process useful messages
	if($event.Event.EventData.Data.Contains("Invalid user") -or $event.Event.EventData.Data.Contains("Failed password") -or $event.Event.EventData.Data.Contains("No supported authentication methods available")){
		#Do a regex search of the message data for IP addresses and, if found, add them to the $ips array
		$ip = $event.Event.EventData.Data
		$regex = [regex]"\d{1,3}\.\d{1,3}\.\d{1,3}\.\d{1,3}"
		$ip = $($($regex.Matches($ip)).Captures).Value
		$ips = $ips + $ip
	}
}
#Write the unique IPs out to a dated file
$date = Get-Date -Format yyyy-MM-dd
$file = New-Item -Type file "SSHLog-$date.txt" -Force
Add-Content $file $($ips | select -uniq)

Get failed sshd logons from Windows Eventlog

This script grabs the IP addresses from the last 7 days’ worth (customisable) of failed logon attempts for sshd from the Windows event log. This is handy if you use a Windows-based OpenSSH package like copSSH and want to be able to generate a list of all the people making random attempts to log on to your machine, for adding to a blacklist or firewall rule.

#Get all SSH events from the last 7 days from the Application eventlog (this may take some time). Change "adddays(-7)" to alter the timeframe.
$events = Get-WinEvent -LogName Application -FilterXPath "*[System[Provider[@Name='sshd']]]" | ?{$_.TimeCreated -gt $((Get-Date).AddDays(-7))}
#Create array to store IPs
$ips = @()
foreach($event in $events){
	#Convert the event to XML so we can access the EventData (otherwise there's no way to access the event message contents with "unregistered" event IDs)
	$event = [xml]$event.ToXml()
	#Thin the herd a little and only process "Failed password" messages
	if($event.Event.EventData.Data.Contains("Failed password")){
		#Do a regex search of the message data for IP addresses and, if found, add them to the $ips array
		$ip = $event.Event.EventData.Data
		$regex = [regex]"\d{1,3}\.\d{1,3}\.\d{1,3}\.\d{1,3}"
		$ip = $($($regex.Matches($ip)).Captures).Value
		$ips = $ips + $ip
	}
}
#Add unique IPs to a dated output file
$date = Get-Date -Format yyyy-MM-dd
$file = New-Item -Type file "SSHLog-$date.txt" -Force
Add-Content $file $($ips | select -uniq)

If anyone has a better way to filter Get-WinEvent using FilterXPath by date as well as something else (Provider, in this case), rather than having to get the whole log and “where” it afterwards, please let me know. I know I *should* be able to do it, but I could never get it to work properly (I think it was a timestamp formatting issue) and the documentation is a bit spartan.

Powershell & DHCP Scopes

Really quick one, this: I had a DHCP server with 136 scopes on it that I needed to disable, and so the following script was born. It takes one argument – the hostname of the DHCP server in question; leave it blank for localhost:

#DHCP Server-wide scope disabler
#Adam Beardwood 16/02/2010
#v1.0 - Initial Release

<#
.SYNOPSIS
Disables all DHCP scopes on the specified server
.PARAMETER Server
DNS name or IP of server to work with
#>
param(
	[Parameter(Mandatory=$false,ParameterSetName="Server",ValueFromPipelineByPropertyName=$true,Position=0,HelpMessage="DNS name or IP of server to work with")]$Server = "localhost"
)
$Server = "\\$Server"
#List every scope on the server
$scopes = netsh dhcp server $Server show scope
$regex = [regex]"\d{1,3}\.\d{1,3}\.\d{1,3}\.\d{1,3}"
foreach($scope in $scopes){
	#Pull the IP addresses (scope address and subnet mask) out of each line of netsh output
	$ip = $scope | foreach {$regex.Matches($_) | foreach {$_.Captures[0].Value}}
	#The first address on the line is the scope's network address; set its state to 0 (disabled)
	if($ip -ne $null){
		& netsh dhcp server $Server scope $ip[0] set state 0
	}
}

Sophos/Utimaco Safeguard Auto-Sync Script

Those of you who have used Safeguard will know that for reasons known only to the Germans, Utimaco decided not to provide any way to automatically sync Safeguard with your AD domain(s) without resorting to a rather buggy API. They provide some example code for VBScript, but if you want to do it in Powershell instead, look no further:

#Safeguard Directory Synchronisation Tool
#Adam Beardwood 04/02/2010
#v1.0 - Initial Release

#Load Safeguard .NET Assembly for use
[void] [System.Reflection.Assembly]::LoadWithPartialName("Utimaco.SafeGuard.AdministrationConsole.Scripting")

#---Declare Variables---
#DateTime stamp
$DTS = Get-Date -Format yyyy-MM-dd--hh-mm
#Root DSN to bind connection to
$dsn = "DC=mydomain,DC=local"
#Location for log file
$logFileName = "D:\Scripts\SGSync."+$DTS+".log"
#Sync group membership
$membership = 1
#Sync account state
$accountState = 1
#Relocate objects if they have been moved to another sync'd OU
$takeCareOfMovedObjects = 1
#Pass any argument to the script to enable debug output
if($args[0] -ne $null){$Debug = $True}else{$Debug = $False}
#---End Variables---

#---Define Functions---
#Function to send email alert on sync completion
Function SendEmail ($Errs){
	$smtpServer = "smtp.mydomain.local"
	#Build the message body from the list of errors
	$body = @"
The Safeguard Enterprise Automated Sync process ran. The following errors occurred:
$(foreach($item in $Errs){$item;"`r"})
"@
	$msg = new-object Net.Mail.MailMessage
	$att = new-object Net.Mail.Attachment($logFileName)
	$smtp = new-object Net.Mail.SmtpClient($smtpServer)
	$msg.From = "SGE AD Sync<_SGSync@mydomain.local>"
	#Recipient address is an example - change to suit
	$msg.To.Add("admin@mydomain.local")
	$msg.Subject = "SGE Sync Process"
	$msg.Body = $body
	$msg.Attachments.Add($att)
	$smtp.Send($msg)
	if($Debug){write-host "Email sent, also," $Errs}
}

#Function to actually sync Safeguard
Function SGSync ($OU){
	$adsStartContainer = $OU+","+$dsn
	if($Debug){write-host "Syncing:" $adsStartContainer}
	[ref] $Outcome = $ScriptingDirectory.SynchronizeDirectory($dsn, $adsStartContainer, 1, $logFileName, $membership, $accountState, $takeCareOfMovedObjects)
	if($Debug){write-host "Start OU:" $Outcome}
	$Result = $Scripting.GetLastError($Outcome)
	if($Debug){write-host "GetLastError returns:" $Result}
}
#---End Functions---
#Create scripting objects, authenticate to directory and then initialise the sync process
if($Debug){write-host "Synchronization of Users & Computers ... Started"}
try{$Scripting = new-object Utimaco.SafeGuard.AdministrationConsole.Scripting.Base}
catch{write-host "An Error Occurred While Attempting To Load Safeguard Directory Synchronisation Libraries. Quitting...";exit 0}
if($Debug){write-host "Created Base Object"}
$ScriptingDirectory = $Scripting.CreateDirectoryClassInstance()
if($Debug){write-host "Created Directory Object"}
$Result = $Scripting.Initialize()
if($Debug){write-host "Init API:" $Result}
$Result = $Scripting.AuthenticateService()
if($Debug){write-host "Authenticate:" $Result}
$Result = $ScriptingDirectory.Initialize()
if($Debug){write-host "Init SGD:" $Result}
#---Sync the following OUs---
SGSync("OU=An OU")
SGSync("OU=Another OU")
#Free up resources
$Result = $ScriptingDirectory.FreeResources()
if($Debug){write-host "Finalize SGD:" $Result}
$Result = $Scripting.FreeResources()
if($Debug){write-host "Finalize API:" $Result}
#Get errors from the generated log file
$Errs = select-string -pattern "Failure" -path $logFileName
#Send email alert
if($Errs -ne $null){SendEmail $Errs}
if($Debug){write-host "Synchronization of Users & Computers...End"}

Obviously this has to be run on a machine which has the Safeguard .NET assemblies installed (Management Console or Server packages).

ISA Server & Powershell

I’ve recently been trying to find a way to work with ISA 2006 using PowerShell. It’s not as straightforward as I would have hoped, as all the docs are for C and VBScript, and they’re not exactly full of information anyway. That said, a few people seem to have managed, so I dived in and came up with this: a PowerShell script to add IP address ranges (in my case, just single addresses) from a text file to an ISA Network object, for the purposes of blacklisting. It requires PowerShell v2 and needs to be run on an ISA server. For the record, the ISA API is horrible.

<#
.SYNOPSIS
Adds specified IP address ranges to a given Network in ISA
.PARAMETER network
Network Name to edit
.PARAMETER path
Path to file containing list of IPs to add
#>
param(
	[Parameter(Mandatory=$true,ValueFromPipelineByPropertyName=$true,Position=0,HelpMessage="Network Name to edit")]$network,
	[Parameter(Mandatory=$true,ValueFromPipelineByPropertyName=$true,Position=1,HelpMessage="Path to file containing list of IPs to add")]$path
)
#Create ISA COM objects
$root = new-object -comObject "FPC.Root" -strict
#Get our array (well, there is only one in this case)
$server = $root.Arrays | Select-Object -first 1
#Get the IPRangeSet for the required network
$ipranges = $server.NetworkConfiguration.Networks.Item($network).IpRangeSet
#Read the list of IPs to add
$file = gc $path
foreach($ip in $file){
	$flag = 0
	#Skip any IP that already exists in the range set
	foreach($iprange in $ipranges){
		if($iprange.ip_from -match $ip){$flag = 1}
	}
	#Add the IP as a single-address range (from = to)
	if($flag -ne 1){$ipranges.add($ip,$ip)}
}
#Persist the changes back to the ISA configuration
$ipranges.Save()