Wednesday, August 13, 2014

Editing SSRS Shared Data Sources with PowerShell 3.0


I recently needed to edit the server name in the connection string of the shared data sources for several dozen SSRS 2008 R2 reports. Easy, I know, but this was to be part of a failover scenario. That means I might not be the one doing it.

So my requirements were:
1. Automate as much as possible to prevent human error.
2. Make it easy enough for a trained monkey.

Thankfully there were only a few data sources, but because of those requirements, just documenting the manual process with pretty pictures wasn't an option.

I want this scripted in PowerShell.
I want the script to touch as little as possible (should it bugger something up).
I want it to be simple.
I don't want to have to answer questions for the trained monkeys.

Originally I was going to run a query against the SSRS server's ReportServer database with Invoke-Sqlcmd, edit the output, and write the changes back the same way. I ran the query in SSMS first to make sure I was getting the correct fields and results.



The results were not what I expected and needed to be converted before they were in plain English. Unfortunately I am not knowledgeable enough in SQL to convert them back, even after some lengthy Google searches. Because of that limitation I started looking at the SSRS ReportServer web service instead.

When connecting to an SSRS service endpoint through a URL, it MUST end in ".asmx?wsdl". You should also know whether your SSRS server is running in native or SharePoint integrated mode, because that determines which of the following service classes the address needs to include.

ReportService2010

ReportService2006

ReportExecution2005

ReportService2005

ReportServiceAuthentication

So the following code connects to your MSSQL SSRS 2008 R2 server and looks for all shared data sources in the specified "folder". If a data source's connection string matches the $OldSource variable value, it is replaced with the $NewSource variable value and updated. I did many online searches so I wouldn't have to reinvent the wheel, but everything I found was long, complex, and difficult to follow. I think the method I came up with is much easier.
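The original script box didn't survive this archive, so here is a minimal sketch of the approach described above. It assumes a native-mode server (hence ReportService2010); the server URL, folder path, and connection strings are all placeholders you would replace with your own.

```powershell
# Sketch only: server URL, folder, and connection strings below are assumptions.
$ReportServerUri = 'http://ssrs01/ReportServer/ReportService2010.asmx?wsdl'
$Folder    = '/Data Sources'
$OldSource = 'Data Source=OLDSQL01;Initial Catalog=AppDB'
$NewSource = 'Data Source=NEWSQL01;Initial Catalog=AppDB'

# Proxy the SSRS web service using the current user's credentials
$rs = New-WebServiceProxy -Uri $ReportServerUri -UseDefaultCredential

# ListChildren returns everything in the folder; keep only shared data sources
foreach ($item in $rs.ListChildren($Folder, $false) |
         Where-Object { $_.TypeName -eq 'DataSource' }) {
    $def = $rs.GetDataSourceContents($item.Path)
    if ($def.ConnectString -eq $OldSource) {
        $def.ConnectString = $NewSource
        $rs.SetDataSourceContents($item.Path, $def)
        Write-Output "Updated $($item.Path)"
    }
}
```

Because only data sources whose connection string exactly matches $OldSource are touched, the script satisfies the "touch as little as possible" requirement.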

Wednesday, July 16, 2014

Getting PowerShell commands, functions, modules, etc. without installing anything


Dave:    Hey Buddy, you should look at this cool script that I found.

Buddy:  Really, what does it do?

Dave:    It’s so cool. It will connect to any SQL server and report back information about each database that is hosted on it.

Buddy:  That’s cool. Did you tell the SQL admin about it?

Dave:    Yup, he said it was easier than the expensive product we just spent 50k on. Plus he didn’t need to do anything extra.

Buddy:  I just tried it but it keeps giving me errors.

Dave:    Well do you have SQL installed on your workstation? Greg and I do.

Buddy:  No, where can I install it from? Do I need a license? How much space does it need?

Dave:    Here try running Import-RemoteCommands.ps1 –Computername someSQLservername instead then try it again.

Buddy:  Awesome it works now, what did that do?

Dave:   The Import-RemoteCommands.ps1 script temporarily imports all commands, functions, modules, etc. from a remote computer to your local session so you can run almost any PowerShell script without having to install things locally to a system.

Buddy:  But what if I want to run something like that on a server? Will I need to submit a change so the boss doesn't have an aneurysm?

Dave:    You shouldn't, because as soon as you close the PowerShell window all those commands go away! You can even use it with a list of computers, so if you want the SQL plugins, the AD module and the SCOM module, just list all those server names separated by commas.

Buddy:  I just opened a new console window but Import-RemoteCommands is giving me an error now.

Dave:    For it to work you need two things: 1. PowerShell remoting needs to be enabled on the computers you list. 2. You need local admin or PowerShell remoting access on those computers.



 OK, so this didn't happen quite the way I portrayed it but it's pretty close.
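The Import-RemoteCommands.ps1 script itself isn't included in this archived post, but the behavior Dave describes matches PowerShell's implicit remoting. A minimal sketch of how such a script could work (the parameter name is an assumption):

```powershell
# Hypothetical sketch of Import-RemoteCommands.ps1 using implicit remoting.
Param(
    [Parameter(Mandatory=$True)]
    [String[]]$ComputerName
)
foreach ($Computer in $ComputerName) {
    # Requires PowerShell remoting enabled on, and access to, the target
    $Session = New-PSSession -ComputerName $Computer

    # Import the remote session's commands into the local session as proxy
    # functions; they vanish when the session (or console window) closes
    Import-PSSession -Session $Session -AllowClobber | Out-Null
}
```

Each imported command runs transparently on the remote computer, which is why nothing needs to be installed locally and why a fresh console window starts empty again.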


Sunday, March 16, 2014

Rampant RDP sessions

I've been seeing more and more developers and administrators simply disconnecting from RDP sessions on our servers instead of logging off. One, I just find this irritating; two, it leaves applications open on the servers, consuming resources. So to combat this issue I finally put the following scripts together.

The first looks for all RDP sessions that have been idle for more than 59 minutes on a filtered list of servers. It then compiles a list of all sessions with the same user ID, looks up each user ID's e-mail address in Active Directory, puts together a report, and e-mails it to the individual. Once all individual reports have been sent, it sends a master list to a specified address (me). The function that gathers the RDP session info was written by Jaap Brasser (http://www.jaapbrasser.com); I was halfway through writing my own when I stumbled upon it, but why reinvent the wheel?
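The full script isn't shown in this archive, so here is a rough sketch of the reporting loop. It assumes Jaap Brasser's Get-LoggedOnUser function is already loaded and that the ActiveDirectory module is available; the server list, sender address, and SMTP server are placeholders.

```powershell
# Sketch only: $ServerList, addresses, and SMTP server are assumptions.
$Sessions = Get-LoggedOnUser -ComputerName $ServerList |
    Where-Object { $_.State -eq 'Disc' }   # disconnected sessions only

# One report per user ID, mailed to the address found in Active Directory
$Sessions | Group-Object UserName | ForEach-Object {
    $Mail = (Get-ADUser $_.Name -Properties EmailAddress).EmailAddress
    $Body = $_.Group |
        Format-Table ComputerName, Id, State, IdleTime -AutoSize | Out-String
    Send-MailMessage -To $Mail -From 'rdp-report@contoso.com' `
        -Subject 'Please log off your idle RDP sessions' -Body $Body `
        -SmtpServer 'smtp.contoso.com'
}

# Master list for the admin
$Master = $Sessions | Format-Table -AutoSize | Out-String
Send-MailMessage -To 'admin@contoso.com' -From 'rdp-report@contoso.com' `
    -Subject 'Idle RDP sessions - master list' -Body $Master `
    -SmtpServer 'smtp.contoso.com'
```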


The second one I use to kill RDP sessions remotely, quickly and easily. It uses Jaap's "Get-LoggedOnUser" function as well, but I've omitted it from the script box to save space.
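Since that script box is also missing from the archive, the core of killing a remote session can be sketched with the built-in logoff.exe (server name and session ID below are placeholders; the ID would come from Get-LoggedOnUser or quser):

```powershell
# Sketch only: $Server and $SessionId are assumptions.
$Server    = 'someserver'
$SessionId = 2   # e.g. from Get-LoggedOnUser or: quser /server:$Server

# Log the session off remotely, which closes its open applications
logoff $SessionId /server:$Server
```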

Saturday, March 15, 2014

My office is located in EST and we have a data center onsite, but we also have another data center in CST. I don't know why, but every so often we get developers who don't understand that. So I sometimes have to find out why their code is an hour off, or show them that they need to make their code time-zone insensitive. I use this to do it.
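The script itself didn't survive this archive, but checking a remote server's clock and time zone against your own can be sketched with WMI (the server name is a placeholder):

```powershell
# Sketch only: $Server is an assumption.
$Server = 'cst-server01'

# Pull the remote server's local time and configured time zone via WMI
$os  = Get-WmiObject -ComputerName $Server Win32_OperatingSystem
$tz  = Get-WmiObject -ComputerName $Server Win32_TimeZone
$now = $os.ConvertToDateTime($os.LocalDateTime)

"{0} local time: {1} ({2})" -f $Server, $now, $tz.Caption
"My local time:  $(Get-Date)"
```

A one-hour difference in the output is usually all it takes to show a developer why their code should work in UTC instead of local time.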


I decided to give up on the code highlighter and throw some business the way of https://gist.github.com.
Sorry for the sloppy script window here; still trying to figure out how to get it to show up neat and tidy on Blogger.

I have found this function super useful. At work, to save drive letters, many of our servers use mount points, but we still haven't found a friendly way to check on their capacity and storage usage. This, in combination with some other functions, has become a staple of mine when checking system health and planning for additional storage needs.
 
 
 
Function Get-MountPoints {
    Param(
        [Parameter(Mandatory=$True,Position=1)]
        [Array]$PC
    )
    # Calculated properties: convert bytes to GB and work out free space %
    $TotalGB  = @{Name="Capacity(GB)";Expression={[math]::Round(($_.Capacity / 1GB),2)}}
    $FreeGB   = @{Name="FreeSpace(GB)";Expression={[math]::Round(($_.FreeSpace / 1GB),2)}}
    $FreePerc = @{Name="Free(%)";Expression={[math]::Round((($_.FreeSpace / $_.Capacity) * 100),0)}}
    # Win32_Volume includes mount points; skip unmounted volume GUID paths
    $volumes = Get-WmiObject -ComputerName $PC Win32_Volume |
        Where-Object {$_.Name -notlike '\\?\*'}
    $volumes | Select-Object SystemName, Label, Name, $TotalGB, $FreeGB, $FreePerc |
        Sort-Object Name
}
# Usage: Get-MountPoints -PC server01, server02

First Post

I've just decided to create this blog, where I hope to share some of the many scripts, functions and modules I've created. I also hope to share some of the lessons I've learned, usually the hard way, so that others don't have to go through the hell I put myself through.