Thursday, January 7, 2016

PowerShell: Funny Tech Support Excuse Website

Jeff Ballard created a funny website titled "The Bastard Operator From Hell Style Excuse Server." We can put his tech support excuses to use in our own PowerShell scripts by extracting them from his site. Here's a snippet that'll create a string variable for your use:

$url = 'http://pages.cs.wisc.edu/~ballard/bofh/bofhserver.pl'
$excuse = ((Invoke-webrequest $url).parsedhtml.getelementsbytagname('font')|select -expandproperty outertext)[-1]

Here are a few $excuse samples:
  • asynchronous inode failure
  • Interference between the keyboard and the chair.
  • HTTPD Error 666 : BOFH was here
  • radiosity depletion
Now you'll have some great excuses to reply back with while you're working on the real problem.
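
If you want the excuse handy in other scripts, here's a minimal sketch that wraps the snippet in a function (the Get-Excuse name and the try/catch fallback are my additions; like the original one-liner, it relies on ParsedHtml, so it needs Windows PowerShell with Internet Explorer available):

function Get-Excuse {
    $url = 'http://pages.cs.wisc.edu/~ballard/bofh/bofhserver.pl'
    try {
        #Grab the text of the last <font> element on the excuse page
        ((Invoke-WebRequest $url).ParsedHtml.getElementsByTagName('font') | Select-Object -ExpandProperty outerText)[-1]
    }
    catch {
        #Fall back to a canned excuse if the site is unreachable
        'solar flares'
    }
}

Get-Excuse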


Sunday, January 3, 2016

PowerShell: Download Reddit Photos for Windows Background and/or Screensaver

Microsoft Windows® has the ability to display photos for both your computer desktop background and screensaver. I like to keep a fresh rotation of photos, so I decided to write a dirty script that downloads highly upvoted photos, keeps only those in landscape orientation, and requires a width of at least 2048 pixels. I'll go through each part of the process.

Note: I say "dirty" because not all photos will download correctly due to the variety of URLs in each post, and I haven't cleaned up every extraneous URL condition. Since most Reddit posters use Imgur for their photos, I focused on resolving those URLs. Additionally, I write my code in Notepad, not the ISE, so it's in an old skool batch file type format. You can edit it to your liking if your OCD is kicking in.

Let's get started (full script at bottom of this post):

#Pulling the JPG attributes requires a query of the extended file properties.  Here I'm creating the Shell COM object so we can query them later.

$shell = new-object -com shell.application
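
#Aside (my addition, not part of the original script): the property index 31 used later for Dimensions can vary between Windows versions.
#A quick way to confirm the right index on your machine is to list the extended property names for any folder:

$folder = $shell.namespace("C:\Users\MyUserAccount\Pictures")
0..40|%{"{0}: {1}" -f $_,$folder.getdetailsof($null,$_)}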

#Using the Reddit API and copying a sample search URL, I've created the JSON query to pull in the posts
#Note that I'm searching the EarthPorn subreddit for posts that are one month old or less, increasing the result list to one hundred objects, and keeping only posts with a net score of at least 1500

$images = (invoke-restmethod "https://www.reddit.com/search.json?q=subreddit:earthporn&restrict_sr=&sort=relevance&t=month&limit=100").data.children.data|? score -ge 1500
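
#Optional sanity check (my addition): peek at what the query returned before downloading anything

$images|select score,url,title|sort score -descending|ft -autosize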

#I noticed Invoke-RestMethod would sometimes get an SSL error, so I check for that and run the command again if the $images count is zero.

if ($images.count -eq 0){$images = (invoke-restmethod "https://www.reddit.com/search.json?q=subreddit:earthporn&restrict_sr=&sort=relevance&t=month&limit=100").data.children.data|? score -ge 1500}
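
#Aside (my addition, not from the original post): SSL errors from Invoke-RestMethod are often TLS negotiation issues, since Windows PowerShell defaults to older protocols.
#If the simple retry isn't enough, forcing TLS 1.2 before the request and retrying is a common fix worth trying:

[Net.ServicePointManager]::SecurityProtocol = [Net.SecurityProtocolType]::Tls12
if ($images.count -eq 0){$images = (invoke-restmethod "https://www.reddit.com/search.json?q=subreddit:earthporn&restrict_sr=&sort=relevance&t=month&limit=100").data.children.data|? score -ge 1500}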

#Change $destination to your photos folder.  I put mine in my Dropbox subfolder so my other computer gets the new photos too.

$destination = "C:\Users\MyUserAccount\Dropbox\Earth"

#Creating a filename list so I don't re-download the same photos

$filecheck = gci $destination|select -expandproperty name

#Now that I've loaded $shell, created the Reddit $images result list for downloading, and pulled the list of existing photo filenames from $destination, I start the $images loop:

foreach ($i in $images){
#Cleaning up the URL and destination file name
$name = (($i.url).split('/')[-1]).replace('?1','')
if ($name -like "*.gif*"){continue}
if (($name -notlike "*.jpg") -and ($name -notlike "*.png")){$name = $name + ".jpg"}
if ($filecheck -contains $name){continue}
$fullname = (($destination + "\" + $name)).replace('?1','')
if ($i.url -like "*http://imgur.com*"){$i.url = ($i.url -replace ('http://imgur.com','https://i.imgur.com')) + ".jpg"}

#Downloading the JPG
if ($i.url -like "*://i.img*"){iwr ($i.url).replace('?1','') -outfile $fullname}
if ($i.url -like "*.staticflickr.com/*"){iwr ($i.url).replace('?1','') -outfile $fullname}
if (($i.url -like "*.jpg") -and ($i.url -notlike "*://i.img*") -and ($i.url -notlike "*.staticflickr.com/*")){iwr ($i.url).replace('?1','') -outfile $fullname}
[array]$files += "URL: " + $i.url

#Pulling dimension property
$fileinfo = $shell.namespace($destination).parsename($name)
$dimension = $shell.namespace($destination).getdetailsof($fileinfo,31) -replace '[\W]',''

#If dimension field is bogus, replace with temp value so file can be deleted from destination
if ($dimension -notlike "*x*"){$dimension = "2047x1023"}

#Checking for a landscape photo: at least 2048 pixels wide, at least 1024 pixels tall, and a width of at least 1.3 times the height
[array]$files += "Dimensions: " + $dimension
$horizontal = [int]$dimension.split('x')[0]
if ($horizontal -lt 2048){remove-item $fullname -force;continue}
$vertical = [int]$dimension.split('x')[-1]
if ($vertical -lt 1024){remove-item $fullname -force;continue}
if ($horizontal -lt ($vertical * 1.3)){remove-item $fullname -force;continue}

#If photo passes all checks, that file name is added to $files
[array]$files += "Success: " + $name
}

#End of loop, posting errors and results to $destination folder
$files|out-file ($destination + "\files.txt")
$error|out-file ($destination + "\errors.txt")


Next, save the script and add it to a Scheduled Task.  I used the following command in the task:
Program name: Powershell.exe
Arguments: -WindowStyle Hidden -ExecutionPolicy Bypass -File c:\scripts\RedditBackgrounds.ps1
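
If you'd rather create the task from PowerShell instead of the Task Scheduler GUI, here's a rough sketch using the ScheduledTasks module (Windows 8 / Server 2012 and later); the task name and the 7 AM daily trigger are just placeholders:

$action = New-ScheduledTaskAction -Execute 'Powershell.exe' -Argument '-WindowStyle Hidden -ExecutionPolicy Bypass -File c:\scripts\RedditBackgrounds.ps1'
$trigger = New-ScheduledTaskTrigger -Daily -At 7am
Register-ScheduledTask -TaskName 'RedditBackgrounds' -Action $action -Trigger $trigger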

Now you'll have a scheduled task which automatically keeps your background and screensaver collection fresh.

Actual script:

$shell = new-object -com shell.application
$images = (irm "https://www.reddit.com/search.json?q=subreddit:earthporn&restrict_sr=&sort=relevance&t=month&limit=100").data.children.data|? score -ge 1500
if ($images.count -eq 0){$images = (irm "https://www.reddit.com/search.json?q=subreddit:earthporn&restrict_sr=&sort=relevance&t=month&limit=100").data.children.data|? score -ge 1500}
$DEarth = "C:\Users\MyuserName\Dropbox\Earth"
$destination = "c:\scripts\earth"
$filecheck = gci $DEarth|select -expandproperty name

foreach ($i in $images){
$name = (($i.url).split('/')[-1]).replace('?1','')
if ($name -like "*.gif*"){continue}
if (($name -notlike "*.jpg") -and ($name -notlike "*.png")){$name = $name + ".jpg"}
if ($filecheck -contains $name){continue}
$fullname = (($destination + "\" + $name)).replace('?1','')
if ($i.url -like "*http://imgur.com*"){$i.url = ($i.url -replace ('http://imgur.com','https://i.imgur.com')) + ".jpg"}
if ($i.url -like "*://i.img*"){iwr ($i.url).replace('?1','') -outfile $fullname}
if ($i.url -like "*.staticflickr.com/*"){iwr ($i.url).replace('?1','') -outfile $fullname}
if (($i.url -like "*.jpg") -and ($i.url -notlike "*://i.img*") -and ($i.url -notlike "*.staticflickr.com/*")){iwr ($i.url).replace('?1','') -outfile $fullname}
[array]$files += "URL: " + $i.url
$fileinfo = $shell.namespace($destination).parsename($name)
$dimension = $shell.namespace($destination).getdetailsof($fileinfo,31) -replace '[\W]',''
if ($dimension -notlike "*x*"){$dimension = "2047x1023"}
[array]$files += "Dimensions: " + $dimension
$horizontal = [int]$dimension.split('x')[0]
if ($horizontal -lt 2048){remove-item $fullname -force;continue}
$vertical = [int]$dimension.split('x')[-1]
if ($vertical -lt 1024){remove-item $fullname -force;continue}
if ($horizontal -lt ($vertical * 1.3)){remove-item $fullname -force;continue}
[array]$files += "Success: " + $name
}
$files|out-file ($destination + "\files.txt")
$error|out-file ($destination + "\errors.txt")
move-item ($destination + "\*.*") $DEarth -force