Monday, December 23, 2013

Followup thoughts about screencasting as a documentation tool...

I am convinced this is the very best way for me to make some serious headway on my documentation brain dump. After 4 days, I have captured 2 hours and 16 minutes of screen time. This is the video library so far...

I am trying to come up with a naming convention that lets me group a category of videos together, such as "Quick Tour" and "Docuvid". When those terms or phrases are searched on Drive, they display a flat list of all "chapters" in that series. The simpler the naming convention, the better.

The benefit of using this technique to group these together with search is that the files can be located multiple ways. They can be located in the tech library and opened by drilling down through folder structure, which is arranged in what seems like the most logical fashion, but can get changed as the documentation structure matures. The search approach makes their logical location irrelevant, and is therefore more suited to a growing and changing layout.

After I did a few recordings, I shot a note to our training coordinator with a link to a few samples. I suggested that perhaps the quick tour concept would help with training for all staff, on a number of different topics. All it takes is time and money, right?

The more of these that I make, the better I feel about getting it done. Although screencasting doesn't make the IT training and documentation material magically create and direct itself, it does make the process of capturing information much more fluid.

Thursday, December 19, 2013

Quick Tour IT training and documentation videos... or How Screencasts Saved My Sanity!


Right? I will be the first to admit that I am not exactly fond of doing documentation for my job. It's mind numbing, even to a geek like me, but it hasta get done!

By documentation, I mean the tracking, recording, and updating of information on all the bits of the agency IT infrastructure; all the router configs, IT service logins, server setups, network addresses, software and hardware configurations, procedures, protocols, and requirements for all the doohickeys.

I have done significant documentation with Google Docs and Sheets. That is certainly a more immediate, more "malleable", and focused view on any given detail of IT knowledge, but putting that kind of documentation together requires a concentrated effort free of distraction. It must also contain a certain amount of beginner's perspective in order to avoid leaving out "assumed facts". It is hard to put those details in a "typical usage scenario" context. Producing screencast videos of these details shows the workflow AND captures details on screen that can easily be neglected (or presented in such density as to be overwhelming) in printable docs.

I have been on a mission this week to produce one or two "quick tour" videos a day of mission critical services and admin consoles using Screencast-o-matic. The videos are usually between 5 and 10 minutes long, and are uploaded to the Google Drive directory for my staff IT account. The documentation is then securely shared with select senior staff as a contingency. Finally, I created a subdomain to consolidate and make the entire list of videos accessible at

Being a mostly solo support person with the majority of knowledge trapped in my skull, I am understandably concerned about providing the MOST transfer of that knowledge to a form easily accessed and digested by someone besides me if the need arose. If I am doing this right, someone with a reasonable amount of Windows and WAN/LAN experience should be able to spend a day or two watching these videos and have a solid grasp of the lay of the IT land around here if the someone who is me is not available.

If there's one thing I strive for in this job, it's to be prepared. An IT department cannot be prepared without documentation. Producing these kinds of videos is not all that needs doing to achieve this goal, but it serves a lot of information efficiently. I predict that this will help me fill out the agency "tech library" in record time.

Not that I'm going anywhere!

Thursday, October 24, 2013

Simple vs. Efficient in Desktop Management Strategy

In my time as ARC IT Guy, the quest to Keep It Simple has been a driving force behind many of the choices I've made about how the infrastructure developed here. Admittedly, Simple to me has an element of "existing familiarity" when it comes to implementing tech, and being familiar with a system already reduces the resistance I have to finding a fit for it long-term. Simple also takes into account the need to make IT administration roles and duties accessible and quickly absorbed if a backup IT person is brought in to temporarily or permanently serve as IT Coordinator. SAManage, as our ITSM platform, demonstrates this principle.

If I have to figure in the time cost to learn something new, that weighs heavier in the evaluation of new IT components for long-term deployment and management. Having said that, there are times when Simple starts creating more work for me as it scales up, and a new approach would mean the same or less work long term, at the cost of a significant learning and labbing curve for the person charged with implementation. The test of Simple vs. Efficient is popping up a lot with the H2T project, mostly in the context of Active Directory, DNS, and domains in general. We do not currently employ any of those mechanisms in our environment.

As I unroll the map of the H2T project, and also listen to how others have seen this same kind of effort succeed or falter, there are certainly indications that reconsidering my stance on AD and DNS adoption here would be in our best interest. With the proliferation of these server features and the platforms they enable, there is a better chance of emergency IT support availability from people already versed in their management. It is now incumbent on me to make sure the Simple is more a reflection of well-established strategies for managing resources at our level, and in the way we use those resources.

And so, I make room in my brains for a learning binge. That string around my head in the blog header is all that holds it in some days!

Friday, October 18, 2013

Windows 7: A multi-profile, concurrently accessible RDP host? Hypothetically...

Say there was a geek in a hypothetical IT lab situation who discovered how to enable Windows 7 as a concurrently multi-user RDP host. A user density of 7 per virtual desktop is the goal. Said geek had enough Windows 7 licenses to cover each user instance connected to a host, but wonders...

Is this skirting a hard EULA violation worthy of vendor wrath? If a lab discovers how to make this work, could said shop cover the additional profiles on a single machine using CALs vs full Windows 7 licenses going forward?

What does the EULA for Windows 7 Pro say about all this witchcraft?

From Section 3 

f. Device Connections. You may allow up to 20 other devices to access software installed on the licensed computer to use only File Services, Print Services, Internet Information Services and Internet Connection Sharing and Telephony Services.
g. Remote Access Technologies. You may access and use the software installed on the licensed computer remotely from another device using remote access technologies as follows:
· Remote Desktop. The single primary user of the licensed computer may access a session from any other device using Remote Desktop or similar technologies. A “session” means the experience of interacting with the software, directly or indirectly, through any combination of input, output and display peripherals. Other users may access a session from any device using these technologies, if the remote device is separately licensed to run the software.
· Other Access Technologies. You may use Remote Assistance or similar technologies to share an active session.
Also figure into the equation that all end-point RD guest kiosks will be running as Windows 7 ThinPCs, and the hypothetical shop has the Software Assurance to migrate the full OS to a virtual machine.

Pondering this frankensteining of the go-to desktop for this shop, questions begin to arise.
  • Why would a lab even go down this road? 
  • Why not just go the straight route and get CALs on Server 2008? 
  • Is Server 2008 overkill for the desktop experience? 
  • Can 7 be to Server 2008 what ThinPC is to 7, in the context of simplified, stripped down multi-user virtual desktop hosting roles? 
  • Does the Remote Desktop/Other Users feature of this agreement validate the course taken by this mad scientist?
To that last question, hypothetically, yes.

Wednesday, October 16, 2013

HipChat Lightning Review - Is ARC ready for a group chat platform?

I was perusing Google Apps Marketplace this morning to see what's new out there and found HipChat, a multi-platform business group chat service. I am not sure why this one caught my eye initially, except that it had Google Apps integration. I watched the peppy-background-music video and then goog'd hipchat nonprofit, whereupon I discovered, for us, it is free.

OK, so it integrates with GApps. How much do we use the baked-in IM in webmail? Would we use it more if a chat room space was available for multiple users? Would it really cut down on frivolous email chains? That remains to be seen. I signed us up. Sure enough, after approving API access for our domain to HipChat, it showed up in the More menu.

Ease of deployment is at the top of the Good Stuff list.

Staff clicks through More > HipChat and is greeted with a form asking for Job Description and a password. I was somewhat puzzled by the prompt to set a password since it's supposed to be passthrough auth, but I used my agency email password thinking it would then enable passthrough. In any case, the app is where it needs to be in webmail, and all other logins can be done in the background.

I set up a 1340arnold chat room with 2 accounts and took a second to figure out how to use the chat window. Found that and started a chat. First thing to configure: disabling the new-chat audio bell. Here's a screenshot of settings for the web client...

Yes, definitely sound off.

I am going to toss this out there for admin staff to try. My sense is that it could be easily adopted, simply because of how pervasive IM has become. I don't want it to become burdensome. If it truly helps cut down on email, then it has value to staff in general.

From an admin standpoint, it automatically recognizes my admin status in Gapps when I log into the web control panel. At the top of the window there is a tab for "Group Admin" where more granular control of permissions can be assigned and managed. Other management modules include individual configuration of notification parameters and paths, browsing of rooms and users, and deletion of account. 

I am curious to see if this gets a good reception and becomes useful. For the cost, it's worth the effort. Staff can access it by going to

Suggested uses...
  • File sharing between staff and guests, staff and staff
  • New employee assistance - new hires can ask questions in monitored rooms to get answers about how things work around here; guided answers can eventually be collected into a FAQ, but chat history is available to search in the meantime
  • New room creation request channel
  • Join request channel for private room access
  • Program, workgroup, or task specific rooms can provide support for staff in every program; facilitates discussion of protocol, method, tools, and guidelines
  • Broadcast information for mobile groups and teams without impacting email
  • State of The Agency broadcast channel
  • HipChat allows for guest access via private URL to specific rooms, enabling instant support channels for families by ARC staff or having group discussions with staff from other agencies 
  • Private channels for management at each level
Honestly, I am not sure this will be any more useful than Google Groups has been, but IM is less structured than Groups, and as such less cumbersome to manage. That reminds me, I should figure out a good use for Groups. A plus for Groups is that there's a Manage option in user permissions, with no equivalent in HipChat that I saw. There is also no obvious "ban" option when guest mode is enabled for a HipChat room, other than disabling and re-enabling guest mode (which generates a new URL when re-enabled).

In summary, HipChat is an interesting option for expanded agency and community collaboration services, falling somewhere between Google Talk's one-to-one approach and Google Groups' "bulletin board" platform on the function spectrum. 

Does anyone really want to manage or absorb another information input source? Adopting new communication tools requires that those tools offset the overhead of adoption with rapidly realized USEFUL benefits in the span of time it takes to learn where all the buttons and options are. Too much for us right now?

Time will tell.

Edit 10-17: A colleague pointed me at this review of Hipchat and 2 other group chat options. Looks like I picked the right one.

Sunday, October 13, 2013

Product Impressions, First Look: ThinKiosk v4 by Thinscale Technology

I have frequently paused to consider doing a product review over the last couple weeks, as I evaluate and digest the options available for managing kiosk/VDI based interface across the agency. 

I have come to learn:
  • there are several interesting contenders in the field of kiosk PC management, and they range from too expensive to free (openthinclient, roll-your-own interface with HTA, several Linux options)
  • having a clearly defined goal and project parameters, with a logical progression of implementation laid out, makes a difference in refining candidacy for adoption
  • writing things down is helping me look at our options with a better sense of organization
  • I should probably use a gantt chart or mindmap at some point
  • I must give more weight to simple approaches and figure out how to measure the workload required in both initial deployment and long term management of new components to my administrative duties
  • I am enjoying this process more than I would have thought
  • there is soooooo muuuuuch moooooooore to learn
In the case of ThinKiosk - a suite of end-point client profile creation, deployment, and management tools I am currently taking a run at - I am stopping my flurry of research and eval to give an initial impression. Version 4 of this suite was released in September 2013 after a stretch of intense revision from previous versions by Andrew Morgan and his team.

So what features/aspects/problem-solving/superhero bits does this product have that compelled me to write it up? Certainly there are other products with very similar functionality, but these are things that grabbed me right off about ThinKiosk. 

Keep in mind as you read, these are observations from the perspective of a non-profit IT admin lone ranger with limited resources, and a completely full plate.

And in no particular order, these are the things I appreciate most so far:
  • RESPONSIVE DEVELOPERS - this is what gets me to recommend spending money on software for our agency, when there are free options for almost every function an IT shop oversees (but which can end up costing more in time and effort to prep the components and put prerequisites in place).
  • client and server are both Windows-based. Windows I know; fiddling with Linux is not something I want to take on right now
  • ease of installation and config for all components (though not on my first try)
  • the LACK of requirement for AD to be incorporated to make it work
  • a central repository for machine interface profiles that isn't AD-based
  • an end-point remote control feature
  • somehow enables an admin to make the machine more secure and more accessible at the same time, in a ridiculously straight-forward way... without AD
  • SILENTLY deployable client with command line options as msi
  • it could very well let me do the 40 remaining desktop XP upgrades remotely
  • full screen shell alternative to explorer, with auto-login, customizable and secure enough to create a dual guest/staff interface to appropriate resources
I have spent a few hours now digging around in the documentation, poring over the support forum, reading product literature, installing, configuring, testing, cursing, uninstalling, and reinstalling... AND I managed to get the dishes and vacuuming done during all that!

Although my initial installation met with complications based on my own configuration of desktops and network using VPN, the reinstallation on another server using a different network ingress functions the way I would expect. Accounting for the learning curve, the server and first installed client (on a remote Windows 7 x86 PC at the office) do everything I ask.

To expand on my list above...

As much as I would like to give Linux a place in our infrastructure (for both economic and platform flexibility reasons), I have to be practical and know when to bail on an experiment with Windows alternatives. I have to be able to see the long-term management cost, and any possible overhead during rollout that would cause me to double back and start from scratch. I encountered this a couple of times with VPN development and deployment, when a solid product or vendor that smoothly integrated early on and held up in production later developed some fatal shortcoming and I had to start over.

Having an easy installation, for both client and server, is key to making progress and being able to dedicate attention to nuances of configuration before blasting out to the universe. Getting the basic installation down and having a handle on the configurations quickly is what lets me get to putting the product through its paces. If a product shows enough promise, I will slog through a few hurdles to make it work. ThinKiosk has been fairly well-behaved for a Windows 2008r2 server and Windows 7 x86 client install.

The frequency of Active Directory being featured on my list is directly proportional to how much I am really trying to avoid having to figure it out RIGHT NOW and add it to the prereq list. Don't have that luxury of time or brains to spare yet. Having to consider setting up AD and DNS puts a candidate product in the same class as having to mess around with Linux, in terms of effort cost. Don't get me wrong, I would like the user and configuration management goodness that AD represents, but it has always seemed beyond the scope of what I can manage for the agency. Maybe someday. For now, with ThinKiosk, that is a non-issue.

I have gradually done what I can to centralize critical IT functions over the last 6 years when the opportunity or solution presents itself. There have been 2 key developments for IT in the last year for the agency that open the doors for the centralization of more mechanisms: upgraded data service at most of our sites, and a VPN infrastructure. In a project such as our H2T initiative, I am taking on an aspect of the desktop experience that hasn't needed as much "hands on" after a deployment as this will. To scale this rollout up after the initial tweaking, a mechanism to manage the interface on each machine from one place is the only way I can keep on top of this in the long term. ThinKiosk's management console makes client configuration and profile deployment very easy.

I would argue that there is no way to effectively manage an IT infrastructure at any size deployment without a remote control tool for remote troubleshooting. Sometimes there is just no substitute (or amount of patience) for trying to talk someone through navigating Windows. Since being there in person for this is very much not an option for me with 11 sites spread all over the county, I need this "be anywhere" magic in the mix. One of ThinKiosk's major benefits is the remote control feature that allows an admin to shadow the client, even when they are in a remote desktop session. I have been testing another product from IntelliAdmin that provides the same mechanism from the other side of the remote desktop session, on the host. Both have their place and value in my toolbox. I tend not to think one can have "enough" remote connection options, honestly.

One of the biggest challenges I have faced in my desktop support career is finding that balance between maintaining PC security and giving folks the option of customizing their desktop environment. If you don't lock things down tight enough, or conversely lock them down too tight, support call volume WILL increase. I don't want to get calls about either a compromised desktop OR an app that won't start or webpage resource that won't work because of UAC. 

The challenge ThinKiosk takes on, and subsequently conquers, is enabling an endpoint framework that thoroughly secures the OS, but supports the means for anyone using the client to access information or services easily, be it staff, management, or visitors. One of the options available with the client is auto-login to a profile created by the ThinKiosk install (the option for using other existing logins is also available). It can allow basic functions such as web browsing (or use of any other app on the PC as set by the kiosk profile) that require no special permissions, and also pre-configured remote desktop shortcuts that connect staff securely to VDI sessions. As such, the end point client becomes much less of a potential attack surface, and existing staff desktops become even more secure. Simply amazing (to me, anyway)!

The ThinKiosk client can be installed using msiexec, with various command line options to configure the connection broker server, port, and user login. Installing things from the command line is a giant time saver, especially when used in conjunction with a tool like psexec. The mind-numbing click-fest that is software installation can be avoided, and kicked off after hours to wrap up a deployment with minimal effort.
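To make the after-hours scenario concrete, here is a hedged sketch of what that push might look like. The MSI property names (BROKERSERVER, BROKERPORT), the share path, and pcs.txt are hypothetical stand-ins; the actual properties the ThinKiosk installer accepts would come from its documentation.

```shell
:: Sketch only: silent push install of a client MSI to every PC listed in
:: pcs.txt, run under the SYSTEM account via psexec's @file syntax.
:: BROKERSERVER/BROKERPORT are hypothetical placeholder property names.
psexec @pcs.txt -s msiexec /i "\\serverpath\ThinKioskClient.msi" /qn ^
    BROKERSERVER=tk-broker.example.local BROKERPORT=8080
```

The /qn switch suppresses all installer UI, which is what makes scheduling this after hours practical.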

Factoring into the decision process for how to make H2T a reality is the current effort to migrate remaining XP desktops to Windows 7 by April of 2014. Up til now, I have been rebuilding those PCs by hand, one at a time. With ThinKiosk, I have the option of leaving those machines with XP on them til a later time, but still enable the Windows 7 experience with RDP shortcuts. I will still need to put Windows 7 (ThinPC) on them, but in the meantime I can put the interface on these PCs that they will still be using even after an OS upgrade on the client. This is a consolidation of effort that benefits 2 projects.

I have already touched on this, but separating the production desktop experience from the end-point machine makes it possible to increase resource access for all staff. Their individual desktop experience is no longer tied to one machine. If one end-point is down, they can pick up where they left off on another one. Guests can make use of end-points for internet access without compromising the agency infrastructure. ThinKiosk really simplifies the process of making staff desktops more secure, but paradoxically more universally accessible.

So far I am really impressed with the potential advancements I can make with our infrastructure by using ThinKiosk for our end-point management. More as it develops.

Friday, October 4, 2013

Shift Happens Phase 2: Bye bye, ESXi!

Part of staying nimble enough as an IT survivalist is learning how to use the right tool for the right job, and not get hung up on vendor or environment loyalties when it requires extraordinary measures to make a solution fit. 

In my case, this week, that has meant assessing ESXi as a hypervisor for the H2T project and discovering a few hurdles along the way. We already use it as the hypervisor for our admin servers, so I am familiar with the management aspect. I googled around for a while to get a sense of what VDI/RDP options would fit into the equation. And I learned:

  • ESXi 5.5, which I wanted to give a run since it is the next iteration of what we've installed, is crippled in a significant way: it's limited to a 60 day trial that requires vSphere to enable the vCenter fat client. A discussion of 5.5 here.

    In general ESXi presented other challenges in the context of this project:
    • We don't have enough host licenses to deploy after POC
    • It's a pain in the ass to interact with/mount a VMFS volume for faster data transfer in some cases (not possible with Windows except via SFTP to the datastore on host). This would seriously hamper efforts to migrate Windows machines to the VDI environment using TIB images to convert the machine directly on the host.
    • It didn't recognize my RAID adapter on install
    • The host doesn't image with Acronis
  • ThinLinc server:
    • Not hard to install on Ubuntu, but configuration to use with Remote Desktop services was not as clear as I'd hoped
    • ThinLinc was only the gateway piece, I still expected some struggle and learning curve on the Windows server side of things
  • Windows Server 2012 installed on bare metal:
    • We have enough licenses to get through POC, beta, and phase 1 rollout
    • Is very simple to mount a VHD, attach it, and transfer a large file into it for immediate use by a virtual machine
    • It recognized my POC box's RAID adapter
    • The host will image with Acronis
    • Has a number of other excellent features not available in the free version of ESXi.
My reluctance to start off with Windows Hyper-V was based on anecdotal experiences regarding the version on Server 2008. Server 2012 seems to have removed those challenges and so far has been a dream to work with. I was able to install the OS and enable Hyper-V, build a VM with an Acronis CD image and the agency Windows 7x64 TIB image, all in about 2 hours.

Some initial testing with Hyper-V has shown promise. I configured a Win7 VM with 2GB memory, 50 GB disk space, and 1 CPU. The resources have been throttled to a max of 16%, which means on the test box at that threshold I could run 5 VMs. I would like to have a density of 10 VMs per host minimum (with 2 concurrent users per guest), but that is a fairly arbitrary number and needs further exploration.
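The back-of-envelope math behind those numbers can be sketched as shell arithmetic (assuming, as the post does, that the per-VM CPU ceiling is the binding resource; reserving one slot of headroom for the host is my reading of why 5 rather than 6):

```shell
# Capacity check for the test box: how many VMs fit under a per-VM CPU cap?
cap=16                       # per-VM CPU ceiling, percent of host
fit=$((100 / cap))           # VMs that fit on paper
usable=$((fit - 1))          # minus one slot of headroom for the host
echo "cap=${cap}% fit=${fit} usable=${usable}"
# -> cap=16% fit=6 usable=5
```

By the same arithmetic, hitting the 10-VMs-per-host target would mean dropping the cap to roughly 9-10% per VM, which is exactly what the tuning pass described below would confirm or refute.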

I tested responsiveness from a local RD session as well as remote (RD into the Helpdesk server, out to a machine in Richmond, and back into the test VM at home). I connected to webmail and ran a YouTube video. It was pretty snappy, considering the remote client config and the lower bandwidth of that remote site. In the host monitor console, the VM never used more than the configured 16%.

I will add a few more VMs and tune the upper resource limit to see where the connection slows down.

Bottom line is that I feel like I am making greater progress with Server 2012 Hyper-V than I was with ESXi. ESXi will continue to host our servers, but for the H2T project I need a host platform with a more familiar environment. So Hyper-V it is.

For now.

Tuesday, October 1, 2013

Free automated ESXi v5 VM backups for those of us on the FREE Edition!

10-01-2013 - Was just reminded of a low budget backup alternative I cobbled together while reading and responding to a post over on the IntelliAdmin website, so decided to post the mind-numbingly geeky details of the mechanism here in case I forget where else I might have put them.

All of our agency server instances run as guest OSes on an ESXi host. I have backup jobs scheduled from one of those servers to take care of nightly production data archives, but no automated mechanism for backing up the OS volumes from the host datastore itself. I briefly checked into Veeam's products, but we don't have an Essentials license for our hosts (or the cash for the automation upgrade in Veeam's full product), so it was a non-starter. Hot backups would have been nice for VM archiving, but if a window for offline archiving exists, then this tool is good for that situation.

I figured out how to perform a command-line sync of folders on the ESXi host's datastores using a batch script that will THEORETICALLY run as a Windows scheduled task (on schedule or manually, from any host on the same subnet) that:

  • runs a batch file that connects PuTTY (free, portable install on server share) via SSH to the ESXi SSH server (which must be enabled and set to run on host startup)
  • (will EVENTUALLY, after more testing) issue a shutdown command to the VM (no vSphere client or target VM console connection required)
  • runs WinSCP (also free, and portable install on server share) via a command script and kicks off a synchronization between an ESXi datastore folder containing your (shut down/powered off) VM and a local or network folder
  • (will EVENTUALLY, after more testing) issue a startup command to the VM 
This SEEMS to work elegantly, and as stated, the job can be run from any host on the network logged on as the designated agency backup user, from portable versions of PuTTY and WinSCP, unattended.

The script to connect the SSH session must be run first, and looks like this:

\\serverpath\putty.exe -ssh [ESXiuser]@[serverIP] -pw [ESXipw]

Running this the first time on a machine prompts for SSH host key confirmation.

The batch script then goes on to spawn a WinSCP sync session:

\\serverpath\WinSCPPE.exe /console /script=\\serverpath\[winscpbackup script].txt

In [winscpbackup script].txt we have:

open sftp://[ESXiuser]:[ESXipw]@[serverIP]
cd /vmfs/volumes/[targetbackupdatastore ID]/[target folder]
lcd \\[backupserver path]\
option transfer binary

synchronize local

To find the value for [targetbackupdatastore ID], you will need to connect once to the ESXi host with the WinSCP GUI to browse to the datastore folders (from root >vmfs/volumes).
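Alternatively, the ID can be read from the SSH session itself; on an ESXi host, datastore labels under /vmfs/volumes are symlinks whose targets are the volume IDs. A quick sketch, run inside the PuTTY session:

```shell
# On the ESXi host (inside the PuTTY/SSH session): each datastore label
# appears as a symlink, and the target of the symlink is the volume ID
# to use for [targetbackupdatastore ID].
ls -l /vmfs/volumes
```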

This backup will only run to completion if the target VM files are not locked, requiring powering off the guest OS. This can be done from command line in PuTTY as follows:

For command line shutdown (logged into host using ssh putty)... 

vim-cmd vmsvc/getallvms 

To get the current state of a virtual machine: 

vim-cmd vmsvc/power.getstate <vmid> 

Shut down the virtual machine using the VMID listed in the first column of the getallvms output, and run:

vim-cmd vmsvc/power.shutdown <vmid> 

Note: If the virtual machine fails to shut down, use this command: 

vim-cmd vmsvc/power.off <vmid> 

Once backup has completed, powering on the machines from command line is as follows:

Check the power state of the virtual machine with the command: 

vim-cmd vmsvc/power.getstate <vmid> 

Power-on the virtual machine with the command: 

vim-cmd vmsvc/power.on <vmid>

This is a wholesale sync of all data in the datastore folders of each VM. In order to reduce redundant backup of data managed by other, more configurable, finer-grained backup tools, it is good practice to put the data drive instance of a server in a separate folder from the OS partition. If that is not possible, the WinSCP script might be enhanced using file type wildcards.
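WinSCP's synchronize command accepts a -filemask switch that could carry those wildcards. A hedged sketch of the last line of the script above, where the "data" disk naming is purely hypothetical (a mask takes the form "included|excluded"):

```
# exclude a (hypothetically named) dedicated data disk from the sync;
# everything before the | is included, everything after is excluded
synchronize local -filemask="*|*data*.vmdk"
```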

DISCLAIMER: While I have stepped through these procedures manually, I have not yet tried this mechanism as a scheduled task run without my eyeballs on it. I am not sure how the system will handle the PuTTY and WinSCP instances in non-desktop mode. Will be testing this before the end of the month.
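One possible way to sidestep the non-desktop-mode question for the SSH leg is plink, the command-line connection tool that ships with PuTTY: it runs a single remote command and exits without opening a terminal window. A sketch, using the same [bracket] placeholder convention as the scripts above:

```shell
:: Sketch only: non-interactive shutdown of the target VM via plink.
:: -batch disables all interactive prompts, which is what a scheduled
:: task needs; placeholders follow the post's [bracket] convention.
\\serverpath\plink.exe -ssh [ESXiuser]@[serverIP] -pw [ESXipw] -batch ^
    "vim-cmd vmsvc/power.shutdown [vmid]"
```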

This post is referenced in the Contra Costa ARC Data Backup Summary [internal link].

Monday, September 30, 2013

Shift Happens: Meet Tootie... the H is silent!

Q1-Q4 2014 - Ubiquity Network
Virtual Desktop Deployment
Formerly H2T ("Here To There")

Project Goal: reduce the number of standalone PC client OS installs through the use of Remote Desktop Server and low-load virtual distributed desktop architecture.

Primary Benefits: reduction of dollar and time cost of standalone OS in maintenance and management ... extended deployment of existing client workstations with insufficient resources to run post-Win7 OS.

The Pitch:

With the challenging times facing our agency, we are having to ask staff to take on more, to be many places and play many roles at once. With the mobility requirements these staff members now face, access to the information they need, and the tools to process it, must increasingly be "unlocalized" from their standpoint. In other words, their data and apps need to follow them, to be called up and look the same no matter where they are being accessed.

Enabling access to your desktop environment from a consistent interface on any internet-connected computer - independent of the OS, the location, or the network - is the intended outcome of this project.

This organization will continue the evaluation and promotion of remote desktop environments - which it has already begun through the introduction of Microsoft Remote Desktop (Q2 2013) - for staff who need access to their work materials and tools from wherever they are. Previous to that it was LogMeIn (2008-2013), so the remote desktop concept has been exercised here for several years in one form or another. 

The H2T "Here To There" Ubiquity project is evolving out of that initiative. The goal is to maximize the value of our updated data service and its reliability; and, for desktops, not just to extend the usable life of the current PCs in agency inventory, but also to provide access from non-agency PCs, making calling up "your" desktop a simple process: type in a short web address (eg "") and use your email user and password to log in to your familiar Windows 7 workspace, just as you left it the last time you logged in. Your stuff and your space, any place.

This is foundational infrastructure for a "secured anywhere desktop" initiative to enhance staff access to agency-critical computer resources, leverage existing hardware installations, enhance data security, and reduce administrative overhead inherent in Windows user/data security and management workload.


Thoughts and jots ... Updated 10-10-2013

I will be creating a proof of concept test environment with ESXi v5.5 for a new client topology utilizing Cendio's ThinLinc (free 10-license pack), Windows Server 2012, and Hyper-V to centralize and virtualize a Windows 7 client experience at any internet-connected PC, including access through an HTML5-capable web browser.

POC Deliverable:

  • "Anywhere" Windows 7 desktop access with environment and data access spawned based on user, role, department, and organization variables submitted securely.
  • More IT security with integrated data access rules and centralized profile controls
  • Less effort to maintain standalone Windows 7 desktop installations as they are converted to ThinLinc native Windows 7 ThinPC "terminals".
  • The same access to unique applications (Boardmaker, CSS Databases) that require CD (or floppy!) media to function, with that media, as well as flash media, available at the station being used to log in.

Observation 09-30-2013: Really, after slogging through the desktop upgrades of XP machines, I think I would like to make this the last time I have to do a bulk, by-hand operating system upgrade for this agency. I will find out quickly what admin overhead this could either increase or reduce. 

I will explore: 
  • running the ThinLinc/RDP connections via native Win7 RD clients OR HTML5-enabled browsers on existing desktops
  • installing the ThinLinc "thin7" client on current Windows 7 installs
  • booting ThinLinc Client Operating System (TLCOS) Windows 7 ThinPC from a VHD on a few current Windows 7 machines, bridging the gap between OSes for a while
  • building bare metal single OS (thin7) Here-To-There client "H2T" 
  • introducing non-MS desktop environments (Mint Linux?)
  • deploying TLCOS on Raspberry Pi hardware
  • deploying Linux OR Android clients on microPCs (MK802IV SE)
  • Alternatives to Active Directory that would play nice with Google Apps user management APIs
  • Alternatives to VNC for remote control of user session
    • 10-01: discovered how to use RD remote control, and where to change permissions on users to allow viewing (for Server 2008).
    • Also found IntelliAdmin's Remote Control product, which has the added benefit of being able to choose among logged-on users for both Server 2008 AND WINDOWS 7!!! Much more stable than EchoVNC.
  • calling it Tootie (because the H is silent)
  • update 01.29.2014 - decided to rename this whole thing Ubiquity, well because it sounds cooler and has more meaning

The ThinLinc Windows Server 2012 Hyper-V server will be installed as a host OS on the ESXi host hardware (Intel i7, 32GB), as well as an instance of our current Win7Prox64 image (as the virtual desktop host guest). I am hoping to set up the virtual desktop host (VDH) guest instances as NON-DOMAIN clients if it can be done. Taking on the Active Directory bull while attempting to shift the desktop paradigm might be biting off the unmanageable.

Much braining to do.


10-10-2013 - Having spent lots of time looking at platform options for both client and server, it looks like the most straightforward and cost-effective way to go is Windows 7 ThinPC (or a kiosk architecture such as ThinKiosk) running on existing desktops (no less management needed; Software Assurance makes the OS "already paid for"), connecting to virtual machines running the full Windows 7 desktop experience, with all the management features in place.

Friday, August 9, 2013

Managing Windows Imaging By Hand - Tales of Cloning

Recently I have been working on resolving The XP Conundrum for the agency before it's staring over my shoulder. This involves a lot of OS rebuilds and new machine deployments. I came across a vendor who offered a more automated, networked, remote-enabled method of managing these images and their application. Although it was, as expected, way out of our budget to even implement, let alone maintain, I had a chance to hash out my current build and image management approach. 

I maintain 3 "gold" images: XP Pro 32-bit, Win 7 x86, and Win 7 x64. I have not used or updated the XP image in a long time, as any new machine I purchase now is quite capable of running the W7x64 OS. The W7x86 image is used on older machines being upgraded from XP at this point (upper end Pentium 4s with 1 GB of RAM). The product I use for applying those images is Acronis Backup & Recovery Workstation (v10) with Universal Restore. The Universal Restore feature allows the application of these images across disparate hardware vendors (I build all our machines from whatever DIY kits are on sale from Newegg). 

Generally, it takes about 20 minutes to apply an image, and another 15 to finish configuration at deployment time (not counting data transfer from old machines or printer setup). We don't have any configuration variance to speak of, which makes it less of a headache to manage from my end. Everyone uses pretty much the same software complement, and any special apps they need they install themselves (except for a couple machines for accounting and payroll). 

On average, I am building or rebuilding one or two machines a month. With this push to retire XP, I am hitting a rate of about 5 machines a week til I'm done (about 70 machines at this point).

My situation is not especially involved in terms of machine configuration management, so doing builds "by hand" is not cumbersome. I am not running a domain, and the only unique configurations are in the form of staggered backup and app update schedules. Our only 2 enterprise server instances are 2 Windows Server 2008 machines; one for VPN and one for central backup repository. 

The burden of being the solo computer guy is balanced by the fact that I am in complete control of the process and decisions about how to execute a strategy. I do what I can to centralize my operations (using BatchPatch for centralized Windows Update management, NiniteOne for core application updates, and psexec with batch scripts for everything else), but my decisions to implement any tool or framework are driven by both budget and a desire to "keep it simple". 

I have to measure the efficacy of implementing any new tool against the learning curve and effort to maintain any associated configurations and server components. I can't afford to dedicate too much time to any one piece of my job, because I am doing it all. You can get a sense of my scope if you visit my LinkedIn page here.