A few months back I wrote an article about getting your load-balanced WordPress site up and running with the Rackspace Cloud, an article that was picked up on the Rackspace blog. The focus of that article was getting everything running correctly rather than securing the data, mainly because securing it was a massive pain in the neck with first-generation cloud servers. But since the launch of the Next Generation cloud servers and Rackspace’s Cloud Networks, it has become amazingly simple to isolate your vulnerable traffic from prying eyes. I’ve been using the Rackspace Cloud Networks service since it was in beta testing, and given my experience I thought it would be a good idea to revisit this topic and add some pointers on how to quickly and efficiently secure your inter-server data in the Rackspace cloud.
I’ve been playing around with the Rackspace Cloud hosting offerings, and as of right now I’ve got this very blog running load balanced on a set of servers. And while it’s a little more complicated than just setting up a single server, it really isn’t that hard. In fact, I’ll step you through the process.
The Windows Home Server I’m running suddenly started acting funny a few days ago. I started noticing it when certain video files would refuse to play. VLC spat back the same errors as if the files never existed.
So I RDC’d into the server and tried playing the files locally from the storage drives themselves, and it worked. It seemed like, for some reason, the server had re-filed the files in a different place, but forgot to update their location.
WHS uses “reparse points” to link the files on the various hard drives to one contiguous “virtual drive” that lists all the data. When a file is opened through the network, the server checks the reparse point and follows it to the location of the actual file. But when those points are corrupted, the files “disappear”.
Thanks to the WHS forums, I was pointed to the whsCleanup tool, which finds the bad reparse points, logs them, and deletes them when you run a generated cmd file. The only problem was that this solution permanently deletes the affected files. Since the files themselves weren’t bad, just the reparse points, I took the following steps to get everything back online:
- Ran whsCleanup
- Viewed the cmd file as text and made a note of every affected file
- Went into the C:\fs\ folder and manually copied (not moved, COPIED) each file slated for deletion into a new temp directory in D:\shares (the virtual drive)
- Ran the cmd file to delete the affected reparse points, ignoring the “error: drive not attached” errors
- Moved (not copied) the files from the temp directory back into their original folders, staying within D:\shares (the virtual drive) for this leg of the process so WHS creates fresh reparse points
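For the curious, the copy-then-restore dance above can be sketched in Python. This is just an illustration of the salvage logic, not part of the actual fix: the function names, paths, and the affected-file list are hypothetical stand-ins (in practice the list comes from the cmd file whsCleanup generates).

```python
import shutil
from pathlib import Path

def salvage(affected, temp_dir):
    """Step 3: COPY (never move) each affected file out to a temp
    directory, so the originals survive when the cmd file runs."""
    temp_dir = Path(temp_dir)
    temp_dir.mkdir(parents=True, exist_ok=True)
    saved = []
    for src in map(Path, affected):
        dest = temp_dir / src.name
        shutil.copy2(str(src), str(dest))  # copy, NOT move
        saved.append(dest)
    return saved

def restore(saved, share_root):
    """Step 5: after the bad reparse points are deleted, MOVE the
    salvaged copies back through the virtual drive so fresh reparse
    points get created for them."""
    share_root = Path(share_root)
    restored = []
    for f in saved:
        dest = share_root / f.name
        shutil.move(str(f), str(dest))  # move this time
        restored.append(dest)
    return restored
```

The copy-first, move-back-last ordering is the whole point: as long as a second copy of each file exists somewhere, deleting the corrupt reparse points costs you nothing but time.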
That took from Thursday morning until last night to complete with about 3 TB of data. Thank God for SATA.
Windows Home Server forum thread
Yesterday was my last day at the Taxi and Limousine Commission. The majority of the day was filled with minor bug fixes and explaining the applications I created to the people that will be taking care of them from now on.
For lunch, I was invited to a Japanese restaurant called “Megu” with the people I was working for all summer. The restaurant itself seemed like it was pulled straight out of Deus Ex, without any graphics upgrades. The food was pretty good; I had a burger made with Kobe beef and some mystery sauce that tasted like the burgers Cat’s dad usually makes (marinated in butter before being grilled). After lunch, the person who had been supervising me all summer presented me with a very nice “thank you” card, a TLC cap, and a shirt with a New York City taxi medallion on it. It was an unexpected but greatly appreciated gesture that started to make me realize how much my work was valued this summer.
The final realization came around 4 PM, when I was called into the Commissioner’s office to demo the frontend I had coded for the Call Center. I showed him how it was geared towards answering the customer’s questions as quickly as possible, and how easy the interface was to use, and he became very interested and excited about it. It was one of the greatest feelings of accomplishment to see the commissioner of the TLC playing with a system that I made, and describing all the uses it has that I never thought of.
All things considered, this has been one of the best summers of my life. I got to work with a group of intelligent and interesting people, learned about an industry that I never would have thought about otherwise, and left them with a tool that will hopefully continue to be useful and benefit the TLC, a tool that forced me to learn a whole new language and pushed the limits of my programming ability. And that is something I definitely wouldn’t have gotten out of Bear Stearns.
For the last month or so, I’ve been working on an ASP program that displays data from an SQL database, a job that until now was done in a command-line interface that looked far too confusing to be functional. However, there were a few barriers to completing this assignment:
- I’ve never coded ASP before.
- I’ve never connected to a Microsoft SQL server through a server side app before.
- I had no idea what the structure of the database was.
So essentially, 50% of my time has been spent learning ASP, 25% figuring out where stuff is in the database, and 23% actually coding the system. A system which went into beta testing this afternoon.
After writing and debugging close to 3,000 lines of code in a language I’d never seen before, watching the intended users interact with the system gave me one of the best feelings of accomplishment I’ve felt. And what’s even better: except for a few minor feature requests, there wasn’t a single complaint. It seemed like everyone naturally figured out how to work the system, and it displayed exactly what they needed. All that’s left is some tweaking next week and writing 2 more pages, and then I can hand it off to the local IT department to maintain as I trudge back to Penn State.
5 more days of work, and then I’m free for the rest of the summer (or at least what’s left of it).