Monday, July 30, 2012

The HiSSS of Infrastructure - Part 4

We've arrived at the end of our acronym-ical journey, and what better way to finish than with everyone's favorite topic... security. Security is the often overlooked, and even more often derided, facet of information technology that everyone loves to hate. Security means rules, and rules mean that we don't get to do everything we want, the way we want to. Security is the fun-killer.

Even though most IT professionals have to deal with security in some fashion, infrastructure has a unique role to play in securing systems. In fact, security needs to be right up there with the three other big paradigms of our philosophy of infrastructure. It needs to be there for one very important reason: in infrastructure we have the ability to make a huge impact on the security of a system, often for very little effort. By the same token, if we don't take security seriously in infrastructure, we also create the biggest opportunity for a huge impact in the negative direction. More than in any other part of IT, a little effort can go a long way toward making everyone's lives easier.

This big-impact-little-effort idea is due to the fact that infrastructure is the foundation of so much of what IT does. From networking to server administration, security at the level of infrastructure can make all the difference. For example, in the world of networking, securing a router so that it keeps the wrong people out of a network doesn't just affect the router. It affects every single server and every single router that is downstream from it. If a bad guy is able to penetrate a single router and gain access to an internal network, every single device that touches that router is vulnerable. By the same token, a farm of servers is only as secure as its weakest link. If one server in a group is compromised, it often serves as a gateway to getting at more and more servers in an enterprise. So the concept of big-impact-little-effort is key to how we view security in the infrastructure, and the concept cuts both ways. A vulnerable device in our enterprise often means a big impact for the bad guys, for very little effort.

However, despite getting a big impact for some of our efforts, we often don't have enough resources to secure everything 100%. So our second concept is the idea of data valuation. Since we often need to choose where to spend our resources when it comes to security, it's important to know what is most important to secure. This begins with a valuation of data, which simply means putting a price tag on every field of data in your database. There are a lot of resources out on the internet to help do this, and they will often talk about how much a single Social Security number will fetch on a black market. If you add up all your SSNs and other 'expensive' data, you start to get an idea of how much it would cost you to lose it. If it's valuable to a bad guy, it needs to be valuable to you. The last thing that any enterprise wants to face is a lawsuit for tons of cash because someone grabbed a bunch of SSNs and birth dates from your Oracle server that still had 'scott/tiger' sitting there from your initial install.
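
To make the valuation exercise concrete, here's a minimal sketch in Python, assuming entirely made-up record counts and per-field black-market prices; the real numbers would come from your own data inventory and current research.

```python
# Hypothetical data valuation: multiply record counts by an assumed
# black-market price per field, then total it up. All numbers here are
# illustrative placeholders, not real market rates.

RECORD_COUNTS = {
    "ssn": 250_000,
    "credit_card": 40_000,
    "birth_date": 250_000,
    "email_password": 400_000,
}

# Assumed dollar value to an attacker, per record.
PRICE_PER_RECORD = {
    "ssn": 5.00,
    "credit_card": 10.00,
    "birth_date": 0.50,
    "email_password": 1.00,
}

def total_exposure(counts, prices):
    """Rough dollar value of the data if it were stolen."""
    return sum(counts[field] * prices.get(field, 0.0) for field in counts)

if __name__ == "__main__":
    for field, count in RECORD_COUNTS.items():
        value = count * PRICE_PER_RECORD[field]
        print(f"{field:16s} ${value:,.2f}")
    print(f"{'TOTAL':16s} ${total_exposure(RECORD_COUNTS, PRICE_PER_RECORD):,.2f}")
```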

But as the theologian would say (since I'm one of them), "money isn't everything!" This is quite true, and it's our third concept in security. Not only do you need to know how much your data is worth, your reputation should always be considered priceless. How important is it to you to keep your organization off the front page of a news site? If the headline reads "Corporation X leaks 400,000 usernames and passwords!!" then I'm betting you want to do something to protect yourself. Even inexpensive data like usernames and passwords (we all have unique strong passwords for the websites we visit, right?) can be a major embarrassment for an organization. So if you're not convinced to take security seriously because it a) can deliver a big bang for little buck, and b) can cost real dollars by leaking sensitive information that has real value on a black market, then do it for c) the priceless reputation of your organization.

If you notice, I haven't spent a lot of time talking about security techniques, and that's because I'm not an expert. I'm not ignorant in the area, but there's a lot of information out there that can help in your particular situation, and I don't want to ruin MY reputation by giving you bad advice in a random blog rambling. What I hope I've done, however, is to emphasize three key concepts about WHY you need to secure your systems, and not just give security the quick once-over, hoping that nothing bad will ever happen.

I hope you've enjoyed this series on my philosophy of infrastructure management, and I hope you stick around the blog for other silly liberal arts technology stuff that I might find worth rambling about.

Thursday, July 26, 2012

Auntie — the sky is falling!!

That's an 'em dash' in the title there if you were wondering about the joke...

So this past week Apple announced its earnings report, and the worst thing in the world happened. They missed the predictions. Apple only made $8 billion in profit. They were supposed to make $10 billion.

I'll pause a moment to let that sink in.... one more moment... ok... So what does it mean when a company like Apple only makes an outrageous sum of money instead of an OMG FREAKING OUTRAGEOUS sum of money? It means that the pundits sit down at their iPads, at their kitchen tables, with their bluetooth wireless keyboards, and pound out paragraph after paragraph of snarky prose about how it's the beginning of the end for Apple, and that we all told you it would never survive after Steve Jobs passed on.

I'll pause another moment to let the irony of that last run-on sentence sink in.... ready yet?.... ok.... So as much of an Apple fanboy as I am, let me begin by saying that, yes, Apple is not the same company it was when Steve Jobs was running it. Is it as good of a company? Probably not. Steve Jobs was an amazing individual who could motivate people to pull purple bunnies out of their butt and convince us that we were living in a post-rabbit era. Very few people alive can do what Steve Jobs did, and that's simply a fact that the world needs to accept.

But does that fact mean that Apple is doomed? Let's take a look at a couple other facts.... actually... I'm still kinda stuck on that first fact... APPLE MADE $8 BILLION!! I know people who are unemployed, many people I know are digging themselves out of credit card debt, my house is worth 50% of what it was valued at in 2007.... and Apple didn't make $10 billion... only $8 billion.

So let's put away the tranquilizers, and step back from the ledges. There are a lot of other companies out there that are in much, much worse trouble. Blackberry and Netflix are two that come to mind rather quickly. At some point we all have to come to terms with the fact that most people who want an iPhone 4, probably already have one. The market saturation of smartphones in general is incredible, and at some point it means that sales are going to stagnate a bit. But we don't need to worry for too long. The magical iPhone 5 will be in our hands in a couple months, along with a new post-PC 7 inch tablet, if you believe the rumor mills. Such is the endless cycle of technological innovation, a cycle that in less than 10 years brought us from clunky laptops, phones that couldn't browse the internet, and tablets that involved ink and paper... to an era of ultra-thin laptops and the extinction of removable media; smartphones that will not only browse the internet, but will tell you how many cups of rice you need to cook for your family dinner; and tablets that are so magical and life changing, that bloggers like myself will use them to deliver a simple message about a missed earnings report: "move along... nothing to see here."

Wednesday, July 25, 2012

Seven inch what?

I've been a huge fan of the iPad ever since it was released, despite waiting until the iPad 2 before getting one. I remember one of the first competitors to the iPad was a 7-inch tablet by Samsung (IIRC). At the time I bristled at the idea of a tablet smaller than 10 inches. I just didn't 'get' why you'd want something that I was viewing as a laptop replacement in such a small size. But then a few weeks ago, something made me start to think differently.

First off, Google announced their Nexus 7 tablet. A seven inch device that starts at $200. That's a great price for what you get inside. But... it was still seven inches, so what would I use a device like that for? The answer came from a strange enough place. My ex-wife purchased a portable DVD player for the boys to use in the car on trips, and in their beds at home instead of falling asleep to movies on the living room couches. I looked at the devices and it hit me. Seven inches is a great size for a portable media device!

I've pretty much given up on purchasing physical media. I don't care to have 'things' sitting around my house, so services like iTunes (and Google Play for the Android inclined) are great for getting access to media. However, you need ways to watch your media, and although my 10-inch iPad is awesome, it's not ideal for lounging in bed. It's just a slight bit too big to hold for a long time while dozing off. Sure, you can use a 4-inch phone, and it works OK, but it doesn't feel like a sweet spot. But 7 inches? That could work.

There are a ton of rumors going around the internet right now about a possible Apple seven inch iPad. I hope that if Apple decides to go that route, they push the portable media device aspect of it. I think that could be a winning direction for that device, similar to how Amazon and Barnes & Noble have come out with color tablets that are media devices for their ecosystems. For now, I might give a Nexus 7 a try to get some hands-on experience with the form factor and see if it's truly as good as I think it will be.

Monday, July 23, 2012

The HiSSS of Infrastructure - Part 3

We've arrived at the second 'S' in the HiSSS infrastructure philosophy, and that S is for Scalability, which interestingly isn't even a real word according to my spell check. However, mangling the English language is pretty second nature for people in the information technology field, so we can all be forgiven for yet another faux pas.

Scalability, simply put, is the ability of a system to grow as its needs increase. Although this sounds like a simple concept, it is actually incredibly hard to achieve. When a software developer, or a systems engineer, sits down to design an application or build a host, they're usually most concerned with how they're going to accomplish their immediate needs. The idea of how they can grow their system to infinity is often something that gets considered later in a design cycle. Some shops are much better than others at considering scalability, but oftentimes the answer is tossed back in infrastructure's lap as "deploy more hardware."

Despite the snark in that comment, deploying more hardware is often a crucial part of increasing the growth potential of an application. The difficulty comes in determining the best, and most customer-friendly, way to implement added capacity. Simply adding another application server to a product sometimes doesn't have the full effect that was hoped for, and unless planning is done, the additional hardware can end up being under-utilized. So the first concept in scalability is being able to do smart-management of a system's capacity.

Infrastructure needs to be concerned not just with how many boxes are running, but with how work is being divided between the resources. Having a cluster of 10 machines does little good if you don't have a traffic cop that spreads the work around to more than 4 or 5 of them. Smart-management takes a best-practices approach to analyzing the best method for spreading work around. Sometimes that's a queue method, where a system stores up work to do, and the nodes ask for work as they're ready. Another common method is a simple round-robin, where work is handed out in sequential order to each system as it comes in. These are just two examples of the many methods that can be deployed for smart-management.
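
Here's a rough sketch of those two dispatch styles, just to make the contrast concrete; the worker names and tasks are hypothetical.

```python
# Two simple work-distribution strategies: a shared queue that workers
# pull from when they're ready, and a round-robin dispatcher that hands
# tasks out in sequential order. Names and tasks are illustrative only.

import itertools
import queue
import threading

def queue_based(tasks, worker_names):
    """Workers pull from a shared queue as they become ready."""
    work = queue.Queue()
    for t in tasks:
        work.put(t)

    def worker(name):
        while True:
            try:
                task = work.get_nowait()
            except queue.Empty:
                return  # no more work to pull
            print(f"[queue] {name} handled {task}")
            work.task_done()

    threads = [threading.Thread(target=worker, args=(n,)) for n in worker_names]
    for th in threads:
        th.start()
    for th in threads:
        th.join()

def round_robin(tasks, worker_names):
    """Hand each incoming task to the next worker in sequence."""
    rotation = itertools.cycle(worker_names)
    for task in tasks:
        print(f"[round-robin] {next(rotation)} handled {task}")

if __name__ == "__main__":
    tasks = [f"request-{i}" for i in range(10)]
    workers = ["node-1", "node-2", "node-3"]
    round_robin(tasks, workers)
    queue_based(tasks, workers)
```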

This type of management is also often helped by utilizing open, documented, and proven standards. Very often others have already solved the problems that you're looking to solve, or there are standards already in place that allow you to plug your needs into an established paradigm. So it's important that infrastructure builds on the work of others and embraces standards, especially open ones that usually have an entire community built around them.

Of course, all of this work is pointless if you don't have good metrics. System interrogation is a crucial component of scalability. Beyond just simple monitoring of a system, interrogation allows you to see the 'how' and 'why' of your system's operations. For many applications and systems this involves a tool like Splunk that indexes log files and other recorded data, and then gives you the ability to search, graph and analyze that data in many different ways. Being armed with data about how a system is performing, not just in a moment-by-moment monitoring and alerting fashion, but over the course of a period of time, is key to the idea of capacity management and scalability. After all, you don't know where to add more space until you know where to add more space ;-).
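
You don't need a full Splunk deployment to see the value of that longer view. Here's a minimal sketch that rolls a hypothetical access log up into requests per minute; the log format and file path are assumptions, not any particular product's output.

```python
# Roll a simple access log up into requests-per-minute counts, the kind
# of longer-view data that capacity planning relies on. The log format
# ("2012-07-23 12:01:45 GET /some/path 200") and file path are assumed.

from collections import Counter

LOG_PATH = "access.log"  # hypothetical log file

def requests_per_minute(path):
    per_minute = Counter()
    with open(path) as log:
        for line in log:
            parts = line.split()
            if len(parts) < 2:
                continue  # skip malformed lines
            date, timestamp = parts[0], parts[1]
            per_minute[f"{date} {timestamp[:5]}"] += 1  # truncate to HH:MM
    return per_minute

if __name__ == "__main__":
    for minute, count in sorted(requests_per_minute(LOG_PATH).items()):
        print(minute, count)
```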

Scalability is one of those infrastructure concepts that is sometimes hard to implement well, and it's not very "flashy", even in the decidedly un-flashy field of infrastructure. But being able to grow an infrastructure to meet business needs means more customers, and depending on the business, more money and revenue for the company. Even for those organizations that are not profit driven, being able to handle interacting with more of your users is never a bad thing.

Friday, July 20, 2012

Best of... "To iPhone or not to iPhone"

I've decided to bring over a couple of my posts from my personal blog that talk about technology. This one quoted below is from when I was considering jumping from Android to iPhone. A few weeks after this post I did jump ship, and was happy I did, but I liked this post because it talked about my thinking at the time and why I considered it a big bonus to get inside the Apple ecosystem.


To iPhone or not to iPhone….
Apologies up front if this post is a bit more geeky than normal for me, but I’m going to dive into the realm of cool smartphones for a bit and ramble about things that most people couldn’t care less about.
A year ago I decided to make the plunge and join the smartphone revolution. Since I’m a Verizon customer my choices were based on an Android phone, Blackberry, Windows Mobile or Palm WebOS. The iPhone was not an option at the time. Well, after a bunch of research I decided to go with the Palm Pre+ with WebOS. I loved how it was an entirely different view on how to create a mobile smartphone device. The gestures that you use to interact with it were different and intuitive, and the way that it handled things like multi-tasking were amazingly solid. All in all it was a great little phone (especially for a discounted price). However, as the months went by the major downfall of Palm started to show itself. The hardware wasn’t nearly powerful enough to handle what the operating system was trying to do. Add to this, the fact that Palm, as a company, was in the position of being sold and app developers were not jumping on the platform. So you ended up with a great operating system, with iffy hardware, and dwindling application development. By October I decided I was done. 
In the technology world, things often come down to a duality. Mac vs. Windows; Firefox vs. Internet Explorer; Word vs. WordPerfect, on and on. It’s very hard to maintain consumer interest, and developer interest, in more than two platforms at a time. So as cool as WebOS was, it just seemed like it was too little too late. The battle seems to be down to iPhone vs. Android. So in October I decided to jump ship and join the Android world, and got an HTC Incredible (also at an incredible price). From the moment I fired up the phone the difference was amazing. Everything was fast and responsive. The interface, though not as elegant as WebOS, was decent and allowed me to do most of what I wanted to do. Additionally, since this is an Android phone, the number of apps available was much, much larger than WebOS, and the platform’s growth potential has many more developers hopping on board. The hardware is more solid, and things like the GPS system just work. Overall, this has been a great smartphone for me.
However, now we come to February, and the big announcement that Apple’s iPhone is now available for Verizon. I’ve been asked many times over the past few weeks if I’m going to pick one up. To be honest, I’m not 100% sure yet. Unlike my experience with the Palm phone, I’m not lacking much on my current Droid Incredible. It does most of what I want it to do, and does it quickly and efficiently. However, there’s one thing that it can’t do, and never will be able to do. That’s the ability to integrate into the rest of my digital media world seamlessly and easily. My home computer is a Mac, my work computer is a Mac, I have an Apple TV in my living room, and I’ve owned a myriad of Apple ‘i’ devices over the many years. I’ve also purchased a great deal of entertainment in the Apple world, from the iTunes store. None of this “just works” with an Android phone. Sure there are tricks and utilities that can kinda get you close with some work… but it’s not quick and easy.
There also are a few more apps that interest me on the iPhone side, that simply aren’t available on the Android side. However, that’s a bit of a smaller issue for me. The bigger question is that integration piece. I know a few people who get all up in arms about how Apple is a closed eco-system, but that closed nature has allowed them to provide some wonderful integration between environments. I acknowledge that Apple is a walled garden. But if the garden is where you want to live… then it really isn’t that bad of a deal.
So I guess I’ll be thinking about it for a few weeks, not going to jump on anything that quickly. But the idea of a phone that can actually use a bunch of the media that I own is a pretty compelling idea. 
Oh, and as a postscript, I did just see the new WebOS devices that HP (which bought Palm) released yesterday. There’s some serious potential there. Again though… might be too late to get in the game.

Wednesday, July 18, 2012

Under the Surface of Microsoft

One of the big tech announcements recently, one that caught the world by surprise, was the new Microsoft Surface tablet. Although many people expected some sort of tablet announcement, I don't think anyone thought that Microsoft would pull out a full-on iPad competitor, complete with massive innovations in design and functionality. My first impression of Surface is that it's a really great piece of technology, and things like the built-in kickstand and the smart-cover-like touch keyboard are really inventive. Since I'm writing this on an iPad with a wireless keyboard, I know there are plenty of times when the marriage of an old-school physical keyboard input method with modern touch screen interfaces results in something even better :)

The thing I wanted to comment on, though, wasn't the introduction of new hardware, because I think that story is still evolving, and Microsoft's involvement with its OEMs could be quite the fireworks show. What I want to ramble on about is Windows 8. In my opinion, this is either the beginning of the end for Microsoft, or the start of a resurgence. Lots of commentaries recently have pointed out that in the past couple of years, Apple has started to seriously catch up to Microsoft, more than at any point in the two companies' histories. Although I doubt Apple will ever completely overtake Windows on the desktop, the continued surge of Mac OS X should trouble Microsoft, and in fact I think they're finally beginning to take it seriously. Microsoft needs a hail-mary to shake things up a bit, and that desperation play is Windows 8.

At first glance, Windows 8 (and Windows Phone 7/8) looks nothing like traditional Windows. In fact it feels much more akin to the Xbox 360. Considering that Microsoft started from nothing and is now massively entrenched in the gaming console market, this is probably a good strategy. Gone from the face of Windows 8 is the ubiquitous Start Menu in the lower left corner. Instead we have a completely different view of the operating system. A series of live tiles presents various aspects of our internet-connected life with small thumbnails of recent updates. Most of the main applications are designed around the Metro theme, with large squares of information, smooth scrolling between panes of information, and a full-screen view by default. This is so unique for Windows that I don't think most people have gotten over the shock of the paradigm shift.

In fact, Microsoft is one-upping what Apple started a couple years ago: the marriage of the mobile iOS with Mac OS X. In the latest Mac OS X we got Launchpad, which presents us with an iOS-style listing of our applications. However, it's sort of half-baked at this point, and feels more like a work in progress than a full-on paradigm shift. In Windows 8, however, the tile-based UI is the default, and primary, method of working with the system. I commend Microsoft for making this bold move. I think in the end it will benefit them, but it comes with risks.

First, they're forcing a lot of people to re-learn how they interact with Windows. That's a risk, but considering that they're thinking a couple of steps ahead, I think it's going to pay off. When we really get down to it, most of us use 5-6 apps on a computer most of the time anyway. Web browsing, e-mail, word processing, spreadsheets, music and movie management, and photo management come to mind as a solid half-dozen that most people use. Throw in a few games from time to time, and that's most people's needs met. This is why, for many people, even a device like an iPad can suffice. Since they're integrating their mobile user experience, and game console experience, into their computer operating system, I predict that most people will make the transition without too many problems.

Second, and perhaps the biggest risk, is that they need to go "all in". If they truly want to make this the future, they need to commit, fully and with lots of cash. Microsoft has the resources to do just that, but they need to not get cold feet and back out at the last minute. It also means that they need to be willing to shatter other, more entrenched paradigms, such as the way that Office functions. One of the biggest disappointments when looking at Windows 8 on the Surface tablet is that the Office apps are just the same old apps. Apple decided to embrace a new paradigm and created iWork for iOS, which takes on a completely different user experience that better suits a tablet environment. Yes, it means that people need to re-learn the app, and for the time being it means some features get cut until the UI and UX paradigms catch up, but it's the right move to make. I'd love to see an Office for Windows 8 that looks nothing like Office for Windows 7. I think that would be a huge win for Microsoft.

So despite a couple of risks, I think Microsoft has a chance. Despite being an Apple geek, I'm actually rooting for them in the smartphone market in particular. I love to see companies take things in a totally new direction, instead of just mimicking what's gone on before. WebOS tackled a new paradigm, and to this day I still love some of the features of that smartphone OS, and am sad that it didn't survive. Android is really only now starting to get its innovation groove moving beyond small tweaks to already established iOS features. So that still leaves room for a number 3 that wants to be taken seriously, and can present a compelling case for something TRULY different. Windows 8 is certainly different from anything out there, so here's hoping Microsoft can make it happen.



Monday, July 16, 2012

The HiSSS of Infrastructure - Part 2

In the first installment of this series, I outlined my Infrastructure Management methodology, called HiSSS. In that first posting we talked about the concept of High Availability. In this segment I'm going to tackle the notion of Stability.

Stability is pretty self-explanatory: simply, you don't want a system that is repeatedly tipping over. Just like we might say that an athlete has a stable stance when they're competing, we want our infrastructure to be strong and stable, so that it doesn't fall over and leave the customer with a bad taste in their mouth. One of the first ways to do this is to control the rate of change in our systems.

Controlling when and how things happen in our systems is often called 'change management'. We want to manage any changes that are occurring, and mitigate any risks that might impact the stability of our systems. Oftentimes this change management process relates to software being developed, but it's just as important for infrastructure management. In particular, you want to have very strict guidelines as to when changes will be applied to systems, or when new software will be rolled out. You want good processes in place for handling ordinary operational business, and procedures for how non-standard changes get prioritized and implemented.

Perhaps an example would be good here, and one that also relates to software development. In order to maintain stable systems, you want to control the number of times you need to deploy an application, because every deployment is a window for risk to creep in. But, at the same time, infrastructure needs to embrace the model of Continuous Deployment, because in many shops it's the future of software development. So how do you manage two seemingly opposite demands in this situation? With good change management controls. Instead of fighting against frequent software releases, infrastructure should support a strict change model that allows software to be deployed frequently, but with the least amount of disruption. Clear procedures and timelines can make the relationship between development and infrastructure very smooth. If everyone knows that there is a monthly (or weekly) release, and everyone knows exactly how the procedure will work on release day, including QA sign-offs and everything else related to a release, then it becomes so ingrained in the workflow that velocity can be increased or decreased with minimal fuss.

But all the controls need to be in place for this to work right, which leads to the second major factor in stability: the notion of automation. In order for software deployments (to continue our example) to work right, you need to make the process as automated as possible. Automation allows for repeatability, and repeatability usually means that you've achieved a level of stability. Even when tracking down bugs, being able to repeat a bug means it's a stable bug, and much easier to locate and fix.

Every process that you want to repeat on a regular basis should revolve around pressing one button, running one command, or allowing a timed, automated job to run. The amount of human intervention in any process should be kept to a bare minimum. As many contingencies as possible should be accounted for and mitigated preventatively, and then everything should be set to run with as light a touch as possible. Automation is often one of those things that companies want to achieve, but it often requires going back, after the fact, to add it in. Many times, procedures and workflows develop over time, and it takes a lot of forethought to get everything automated right from the get-go. Since that often doesn't happen, it gets easy to ignore it in the future. But that's the wrong choice to make. For a system to be stable, automation needs to play a key role, a crucial role, in infrastructure management.
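
As a sketch of the 'one command' idea, here's a hypothetical deploy script that chains the usual manual steps together and stops on the first failure; the service name, paths, and health-check URL are placeholders, not a real environment.

```python
# A hypothetical one-command deploy: each manual step becomes a function
# call, and the script stops at the first failure. Commands, paths, and
# the health-check URL are placeholders, not a real environment.

import subprocess
import sys
import urllib.request

APP_ARCHIVE = "/tmp/myapp.tar.gz"             # assumed build artifact
DEPLOY_DIR = "/opt/myapp"                     # assumed install location
HEALTH_URL = "http://localhost:8080/health"   # assumed health endpoint

def run(cmd):
    """Run a shell command, echoing it first; raise if it fails."""
    print(f"==> {' '.join(cmd)}")
    subprocess.run(cmd, check=True)

def deploy():
    run(["systemctl", "stop", "myapp"])
    run(["tar", "-xzf", APP_ARCHIVE, "-C", DEPLOY_DIR])
    run(["systemctl", "start", "myapp"])
    # Verify the service actually came back before declaring success.
    with urllib.request.urlopen(HEALTH_URL, timeout=10) as resp:
        if resp.status != 200:
            raise RuntimeError(f"health check failed: {resp.status}")
    print("deploy complete")

if __name__ == "__main__":
    try:
        deploy()
    except Exception as exc:
        print(f"deploy failed: {exc}")
        sys.exit(1)
```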

But how do you know if your system is stable? That's the final aspect I want to present regarding stability: the notion of monitoring. It doesn't do much good to have a large system, with lots of moving parts, if you have no idea how those parts are functioning. Good system monitoring is key to maintaining a stable system. For basic system stability I like to refer to what I call 'short-term monitoring' (my monitoring philosophy will be a whole different blog series hehe). Short-term monitoring is the view into the current system state, and involves quick alerting of issues that need to be addressed immediately. Good short-term monitoring will often let operational staff discover problems before the customer does. Proactive fixes are ALWAYS positive. So having a good view into a system is crucial. If you don't know what's going on, how can you fix it?
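
A bare-bones version of that short-term view might look something like this sketch; the endpoints and the alert mechanism are stand-ins for whatever monitoring stack you actually run.

```python
# A tiny short-term monitor: poll a few hypothetical endpoints and
# complain immediately when one stops answering. In a real shop the
# alert would go to a pager or chat system, not just stdout.

import time
import urllib.request

CHECKS = {
    "web frontend": "http://localhost:8080/health",  # assumed endpoints
    "api backend": "http://localhost:9090/health",
}
INTERVAL_SECONDS = 30

def check(url):
    """Return True if the endpoint answers with HTTP 200."""
    try:
        with urllib.request.urlopen(url, timeout=5) as resp:
            return resp.status == 200
    except Exception:
        return False

def alert(name):
    # Placeholder: swap in email, SMS, or a paging service here.
    print(f"ALERT: {name} is not responding")

if __name__ == "__main__":
    while True:
        for name, url in CHECKS.items():
            if check(url):
                print(f"ok: {name}")
            else:
                alert(name)
        time.sleep(INTERVAL_SECONDS)
```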

The more a system evolves into a completely stable system, the better it can adapt and meet the future needs of the customer. A stable system and a highly available one are two important keys in infrastructure management, and they go a long way toward achieving that "hum" of a well-tuned engine that every infrastructure manager loves to hear.

Saturday, July 14, 2012

Old faithful moves on...

About 7 years ago I sank a LOT of money into a personal LaserJet printer. The HP LaserJet 1320 was a great little personal printer, despite costing a fair amount of money for a printer (around $500 MSRP at launch). For years that printer has cranked out page after page after page of beautiful black text. Originally we purchased it because our kids loved to color, and it was getting really expensive to print all their coloring sheets on an inkjet printer. So a laser printer was the perfect solution. We went through reams of paper printing outlines of Star Wars characters, dinosaurs, and other assorted topics that the kids were into from time to time.

The time has come, though, to retire old faithful. Despite being a solid printer, and most likely having some solid life left, it's reached a point where I need something with a few more features. In particular, I've found myself in need of a scanner on a number of occasions recently for various document signings, and with my last cheap scanner sitting in a junk heap, I was stuck using places like Kinko's. I didn't want to just pick up yet another cheap scanner that would take up more room and another port on my computer. So I started doing some research on newer multi-function printers, and had narrowed my choices down to a few solid contenders. Well, today I was wandering through Office Max, and there was one of the printers I had been looking at (HP M1212nf MFP), on sale for 10% off the regular price. So I decided to pick it up. It's amazing how much cheaper technology gets. I paid about 60% LESS than what I bought my 1320 for, and got tons more options.

In addition to full copy/scan functionality, it also is AirPrint compatible (and ePrint), meaning that my mobile devices can print directly to it now. The only caveat is that I didn't get a wireless model, so I had to put it next to my router in the living room so I could hardwire it in. Not a huge deal, just a little funny seeing the printer sitting next to my TV and Xbox. It all works though and is a great upgrade.

Thursday, July 12, 2012

Where am I, and why am I running?

As many of my friends know, I took up the sport of running back in 2010 when I needed to get in shape. As I entered middle age, my metabolism revolted against the eating habits of my younger years, and I had to shed some pounds and regain my health. So what does this have to do with tech? Well, as a gadget guy, with a distinct interest in all things that make life easier, I wanted to utilize some form of technology in my running.

For most runners, the go-to gadget is GPS. In the old days, runners had to plot their distance using courses that had been pre-measured, or through maps, or a variety of other methods. You would go for a run, and you may or may not have covered the distance you thought you did. Well, in the wonderful age of technology we can solve that problem. And solve it we will!

For the gadget-minded runner (or biker/walker/hiker for that matter), there are a few different options for determining where you are at any particular time. If you feel comfortable carrying your smartphone with you, then you need look no further than one of the many cool apps that are available for tapping into your phone's built-in GPS features. Many of these apps tie in to services on the web for mapping, storing and sharing your running data. My personal favorite is Endomondo, a great service for mapping your runs and keeping statistics, but it also includes a great social networking component where you can see all the running that your friends are doing and comment on and 'like' each other's workouts. You can even see when someone is running in real time and give them a text-to-speech 'pep talk' in the middle of their workout.

There are plenty of other sites out there similar to this, including MapMyRun and Runkeeper, but the overall point is that they all have smartphone apps where you can tap into your phone's GPS system to track yourself. But let's say that you're not comfortable carrying your phone with you as you're out sweating to the oldies. Another option is a GPS watch, such as those made by Garmin.

There are a plethora of options for these watches, including touch screens, waterproofing (for tracking swims), heart-rate monitors, options for how you want your laps broken up, and so on. GPS watches are great for doing basic run tracking, and for people like myself who often run over the lunch hour in the middle of a work day, a watch allows you to disconnect from the internet and just focus some time on being active. By using a watch I'm able to unplug during my runs and not worry about the texts, emails, etc. that are waiting for me back at my desk. Also, data from a GPS watch has to be uploaded after the fact, so no live run pep talks or tracking. Most sites accept GPS files from these watches just fine, so importing is quick and easy.

So what other points of comparison can we talk about besides ease of carrying? One that's worth mentioning is accuracy. The best route tracking tool is only as good as the GPS technology that it employs to do the tracking. In the case of smartphones vs. watches, the watches beat the phones for accuracy every time. With.... one notable exception. When you first start up a GPS device it needs to find the satellites floating around the planet and get a lock on your position. Often with a GPS watch this can take 30-60 seconds of standing still, and even longer in the middle of a downtown where you're surrounded by tall buildings blocking the sky. There was even a recent joke motivational poster of a woman staring at her watch that said "There's a fine line between waiting for a satellite lock, and staring at your wrist like an idiot." So despite being super accurate for most of your run, a GPS watch falls down a bit in the beginning as it's trying to get its lock on.

This is one area where the smartphone can shine. Since it's a fully functioning phone with all kinds of different radios in it, it can call upon secondary sources to pinpoint your location. By utilizing cell phone towers and wi-fi hotspots, a phone can get a signal lock within seconds, even in a crowded downtown area. This assisted GPS is really great if you want to get started right away and not wait for a signal from the clouds. However, this quick startup does little for the accuracy of the rest of the run, where oftentimes a cell phone's GPS will drift and slide, leaving you with a lack of precision. Granted, it's often a small variation, but over a long run it can add up. Plus, it sometimes makes it look like you're running through people's lawns, or through buildings, when in fact you're running next to them.
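
To see why small position errors add up, here's a sketch of how a tracker typically turns a trail of GPS points into mileage: sum the great-circle distance between consecutive points. A little jitter nudges each segment longer, and over thousands of points the total creeps up. The sample coordinates are made up.

```python
# Sum the great-circle (haversine) distance between consecutive GPS
# points, which is roughly how run trackers turn a track into mileage.
# The sample coordinates below are made up for illustration.

import math

EARTH_RADIUS_M = 6_371_000  # mean Earth radius in meters

def haversine(lat1, lon1, lat2, lon2):
    """Distance in meters between two lat/lon points."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2
    return 2 * EARTH_RADIUS_M * math.asin(math.sqrt(a))

def track_distance(points):
    """Total distance of a track given a list of (lat, lon) points."""
    return sum(haversine(*a, *b) for a, b in zip(points, points[1:]))

if __name__ == "__main__":
    # A short, made-up track near downtown Minneapolis.
    track = [
        (44.9778, -93.2650),
        (44.9785, -93.2644),
        (44.9791, -93.2639),
        (44.9797, -93.2633),
    ]
    print(f"{track_distance(track):.1f} meters")
```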

So if you're a runner, or biker, or walker, or hiker, or whatever, the next time you head out on the trail, consider giving one of these GPS tools a try, hooking up with a tracking site, and finding out where the heck you really were.

Wednesday, July 11, 2012

I love typing on my iPad

Ok, before you think I've gone crazy and suddenly believe I like smacking away at a non-responsive touch-screen, let me clarify that title... "I love typing on my bluetooth keyboard on my iPad." Like many people, I took the plunge and got a wireless keyboard for my iPad, because for any serious typing work, you really can't beat the smooth responsiveness of the Apple Wireless keyboard.

But, just to clarify things further, it's not the bluetooth keyboard that is the reason I love typing on my iPad. Let's correct that title one more time... "I love writing on my bluetooth keyboard on my iPad." There we go, that's better, and it gets to the heart of what I wanted to share in this post. I'm going to make a bold statement, which I'm sure tons of people will find issue with, but here it is. The iPad is a perfect writing tool. Ya, that's right. I just said that a small 10-inch device, which you need to purchase an additional keyboard for to type well on, is a perfect writing tool. I know... it doesn't even have a mouse!!

So why have I lost touch with reality? Frankly, it has little to do with all the great features that an iPad has, or its awesome iOS app store. It comes down to something more simple. In fact, that's the key right there... it's 'simple'. The reason I love writing on my iPad (or frankly any decent-sized tablet for that matter) is that I don't have a mouse. I also don't have an instant messaging client up all the time, or a Twitter feed just off to the corner of my vision, or notifications that stick around the dock long past their toaster pop-up, nagging me, begging me... "Check me!".... "I have something to say to you!" What I do have on the iPad is limited distraction. And that is why I love writing on it.

Too often in our ultra-connected world we fill our lives with as much information as possible. On my desktop computer, I have every possible way to connect to the internet available and running at my fingertips. Sure, I could shut everything down, turn off all notifications, and find a writing app that can go full-screen as a way to completely detach from all the insecure applications begging for my attention. But... I'm still sitting in my office, still surrounded by bills that need to get paid, still surrounded by papers that need to get sorted, and on and on.

Now, the iPad can certainly be filled with all kinds of things to distract me, but for the most part it's minimal, and can be managed pretty easily without radically changing everything I have running. I don't run an IM program by default, but I do get an occasional notification about other things. But on the iPad, the notification is a small rectangle at the top of the screen that goes away quickly. Since all apps on the iPad are full-screen apps, I have to physically move my hands from the keyboard and touch the home button to see anything else on the device. The other apps can't come and find me after their initial 'cry for attention'. For some that may be a 'failing', but for me, it's a blessing. And to top it all off, the iPad is insanely portable, making it easy to travel with and use anywhere I want.

In the past couple of days since I've started this blog, I've written more than I ever have in this short an amount of time. I have articles queued up over a week in advance, and I'm continuing to create more. The key for me was finding a place to write, using a tool that made it comfortable and pleasant to write. For me, a nice distraction-free iPad with a wireless keyboard at my kitchen table is just the spot.





Monday, July 9, 2012

The HiSSS of Infrastructure - Part 1

Over the course of my career, I've come to specialize more in a portion of Information Technology called infrastructure: namely, the underlying support systems that allow all of the cool internet-based services we know and love to flourish and operate without a second thought. These support systems consist not only of physical hardware, such as servers, switches, routers, storage arrays, and so on, but also of the support software that drives these physical systems. Often that includes things such as application servers, proxy servers, network device operating systems, and various shared applications such as e-mail, messaging, and workflow management. Although in the case of most shared software a team outside of infrastructure manages the application from a user perspective, infrastructure often takes the lead in managing upgrades, software patches, and physical implementation design.

The methods used to manage these types of systems are varied, and depend greatly on the situation as well as personal philosophy. As a liberal arts technologist, the 'philosophy' behind how you do something has great value to me, so I'm going to spend some blog posts outlining my philosophy of infrastructure management. In that same liberal arts vein, I've come up with an acronym for my philosophy, which I call the HiSSS of infrastructure:
  • Highly Available
  • Stable
  • Scalable
  • Secure
In this first installment, we're going to talk about High Availability. Simply put, in non-technical terms, a system is highly available if it is always available when it is expected to be. High availability doesn't just apply to large infrastructures, but to things in our everyday life. We expect our cars, alarm clocks, refrigerators, air conditioners, etc., to all be highly available. We want them to be running when we expect them to be running, without question. Just as we get upset when our air conditioner suddenly refuses to fire up at the start of summer, we get upset when our computer systems, such as Facebook, e-mail, or Google, suddenly disappear. We have a high expectation of when we want these systems available for our use, so lots of smart people spend a lot of time and money to make sure that these infrastructures are highly available.

So how are systems made highly available? One of the most common methods in infrastructure management is called redundancy. Very simply, you never have just one piece of hardware doing a single function. You always duplicate things, so that if one piece of hardware or software malfunctions, you can seamlessly switch over to another system. Unlike our houses, where we don't have multiple washing machines or multiple furnaces, most infrastructures are built on the basic premise that redundancy will be built into every single facet of the system. You never want to have a single point of failure if at all possible. Redundancy is such a basic fact of infrastructure management that it gets applied down to the level of multiple power supplies, multiple network interfaces, and so on, inside a single piece of server hardware.

Although having perfect redundancy is great, there are times when systems have to be brought down for various reasons. Hardware maintenance and software upgrades are examples of situations where a system might be removed from a highly available pool. Another aspect of infrastructure management and high availability goes beyond physical hardware, to developing a set of policies and procedures to ensure that when a system is taken out of service, it isn't noticed. Being 'invisible' is another key factor in high availability. A primary motivator in any infrastructure management plan is to never be seen unless you have to be.

At one employer, we utilized a system of multiple independent application servers to achieve invisibility. Since we had 3-4 machines serving the public at any one time, we could pull one out of service for a hardware or software upgrade, and then rotate it back into service when the work was completed, continuing the process for all the systems in the pool. This allowed us to do even large software upgrades with almost no disruption to the end users. That meant better service to the customers, and happier management.
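
In spirit, that rotation looked something like the sketch below: pull a node out, do the work, verify it, and put it back before moving on. The host names and commands are placeholders, not the actual environment we ran.

```python
# A hypothetical rolling upgrade over a small pool of app servers:
# drain one node, upgrade it, verify it, and return it to service
# before touching the next one. Hosts and commands are placeholders.

import subprocess

POOL = ["app-01", "app-02", "app-03", "app-04"]  # assumed host names

def run(host, command):
    """Run a command on a host over ssh; raise if it fails."""
    print(f"[{host}] {command}")
    subprocess.run(["ssh", host, command], check=True)

def rolling_upgrade(hosts):
    for host in hosts:
        run(host, "disable-in-load-balancer")  # placeholder drain step
        run(host, "apply-upgrade")             # placeholder upgrade step
        run(host, "health-check")              # placeholder verification
        run(host, "enable-in-load-balancer")   # back into rotation
        print(f"[{host}] upgraded and back in service")

if __name__ == "__main__":
    rolling_upgrade(POOL)
```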

A sister concept to invisibility is the notion of segmentation. One of the reasons that we were able to maintain such invisibility was because we could often pull out and replace just small portions of the systems at a time. Choosing to modularize many of our systems allowed for upgrades that were often small and very isolated to one single function of the system. This type of segmentation doesn't always come cheap, and takes a very strong architectural design to implement, both from an infrastructure perspective as well as an application development one. However, with good segmentation, most of a system can survive upgrades and maintenance without even noticing what's going on in other portions.

Being highly available, with its goals of redundancy, invisibility, and segmentation, means that concepts such as Continuous Deployment and other Agile development and business methodologies are able to happen much, much more easily. Many shops talk about wanting to move in these new directions, but many times you need to first establish a solid foundation before you can build the mansion. High availability is one pillar in that foundation.

Saturday, July 7, 2012

Great Googly Moogly

Recently the tech world buzzed with a series of announcements, one after the other, from Apple, Microsoft, and Google. Despite writing this blog entry on an iPad, I want to tackle my thoughts on Google first. I'm not seeking to do a product review here, this is more of just a rambling commentary :).

As many people know, I've completely "drunk the kool-aid" when it comes to Apple products, and most of my ecosystem revolves around 'i'-Somethings. However, I also utilize a lot of other services, and one that has a lot of my information (for better or for worse) is Google. As many a geek did, I jumped on Google very early on in its history, as a search engine, and continued to add their services to my list of technologies. My domain is a Google Apps domain, I really like Google+ for its layout and functionality, and Google Docs is quickly becoming my preferred method of cloud-based document creation/sharing/storage. Now I'm also utilizing Blogger for this blog. So to say that I'm JUST an Apple geek certainly isn't accurate. In fact, I've even been known to sport an Android phone from time to time.

Which brings me to the big announcements from Google I/O recently. The biggest in my mind was the continued evolution of the Android platform with Jelly Bean (4.1). Google announced the new version of the OS, along with a Platform Development Kit that has the lofty goal of getting manufacturers up to speed quicker with new OS changes. This PDK is a long time coming in my mind, although I'm not 100% certain it's going to have the full effect that Google wants.

One of the biggest, and most widely talked about, problems with the Google Android system is fragmentation. Because Google wanted to create an open platform for people to expand and build upon, they had to sacrifice something that Apple refused to compromise on... control. Say what you want about Apple, but because they control the entire ecosystem, and control it tightly, most things work seamlessly and efficiently. Google, on the other hand, by allowing other people to determine how Android is presented on their devices, has ended up with a large mess of varied OS versions in the wild, and no clear path as to how to bring everyone together. This has the added impact of causing developers to need to pick and choose carefully which versions they will support, and to deal with manufacturers' 'skins' on Android that may or may not cause an issue with their applications. It's a mess, and Google knows it. Which is why I think we're seeing a lot more guidance and support from Google on how to best show off its nifty system.

So despite my love of everything fruit-flavored, I did use an HTC Droid Incredible for a period of time. For the most part, I really loved the phone. One thing that Google's strategy allows for is much more diversity in physical form factor. You can get an Android phone in just about any shape you want, and that's cool. As much as my iPhone is great, I still loved the feel of the Incredible physically. It was a great extension of my hand, light-weight and perfect dimensions for me. I did take the plunge to iPhone though, after realizing how much of the rest of my media ecosystem was Apple, and how much simpler it would be to work with that on an iPhone device. But that's probably not my biggest complaint about the Incredible.

I stopped using the Incredible in 2010, and it was on Android version 2.2. There has since been an upgrade to 2.3 released, but the chances of that phone ever getting 4.x... zero. Compare that to Apple, where the same iPhone I bought in 2010 (iPhone 4, Verizon) is still going strong, still getting software updates, and will probably continue to get updates for the foreseeable future. Even the iPhone 3GS is still getting modern updates, all the way to iOS 6. That's the type of longevity that you can't get with the model Google has pursued in the past. Hopefully, their new PDK and massive improvements in the stock Android UI will start to change that course.

So that brings us to 4.1, Jelly Bean. With the 4.x series it feels like Android has really started to come into its own. Many of the quality-of-life aspects of iOS are arriving, but thankfully Google has added their own spin on many of them. The app switcher, with a preview of the app, is reminiscent of WebOS cards, and a great refinement. The notification center continues to improve, and has always been a step above any competitor. Plus, Google has really started to leverage more of their infrastructure of services with more maps integrations and Google Now. Overall, they're starting to address perhaps the biggest complaint I've had with Android since the beginning: just duplicating iOS.

From the start, I've always felt like Android was trying too hard to just be "the iOS alternative" and not innovating nearly as much as its competitors. It's one of the things that drew me to WebOS originally, since Palm had completely shattered a ton of paradigms with their implementation of a smartphone OS. Too often the Android phones would scream "Look at me, I'm just like an iPhone, but I'm not!" Say what you will about litigation, there's a good reason that Apple is pursuing Samsung. Their TouchWiz interface over Android 2.x was so iPhone-like that even I did a double-take when a friend handed me his Galaxy.

Hopefully, those days are quickly moving behind us, and we can start to get a real choice between smartphone OSes that is driven by real, solid feature differences, and not just by the idea that "it's not Apple, but it acts like it." So will I pick up an Android phone again? I'm a bit of a collector, so chances are I might try one out again sometime. If Verizon's 3G network here in the Twin Cities continues to suck, I might be lured to try a 4G phone sometime (despite the miserable battery life). For the meantime, though, I'm excited to see how Google continues to differentiate their OS and position it as an even better choice in the smartphone world.

 

Yet another new blog...

So I've decided to start a new blog, which I do from time to time. I have my main personal blog at http://boolah.dupadee.net/ which is where I can feel free to write about various topics of interest to me. However, I wanted to try something a bit different for any writing I do about technology. Often my tech posts feel out of place on my personal blog, and my personal posts feel out of place for the tech people who might read my tech posts. So I'm going to start up a new targeted site here and see how it goes. It might get abandoned as many blogs do, but who knows, something might stick.

I also wanted to take the opportunity to try the new Blogger. I haven't used their service in a looooong time, and from the looks of it, it's gotten a complete and total overhaul from the older versions I've used. So far I'm liking it, though I'm not sure it's quite as full-featured as good old faithful WordPress.

Now, about the name of this blog. Many people know that I appropriate the term Liberal Arts Technologist as a way to describe myself, and so that's the name of this blog. Instead of re-explaining the term, I'm going to paste a blog entry here from my personal blog where I wrote about Steve Jobs. In it, I talk about how Apple presents the notion of Liberal Arts and Technology, and why it feels appropriate to me. Enjoy!

Steve Jobs and Me
I joined millions of people yesterday in my sadness at losing Steve Jobs from this world. He was a one-of-a-kind innovator, and the world is a much different place because of his work. In fact, his work directly affected me and where I’m at in my life.
I’ve spent my entire adult life in a career in technology, and much of that is due to my very young exposure to computers. When I was 7 years old, my mom was working as a teacher’s aide at a school. During the summer her boss let her bring home an Apple II computer for me to use. It was my first exposure to computers, and it hooked me. Even though the first computer that I owned was a C64, Apple computers were everywhere in my educational development. It’s because of this early use of computers that, even though I went to school for a completely different field, I was still able to move into the technology field and make a successful career.
That brings me to the second piece of my connection with Apple/Jobs. In recent years, Jobs presented the concept that Apple wanted to be at the intersection of Technology and Liberal Arts, presenting this as an image of two street signs at a road intersection. When he first mentioned the idea it struck me because it’s exactly the type of technologist that I have built my career on being. In my interview for my latest job, I even started out my personal introduction by saying I’m a “liberal arts technologist.” I say this, because I have no formal Computer Science education, and have no desire to be a scientist in technology. My goal is to enable technology to enhance life, and solve problems. I’m a very practical and gadget oriented person when it comes to technology. 
This is because at heart I want to solve problems. Even though it sometimes has annoyed people close to me, I always want to fix problems, and come up with solutions. This is what I do in technology. I take my life and experience with history and theology, and many other disciplines, and try to find ways to get technology to help solve the problems I’m faced with. It’s why I love things like smartphones and tablets and multimedia devices. They help to solve problems, and when used properly, can add to our experiences of life. Even looking at innovations such as the Internet, and how I’ve developed multiple, deep, personal friendships from my connections there, is an example of how technology can shape our lives in a positive way. 
That’s the intersection of technology and liberal arts, and it’s where I live. Thank you Steve Jobs for all the work you did to make that possibility a reality.