Cloud Computing

Five Things Rackspace Can Do To Win Again

May 14, 2013 at 12:00 AM

I’ve been reading about slow growth for Rackspace cloud and how apps are pulling support. I shouldn’t be so surprised given that my own usage of the Rackspace cloud has also dwindled, despite the ORD datacenter being one of the most rock solid facilities I have ever used. I know that Rackspace has spent the last few years working hard and innovating, but somehow they seem to still be missing the boat. Here is a list of key things that made me go back to AWS and that Rackspace can implement to reverse this trend.

Read More...

Permanent Link — Posted in Cloud Computing, Technology Management

Arch Linux AMI for Amazon EC2

April 2, 2013 at 12:00 AM

Update August 21, 2016

I am no longer maintaining Arch Linux images for Amazon EC2, and I no longer recommend using Arch Linux on servers. The attitude in some of the core pieces of the system has become far less disciplined and, to put it politely, more centered on agenda than on users or actual system use.

Specifically, the issue that broke this for me is that versions of pacman since the file reorganization effort remove symlinks in a package’s install path under the root filesystem. This bug has come up several times in pacman’s history. The author and current Arch czar has stated that symlinks are improper and should be replaced with bind mounts. That approach breaks the best practice of keeping the OS separate from the data, and bind mounts cause disk metrics, analysis, and monitoring to misreport. In previous instances the bug was fixed; so far this time it is not being addressed.

Read More...

Permanent Link — Posted in Cloud Computing, Amazon Web Services, Arch Linux

Arch Linux Boot Script for Amazon EC2

January 17, 2013 at 12:00 AM

I have an updated Arch Linux image for Amazon EC2 that runs systemd. I created a boot script that sets the hostname and root SSH keys. It will even update DNS in Route53 and send you an email with the instance IP.
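The actual script is on GitHub; purely as an illustration of the flow it follows (hostname, root key, notification email), here is a minimal Python sketch built on the EC2 instance metadata service. The hostname and mail settings are placeholders, and the Route53 update is left out here since the boto example further down this page covers it.

```python
#!/usr/bin/env python
# Minimal first-boot sketch, not the published script: set the hostname,
# install the key pair's public key for root, and mail the instance IP.
# HOSTNAME and NOTIFY are placeholders.
import os
import smtplib
import subprocess
import urllib.request
from email.mime.text import MIMEText

HOSTNAME = "node1.example.com"   # placeholder
NOTIFY = "ops@example.com"       # placeholder

METADATA = "http://169.254.169.254/latest/meta-data/"

def metadata(path):
    """Read a value from the EC2 instance metadata service."""
    return urllib.request.urlopen(METADATA + path).read().decode().strip()

public_ip = metadata("public-ipv4")
ssh_key = metadata("public-keys/0/openssh-key")

# The image runs systemd, so hostnamectl handles the hostname.
subprocess.check_call(["hostnamectl", "set-hostname", HOSTNAME])

# Install the instance's key pair public key for root logins.
os.makedirs("/root/.ssh", mode=0o700, exist_ok=True)
with open("/root/.ssh/authorized_keys", "a") as fh:
    fh.write(ssh_key + "\n")

# Send a note with the new public IP so you know where to connect.
msg = MIMEText("%s is up at %s" % (HOSTNAME, public_ip))
msg["Subject"] = "EC2 instance booted: " + HOSTNAME
msg["From"] = NOTIFY
msg["To"] = NOTIFY
smtplib.SMTP("localhost").sendmail(NOTIFY, [NOTIFY], msg.as_string())
```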

Released under the MIT license on GitHub.

I am working on cleaning up the base image that I use on Amazon EC2 and publishing the AMI as well.

Read More...

Permanent Link — Posted in Cloud Computing, Amazon Web Services, Arch Linux


Adjusting IT for Cloud Computing

June 19, 2012 at 12:00 AM

Cloud Computing is not just a paradigm shift for infrastructure. IT operations, accounting and even staffing structure need to be updated to effectively harness the benefits.

In a previous article I illustrated deploying a multi-terabyte RAID array in the cloud. That takes just a few minutes these days, but it used to take most organizations over a month to provision that much storage through their IT channel. Moving to the cloud will allow organizations to reduce and potentially eliminate IT staffing around procurement.

Read More...

Permanent Link — Posted in Cloud Computing, Technology Management

Increase Amazon EC2 Reliability and Performance with RAID

May 25, 2012 at 12:00 AM

While I haven’t *knock on wood* had any EBS failures in Amazon’s cloud myself, I have heard the horror stories, and that makes me uneasy. Another issue with disks in the cloud that I do run into a lot is latency. The disk I/O in many cases is slower to begin with, and random bouts of latency tend to crop up.

I have addressed both of these problems by deploying RAID 10 on my Amazon EC2 instances. It sounds techie but you don’t have to be a rocket scientist to do this. If you are managing an EC2 instance you can do it and I have published a script that will get you there in a few steps.
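For a rough idea of what that involves, here is a minimal Python sketch of the core steps, assembling four already-attached EBS volumes into a RAID 10 array with mdadm. The device names, volume count, and mount point are assumptions for illustration, not what the published script uses.

```python
#!/usr/bin/env python
# Illustrative only: build a RAID 10 array from four already-attached EBS volumes.
# Device names and mount point are assumptions; adjust for your instance.
import subprocess

DEVICES = ["/dev/xvdf", "/dev/xvdg", "/dev/xvdh", "/dev/xvdi"]  # assumed attachments
ARRAY = "/dev/md0"
MOUNT = "/data"

# Create a four-disk RAID 10 array: striping for throughput, mirroring for redundancy.
subprocess.check_call(
    ["mdadm", "--create", ARRAY, "--level=10",
     "--raid-devices=%d" % len(DEVICES)] + DEVICES)

# Put a filesystem on the array and mount it.
subprocess.check_call(["mkfs.ext4", ARRAY])
subprocess.check_call(["mkdir", "-p", MOUNT])
subprocess.check_call(["mount", ARRAY, MOUNT])

# Persist the array definition (Arch keeps it in /etc/mdadm.conf) so it
# reassembles on reboot.
scan = subprocess.check_output(["mdadm", "--detail", "--scan"])
with open("/etc/mdadm.conf", "ab") as fh:
    fh.write(scan)
```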

Read More...

Permanent Link — Posted in Geek Tactics, Cloud Computing, Amazon Web Services

Update Amazon Route53 via python and boto

April 18, 2012 at 12:00 AM

I wrote a python script to update DNS on Amazon Route53. You can use it on dynamic hosts by putting it into cron, or on boot for cloud instances with inconsistent IP addresses.

It uses the boto Amazon Web Services python interface for the heavy lifting. You’ll need that installed. (Arch Linux has a python-boto package)

You need to edit the script to place your AWS credentials in the two variables near the top (awskeyid, awskeysecret). Then it’s ready to go.
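As a condensed illustration of the approach (not the published script verbatim), here is how the classic boto 2 Route53 interface handles the update. The zone, record name, and credentials below are placeholders, and the IP lookup via checkip.amazonaws.com is just one option; on EC2 you could read the instance metadata service instead.

```python
#!/usr/bin/env python
# Sketch of a Route53 dynamic DNS update with boto 2.
# Zone, record name, and credentials are placeholders.
import urllib.request

import boto

awskeyid = "AKIA..."              # your AWS access key id
awskeysecret = "..."              # your AWS secret key
ZONE = "example.com."             # hosted zone (placeholder)
RECORD = "host.example.com."      # record to keep pointed at this machine
TTL = 300

# Find out this machine's current public IP.
ip = urllib.request.urlopen("http://checkip.amazonaws.com").read().decode().strip()

conn = boto.connect_route53(aws_access_key_id=awskeyid,
                            aws_secret_access_key=awskeysecret)
zone = conn.get_zone(ZONE)

# Update the A record if it already exists, otherwise create it.
if zone.get_a(RECORD):
    zone.update_a(RECORD, ip, TTL)
else:
    zone.add_a(RECORD, ip, TTL)
```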

Read More...

Permanent Link — Posted in Geek Tactics, Cloud Computing, Amazon Web Services

Cloud Architecture Best Practices

August 31, 2011 at 12:00 AM

“Plan for failure” is not a new mantra when it comes to information technology. Evaluating the worst-case scenario is part of defining system requirements in many organizations. The mistake many are making when they start to adopt the cloud is that they don’t re-evaluate their existing architecture and the economics around redundancy.

All organizations make trade-offs between cost and risk. Having truly fully redundant architecture at all levels of the system is usually seen as unduly expensive. Big areas of exposure like databases and connectivity get addressed but some risk is usually accepted.

Read More...

Permanent Link — Posted in Cloud Computing

Understanding Cloud Computing Vulnerabilities

August 19, 2011 at 12:00 AM

Discussions about cloud computing security often fail to distinguish general issues from cloud-specific issues.

Here is a great overview from IEEE Security & Privacy magazine of common IT vulnerabilities and how they are impacted by the new cloud paradigm.

The article starts off defining vulnerability in general and then goes on to establish the vulnerabilities that are inherent in cloud computing models.

It really boils down to access:

Of all these IAAA vulnerabilities, in the experience of cloud service providers, currently, authentication issues are the primary vulnerability that puts user data in cloud services at risk

Read More...

Permanent Link — Posted in Cloud Computing, Security

How The Cloud Changes Disaster Recovery

July 26, 2011 at 12:00 AM

Chart of recovery time vs. cost

Data Center Knowledge has posted a great article illuminating the effect that cloud computing is having on the economics of disaster recovery (DR) for information technology. Having fast DR used to mean adding a considerable expense to your IT budget in order to “duplicate” what you have.

With cloud technologies this is not only less expensive, it is also a great first step toward transitioning IT infrastructure into the cloud paradigm.

Read More...

Permanent Link — Posted in Cloud Computing

Next »