Sunday, December 13, 2009

Windows Azure and Cloud Computing Posts for 12/11/2009+

Windows Azure, SQL Azure Database and related cloud computing topics now appear in this weekly series.

 
• Update 12/12 and 12/13/2009: Maureen O’Gara: Savvis To Peddle Cisco-Based Private Clouds; David Lemphers: Remote Command Service for Windows Azure!; Chris Hoff: Cloud Computing Public Service Announcement – Please Read; B. Guptill, M. Koenig and B. McNee: SAP Influencer Summit Reveals Cloudy Strategy, Path, and Challenges; Arik Hesseldahl: Forecast for 2010: The Coming Cloud 'Catastrophe'; Joseph Rago: Health Care's 'Radical Improver'; and others.

Note: This post is updated daily or more frequently, depending on the availability of new articles, in the sections listed below.

To read an individual item, first click the post’s title to display the article as a single page, and then navigate to the section you want.

Cloud Computing with the Windows Azure Platform was published 9/21/2009. Order today from Amazon or Barnes & Noble (in stock).

Read the detailed TOC here (PDF) and download the sample code here.

Discuss the book on its WROX P2P Forum.

See a short-form TOC, get links to live Azure sample projects, and read a detailed TOC of electronic-only chapters 12 and 13 here.

Wrox’s Web site manager posted on 9/29/2009 a lengthy excerpt from Chapter 4, “Scaling Azure Table and Blob Storage” here.

You can now download and save the following two online-only chapters in Microsoft Office Word 2003 *.doc format by FTP:

  • Chapter 12: “Managing SQL Azure Accounts, Databases, and DataHubs*”
  • Chapter 13: “Exploiting SQL Azure Database's Relational Features”

HTTP downloads of the two chapters are available from the book's Code Download page; these chapters will be updated for the November CTP in January 2010. 
* Content for managing DataHubs will be added as Microsoft releases more details on data synchronization services for SQL Azure and Windows Azure.

Off-Topic: OakLeaf Blog Joins Technorati’s “Top 100 InfoTech” List on 10/24/2009, SharePoint Nightmare: Installing the SPS 2010 Public Beta on Windows 7 of 11/27/2009, and Unable to Activate Office 2010 Beta or Windows 7 Guest OS, Run Windows Update or Join a Domain from Hyper-V VM (with fix) of 11/29/2009.

Azure Blob, Table and Queue Services

Alden Torres explains How to: Smooth Streaming and Windows Azure Storage in this illustrated tutorial of 12/11/2009:

This is a pending blog [post] that I have [had] since I posted the announce[ment] of my Smooth Streaming video using Windows Azure Storage. I've been playing a lot with every component of the Smooth Streaming technology, combining it with ASP.NET, IIS, Mono, Linux, PHP and, more recently, Windows Azure. There is someone out there playing with Amazon S3, but as part of a commercial offer, and I don't know if he's making any money out of it.

For playing with Azure I took my inspiration from Tim Heuer's blog post Using Azure as a Silverlight Streaming replacement and the few comments regarding Smooth Streaming; the latest is about progressive download, but [it] pointed out the right tools to work with Azure Storage. Specifically, CloudBerry [Explorer] for Azure Blob Storage, despite having some trouble working with the special container $root, was extremely helpful. …
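For readers who want to reproduce the setup, here’s a minimal sketch of uploading a media file to blob storage with the StorageClient library from the November 2009 Windows Azure SDK; the account credentials, container name, and file names are placeholders, not Alden’s actual configuration:

    // Hedged sketch: upload a media file to Azure Blob storage using the
    // Microsoft.WindowsAzure.StorageClient library (November 2009 SDK).
    using Microsoft.WindowsAzure;
    using Microsoft.WindowsAzure.StorageClient;

    class BlobUploader
    {
        static void Main()
        {
            CloudStorageAccount account = CloudStorageAccount.Parse(
                "DefaultEndpointsProtocol=https;AccountName=youraccount;AccountKey=YOUR_KEY");

            CloudBlobClient client = account.CreateCloudBlobClient();
            CloudBlobContainer container = client.GetContainerReference("media");
            container.CreateIfNotExist(); // no-op if the container already exists

            // Make blobs in the container publicly readable so the
            // Silverlight player can fetch the media fragments.
            container.SetPermissions(new BlobContainerPermissions
            {
                PublicAccess = BlobContainerPublicAccessType.Blob
            });

            CloudBlob blob = container.GetBlobReference("video.ismv"); // placeholder name
            blob.UploadFile(@"C:\media\video.ismv");                   // placeholder path
        }
    }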

<Return to section navigation list> 

SQL Azure Database (SADB, formerly SDS and SSDS)

Mostafa M. Arafa Elzoghbi shows you How to build your DB on MS SQL Azure in this 12/11/2009 post, which begins:

I would like to share with you some tips for how to move your on-premise DB to MS SQL Azure on the cloud.

First, to be able to connect to SQL Azure, prepare your environment with the following:

1) Install MS SQL [Server] 2008 R2 - Nov CTP - on your machine; having any other version of SQL [Server] 2008 will not give you the full functionality, like working with Object Explorer in SQL Server 2008 R2.

Tips:

a) Before installing SQL [Server] 2008 R2, please check your existing environment by using SQL Server upgrade advisor (it is a part of SQL [Server] 2008 R2).

b) Using your credentials in SQL Azure, you will be able to connect using SQL [Server] 2008 R2 with no issues and you will be able to navigate through your objects in the DB.

and continues with brief scripting recommendations. Note that a full installation of SQL Server 2008 R2 November CTP requires about 2.5 GB of available disk space.
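As a quick smoke test once your environment is set up, a plain ADO.NET connection works against SQL Azure. Here’s a minimal sketch with placeholder server, database, and credentials; note SQL Azure’s user@server login form and the required encrypted connection:

    // Hedged sketch: connect to a SQL Azure database from ADO.NET.
    // Server, database, user, and password below are placeholders.
    using System;
    using System.Data.SqlClient;

    class SqlAzureSmokeTest
    {
        static void Main()
        {
            string connectionString =
                "Server=tcp:yourserver.database.windows.net;" +
                "Database=yourdb;" +
                "User ID=youruser@yourserver;" +   // SQL Azure requires user@server
                "Password=yourpassword;" +
                "Trusted_Connection=False;Encrypt=True;";

            using (SqlConnection connection = new SqlConnection(connectionString))
            {
                connection.Open();
                using (SqlCommand command = new SqlCommand("SELECT @@version", connection))
                {
                    // Prints the SQL Azure engine version if the connection succeeds.
                    Console.WriteLine(command.ExecuteScalar());
                }
            }
        }
    }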

<Return to section navigation list> 

AppFabric: Access Control, Service Bus and Workflow

• Bruno Terkaly delivers a detailed and fully illustrated tutorial for the newly named Windows Azure platform AppFabric in his Basic Client and Service Working with the .NET Service Bus post of 12/13/2009:

Our goal here is organic. It is the fundamental “Hello World” in terms of communications.

Access Control Service, Relay Service

This blog entry introduces an authenticating mechanism, which determines which clients can connect to our service.

We also introduce a relay service, which now makes it possible to connect two or more parties, regardless of whether any connections are behind any firewall, NAT device, switch, or router.

The code below is simple - some service that exposes some endpoint, some protocol, some security model. That service is secure and can be reached by almost any client on earth.

Primarily it is about using Azure and its fundamental relationship to the AppFabric.

The application we are about to write will allow this person to send the name of a city to our weather service, and our weather service responds with a string about weather conditions for the city name passed in.

The Infrastructure for the Service Bus

Here we can see that a whole infrastructure is transparently available to the developer.

This infrastructure makes it easy to model client / server scenarios without worrying about NATs, Firewalls, routers, and switches. From the server side, your service will expose an authenticated endpoint. From the client, you will connect to the endpoint.

Notice the client interacting with the service and the use of (1) access control service; (2) relay service.


Bruno, who includes links for downloading the project’s source code and the Windows Azure Platform Training Kit, is a Microsoft developer evangelist.
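The relay pattern Bruno walks through boils down to a few lines of WCF. Here’s a hedged sketch, assuming the Windows Azure platform AppFabric SDK (Microsoft.ServiceBus.dll); the service namespace, issuer name, and issuer key are placeholders, not Bruno’s actual code:

    // Hedged sketch of a "hello world" weather service exposed through the
    // Service Bus relay (Windows Azure platform AppFabric SDK).
    using System;
    using System.ServiceModel;
    using System.ServiceModel.Description;
    using Microsoft.ServiceBus;

    [ServiceContract]
    public interface IWeatherService
    {
        [OperationContract]
        string GetWeather(string city);
    }

    public class WeatherService : IWeatherService
    {
        // Stand-in for a real weather lookup.
        public string GetWeather(string city) { return "Sunny in " + city; }
    }

    class Host
    {
        static void Main()
        {
            // The sb:// scheme routes traffic through the relay, so the
            // listener is reachable even from behind NATs and firewalls.
            Uri address = ServiceBusEnvironment.CreateServiceUri(
                "sb", "yournamespace", "Weather"); // placeholder namespace

            // Access Control Service credentials authenticate the listener.
            TransportClientEndpointBehavior credentials = new TransportClientEndpointBehavior();
            credentials.CredentialType = TransportClientCredentialType.SharedSecret;
            credentials.Credentials.SharedSecret.IssuerName = "owner";      // placeholder
            credentials.Credentials.SharedSecret.IssuerSecret = "YOUR_KEY"; // placeholder

            ServiceHost host = new ServiceHost(typeof(WeatherService));
            ServiceEndpoint endpoint = host.AddServiceEndpoint(
                typeof(IWeatherService), new NetTcpRelayBinding(), address);
            endpoint.Behaviors.Add(credentials);

            host.Open();
            Console.WriteLine("Service listening at " + address);
            Console.ReadLine();
            host.Close();
        }
    }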

Sam Gentile’s AppFabric a.k.a. "Dublin" Links post of 12/12/2009 has links to the following Windows Server AppFabric resources:

Nicholas Allen’s [Windows Server] AppFabric Talks and Slides post of 11/30/2009 provides an introduction to the Windows Server AppFabric version:

Windows Server AppFabric is a set of integrated technologies that make it easier to build, scale, and manage WCF and WF applications running inside of IIS. AppFabric is the brand name for the hosting and caching features previously called Dublin and Velocity.

and offers these links to additional resources:

The .NET Services Forum has been renamed to Windows Azure Platform AppFabric.

See the AppFabric Team’s report of a major software upgrade coming on 12/17/2009 in the Cloud Computing Events section.

<Return to section navigation list>

Live Windows Azure Apps, Tools and Test Harnesses

Neil MacKenzie’s Dallas – Data as a Service post of 12/13/2009 begins:

There has been much interest in the various as-a-service features of the cloud: Infrastructure as a Service (IaaS); Platform as a Service (PaaS); and Software as a Service (SaaS). One of the principal benefits driving much of the interest in the cloud is the demand elasticity exhibited by the various XaaS features. This elasticity is particularly useful in data analytics, where cost associativity (whereby one thousand instances for one hour costs the same as one instance for one thousand hours) means that large datasets can be processed much more quickly than they could be previously. The rental feature of cloud computing then makes this a cost-effective technique. I think we are at the beginning of a revolutionary change in our ability to perform data analytics.

The drivers of this revolution are the ability to process large amounts of data and the availability of large datasets through Data as a Service offerings from cloud computing providers. They can negotiate directly with Government and commercial entities for the rights to host and make available the data stored by these entities and in doing so democratize access to that data. Amazon AWS has a Public Data Sets offering in which it exposes non-proprietary and public-domain datasets such as US Census Bureau and human genome data to developers. At PDC 2009, Microsoft announced Project Dallas which exposes both commercial and public-domain datasets as a service in Windows Azure.

And continues with sample code to return pages of data items in the Atom format. Neil observes near the end of his post:

One issue with the Atom feed format is that it is very heavy. For example, a single Atom feed entry in the FAO3510 data series consumes almost 350 bytes even as the FAO3510Item representing it comprises three Int16 objects, one Double and a String. I think some more efficient data representation will be needed for very large datasets.

Cloud providers don’t worry much about high-overhead data formats when they charge by the gigabyte for data transfer.
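Because Dallas exposes datasets as Atom over HTTP, a page can be read with the standard SyndicationFeed API, as in this sketch; the service URL and the $accountKey parameter are placeholders, not the actual FAO3510 endpoint, so consult the Dallas documentation for the real scheme:

    // Hedged sketch: read one page of a Dallas dataset as an Atom feed.
    using System;
    using System.ServiceModel.Syndication;
    using System.Xml;

    class DallasAtomReader
    {
        static void Main()
        {
            // Hypothetical Dallas URL and account-key parameter.
            string url = "https://api.sqlazureservices.com/FAOService.svc/FAO3510"
                       + "?$accountKey=YOUR_ACCOUNT_KEY";

            using (XmlReader reader = XmlReader.Create(url))
            {
                SyndicationFeed feed = SyndicationFeed.Load(reader);
                foreach (SyndicationItem item in feed.Items)
                    Console.WriteLine(item.Title.Text); // one Atom entry per data item
            }
        }
    }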

• Cory Fowler (SyntaxC4) describes the effects of namespace changes in the Windows Azure SDK November 2009 CTP in his Latest Windows Azure Project Conversion… post of 12/13/2009:

While Guelph Coffee and Code was participating in the Technology Focus Panel on Windows Azure, I attempted to upload the BlogEngine.Net sample from Code Plex during my demo on how to deploy to the Windows Azure Platform.

During the Publishing step there were a number of errors while it was trying to compile the project. These errors are due to a namespace change that you can read about in my blog post Moving from the ‘ServiceHosting’ to ‘WindowsAzure’ Namespace and a follow-up post entitled Windows Azure Guest Book – Refactored. [I will be writing a post shortly on the Microsoft.WindowsAzure.Diagnostics namespace, which replaced RoleManager.WriteToLog().]

I will be deploying the BlogEngine.Net application to Windows Azure and am looking for Developers that have been deploying Projects into the Cloud to post their experiences on the blog.  If you would like to guest write on the Canadian Windows Azure Experience Blog please contact me with a link to your Windows Azure application and I will create an account for you.
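For readers hitting the same compile errors, here’s a sketch of the logging change involved, assuming the November 2009 SDK: the old RoleManager.WriteToLog() call gives way to the DiagnosticMonitor plus standard .NET tracing.

    // Hedged sketch of the logging migration in the November 2009 CTP.
    // Old SDK (Microsoft.ServiceHosting.ServiceRuntime):
    //     RoleManager.WriteToLog("Information", "Worker role starting");
    using System.Diagnostics;
    using Microsoft.WindowsAzure.Diagnostics;
    using Microsoft.WindowsAzure.ServiceRuntime;

    public class WorkerRole : RoleEntryPoint
    {
        public override bool OnStart()
        {
            // Starts the diagnostic monitor against the storage account named
            // by the DiagnosticsConnectionString configuration setting, then
            // logs through System.Diagnostics.Trace.
            DiagnosticMonitor.Start("DiagnosticsConnectionString");
            Trace.TraceInformation("Worker role starting");
            return base.OnStart();
        }
    }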

Anton Staykov cites an incorrect DiagnosticsConnectionString value in his Another common reason for Windows Azure role to st[i]ck in Initializing/Busy/Stopping post of 12/13/2009:

There is another common reason that will cause your role to [become] stuck in that loop when you deploy it on Windows Azure. And it is common for Web and Worker Roles.

You will most probably encounter this error if you are using the latest Windows Azure Tools (as you should) and you are creating a new project (rather than upgrading an existing one).

The new project template comes with an enabled and started Diagnostics Monitor. In order for this monitor to work properly and collect and store data, it needs a Windows Azure Storage account. The account configuration is saved as a Role Configuration entry in the .CSCFG file. The default connection string is this one:

    <Setting name="DiagnosticsConnectionString" value="UseDevelopmentStorage=true" />

Note the “UseDevelopmentStorage=true” value for the DiagnosticsConnectionString. If you just make a “Hello World” Web Role and upload it to the Azure environment, that role will never start. It will go in[to] an Initializing/Busy/Stopping/Stopped loop. You need to either stop the Diagnostics Monitor or change the connection string. …
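In other words, before deploying, point the setting at a real storage account. A sketch, with placeholder account name and key:

    <Setting name="DiagnosticsConnectionString"
             value="DefaultEndpointsProtocol=https;AccountName=youraccount;AccountKey=YOUR_KEY" />

Alternatively, remove the DiagnosticMonitor.Start() call from the role’s OnStart() method.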

• Joseph Rago explains “How medicine's digital revolution can empower doctors and patients, with or without ObamaCare” in Health Care's 'Radical Improver', his 12/12/2009 Wall Street Journal interview with Jonathan Bush of cloud-based athenahealth:

athenahealth… It is a massive logistical undertaking. Athena's main facility is housed in a decommissioned World War II arsenal on the Charles, where 30,000 pounds of paper is processed every month, most of the tonnage being paper checks. Incredibly, doctors also receive on average 1,185 faxes each month—mostly lab results—and those are handled too.

State Medicaid programs, by the way, are easily the worst payers, according to Athena's annual ranking. In New York, for instance, claims must be tendered on a dead-tree form instead of electronically and in blue ink—black is grounds for rejection—and then go on to spend a full 161 days, or almost a half year, in accounts receivable.


While streamlining this disorder frees up time for the company's clients to treat patients, it also throws off vast data, which are fed in[to] central servers, aggregated and analyzed.

This "athenanet" system is among the few health-tech offerings based on "cloud computing"—in the sense that the applications are accessed on the Web, instead of a computer's hard drive, allowing constant updates and refinements. If a regulation changes or an insurer adjusts a payment policy, it is reflected on athenanet almost in real time; on the clinical side, the program can adapt at the same rapid pace as medicine itself.                                                                (Graphic: Zina Saunders.)

Mr. Bush thinks the main benefit is the "collective intelligence" that he is starting to weave together from the 87% of American physicians who practice solo or in groups of five doctors or fewer. "We found one of the last few remaining crowds in health care, which are these independent practices. Now you can argue that this decentralization is not the best thing in the world," but what's most important, he argues, is that "they're still allowed to go and make their own decisions."

In effect, as the network gets bigger, it gets smarter, while opening the space for innovations to feed off one another and spread. There really can be "radical improvement" in health care, Mr. Bush says, but only if there are "radical improvers" able to set themselves apart and lead the forward advance. "No one ever says, 'Here's to the average,'" he declares pointedly.

The Athena model is superior to most electronic medical record systems, or EMRs, which are generally based on static software that is inflexible, can't link to other systems, and is sold by large corporate vendors like General Electric. One reason the digital revolution has so far passed over the health sector is sheer bad product. The adoption of EMR in health systems across the country has been dogged by cumbersome interfaces, error propagation and other drawbacks. In 2003, for instance, Cedars-Sinai in Los Angeles dumped a $34 million proprietary system after doctors staged a revolt. …

According to a July 13, 2009 press release, Cook Children's Physician Network Partners with athenahealth to Offer Affiliated Physicians Electronic Health Record and Practice Management Services:

Through Cook Children's new clinical platform, PedsPal physician members using athenaClinicals will also be able to create a personal health record for their patients through Microsoft's HealthVault that will be automatically populated from athenahealth's centrally-hosted EHR with patient data, with no additional effort required by the families Cook Children's serves. …

Along with athenahealth's and PedsPal's partnership to help connect CCPN's affiliated physicians across 22 states, practices owned by Cook Children's will also be using athenaClinicals and athenaCollector, which will be interfaced with Microsoft's Amalga and HealthVault products in support of a larger extended enterprise connectivity strategy. Cook Children's new clinical platform will offer a lifetime medical record for all of its patients that will enable the health system to provide better pediatric specialty care to patients across the country. Having greater access to data for all of its patients at all points of care will help Cook Children's to provide the best possible outcomes and to lower health care costs. [Emphasis added.]

• David Lemphers describes a simple Remote Command Service for Windows Azure! to aid debugging in his 12/12/2009 post:

So a while back I blogged a little sample code on how to get some info about what’s happening inside your instance using a simple process within a web page: Looking Inside Windows Azure!

Well, I decided to refresh this code sample using one of my favorite features, InputEndpoints!

So let’s take a quick tour of the sample. At a high level, this is what’s going on:


Let’s dig deeper. The client is a simple console app, and the service is a worker role that exposes a TCP InputEndpoint, connected to a socket. The socket accepts a single inbound client connection, and then executes commands on the local instance and returns the results to the client. …

[You can] execute commands such as DIR or NETSTAT (or others you could try), against the local VM. This is very handy when trying to debug local instance issues.

You can download the code here. It contains the code for the VS2010 Worker Role cloud service project and the simple console client.

Also, you’ll need to update the diagnostics connection string before deploying it to the cloud, and I’d suggest deploying the worker role with a single instance to avoid any load balanced strangeness.
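David’s downloadable project is the authoritative version; the pattern he describes boils down to something like this sketch, in which the endpoint name ("RemoteCmd") and the one-command-per-connection protocol are illustrative assumptions:

    // Hedged sketch: a worker role that listens on a TCP InputEndpoint and
    // shells out each received command, returning the output to the client.
    using System.Diagnostics;
    using System.IO;
    using System.Net.Sockets;
    using Microsoft.WindowsAzure.ServiceRuntime;

    public class RemoteCommandRole : RoleEntryPoint
    {
        public override void Run()
        {
            // Resolve the endpoint declared in the service definition.
            var endpoint = RoleEnvironment.CurrentRoleInstance
                .InstanceEndpoints["RemoteCmd"].IPEndpoint; // hypothetical endpoint name
            var listener = new TcpListener(endpoint);
            listener.Start();

            while (true)
            {
                using (TcpClient client = listener.AcceptTcpClient())
                using (var reader = new StreamReader(client.GetStream()))
                using (var writer = new StreamWriter(client.GetStream()))
                {
                    string command = reader.ReadLine(); // e.g. "DIR" or "NETSTAT"
                    var psi = new ProcessStartInfo("cmd.exe", "/c " + command)
                    {
                        RedirectStandardOutput = true,
                        UseShellExecute = false
                    };
                    using (Process p = Process.Start(psi))
                    {
                        writer.Write(p.StandardOutput.ReadToEnd());
                        writer.Flush();
                        p.WaitForExit();
                    }
                }
            }
        }
    }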

• Brad Abrams explains why Visual Studio users often have problems running sample code downloaded from the Internet in his fully-illustrated Visual Studio Project Sample Loading Error: Assembly could not be loaded and will be ignored. Could not load file or assembly or one of its dependencies. Operation is not supported. (Exception from HRESULT: 0x80131515) post of 12/11/2009:

Some folks have mentioned to me that they are having trouble getting some of my samples to work. And in fact, just the other day, I ran into a problem getting my own samples to work. It turns out to be a problem with the way Windows treats the sample that you download.

Specifically, because you downloaded the sample from the internet, Windows treats the sample as “untrusted” content. When you unzip untrusted content, you get a directory full of untrusted content. Visual Studio is not so good at running untrusted content. Unfortunately, you get some really bad error messages like the one [below].


… Luckily fixing this is very easy.  Just go back to the ZIP file you downloaded, select properties, then “unblock” the content. 

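If you have many downloaded archives to unblock, the Properties-dialog fix can also be scripted: the “blocked” flag is just a Zone.Identifier alternate data stream on the file, which can be deleted through the Win32 API. A sketch, with a placeholder path:

    // Hedged sketch: remove the Zone.Identifier stream that marks a
    // downloaded file as untrusted, which is what "Unblock" does.
    using System.Runtime.InteropServices;

    class Unblock
    {
        [DllImport("kernel32.dll", CharSet = CharSet.Unicode, SetLastError = true)]
        static extern bool DeleteFile(string name);

        static void Main()
        {
            // Deleting the alternate data stream unblocks the archive, so the
            // files you unzip from it are no longer treated as untrusted.
            DeleteFile(@"C:\Downloads\sample.zip" + ":Zone.Identifier");
        }
    }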

Brad is a Microsoft program manager for Visual Studio. His is the longest blog title I’ve ever seen.

David Lemphers apologizes in his My PDC09 Session – Code and Clip! post of 12/11/2009:

So one thing I’ve been terribly slack about since returning from PDC09 has been posting my session clip and code snippets for folks to download/reference, so my sincerest apologies, and without further ado:

  1. My session
  2. My slide deck
  3. My code
  4. My banner (picture care of JLD)!

Bill Zack describes his recent presentation about the Windows Azure Platform to the 2nd Annual ISV Architects Workshop in New York City in his Windows Azure in 20 Minutes! post of 12/11/2009.

Bill is a Microsoft developer evangelist.

Bill Kallio explains Silverlight, TFS and Cloud Deployments in this 12/10/2009 illustrated tutorial:

So your boss has decided to give this whole Azure Services Platform a shot, and now you need to come up with ways not only to develop on the Azure platform, but to preserve your current code management and source control nightly build best practices at the same time? Look no further! I think I've got what you need. Hang on though, because we're going to have to get our hands dirty.

First off, I don't feel that I can really write on this topic without giving a shout out to Dominic Green, who came up with a quick and dirty guide to setting up automated deployments to the cloud from TFS. You can find his post here. What I needed to do, however, was not only automate a build deployment, but a) make it a bit more configurable and b) deploy Silverlight xaps within my package. As any SL dev can tell you, TFS Build doesn't always play nice here.

So let's get down to it:

First off, we need to handle the pesky issue of xap file copies failing on your standard TFS build. Now there are numerous workarounds to this, but the one I went with was to create a custom solution configuration (I know, scary) and then add a condition based on that configuration to my web role project. The reason for this is that the standard "IsDesktopBuild condition" workaround doesn't cut it with Azure projects because the Publish call from the IDE treats the build as !IsDesktopBuild, which means that when you Publish from the IDE with this project condition, your xap files won't be included in your package (total bummer). …

Bill then shows you how to solve the TFS problem.

tbtechnet suggests Try Windows Azure in December and Win! in this 12/10/2009 post:

There’s a second Windows Azure challenge just launched today. Talk about easy ways to get cool gadgets.

I want one of those Amazon Kindles!

In 2-3 easy steps you can enter to get one: http://www.codeproject.com/Feature/Azure/

Bruce D. Kyle reports that CodeProject has free Windows Azure tokens for immediate release for the next 21 days. 314 tokens were available on 12/11/2009 at about 1:00 PM.

<Return to section navigation list> 

Windows Azure Infrastructure

• Scott Wylie offers the following graphic as The Microsoft Cloud Story – in One Slide in his 12/13/2009 post:

[Slide graphic: S+S triangle]

Simple eh? I’ll follow up with some posts on some of the underlying details and business justification but for now feel free to share and use in your thinking around the cloud.

Scott is Microsoft’s director of developer and platform strategy for New Zealand.

• Joe Weinman asserts that pricing for cloud services might migrate from today’s fixed per-hour, per-month, and per-GB charges to more flexible models, such as those used by airlines, in his Hedging Your Options for the Cloud post of 12/13/2009 to the Giga Om blog:

With the second decade of the millennium now just weeks away, I thought I’d offer up some possibilities for the cloud computing market as it continues to evolve. Cloud services — whether infrastructure, platform or software — share similarities with other on-demand, pay-per-use offerings such as airlines or car rentals. But what’s past in those industries may be prologue for the cloud. …

He then goes on to list “… some key aspects of those services that could become integral to the cloud in the coming decade.” Joe is Strategy and Business Development VP for AT&T Business Solutions.

Michael Krigsman’s Cloud economics: The importance of multi-tenancy post of 12/12/2009 to ZDNet’s IT Project Failures blog quotes several Enterprise Irregulars on the multi-tenancy topic and concludes:

For cloud software customers, vendor architecture should ideally make no difference at all. Rather than worrying about vendor economics, it would be best if customers could focus only on features, functions, costs, business fit, and so on.

Realistically, however, enterprise buyers must consider vendor reliability, health, and sustainability as important elements of the buying decision. The connection between multi-tenancy and vendor performance in the software as a service market is therefore a key factor for buyers to understand and evaluate.

Michael Krigsman is CEO of Asuret, Inc., a software and consulting company dedicated to reducing software implementation failures. Enterprise Irregulars are a group of experienced IT thought leaders who like to write.

Eric Lai quotes analyst Rob Helm: “Move could lead to rivalries between Microsoft groups targeting same audience” in his Analysis: Realigning Microsoft's Azure unit could boost innovation, squabbling post of 12/11/2009:

Unifying Windows Azure with its on-premises counterpart Windows Server should hasten the rollout of new features in Microsoft Corp.'s enterprise cloud platform, but it also will pit Azure against Microsoft's other cloud services for the loyalty of a key customer base, says one independent Microsoft analyst. …

The realignment, while major, was not a surprise. Muglia had said a year ago that Windows Azure would be moved into STB before Microsoft begins selling it on January 1, 2010.

"This is a natural development. Azure is becoming a real business, and thus it's moving into a real business group," said Rob Helm, an analyst with the independent firm Directions on Microsoft.

Unifying Azure with Windows Server should help the former catch up feature-wise with its older sibling, Helm said, and eventually lead to features being released first on Azure. [Emphasis added.] …

Irfan Khan claims “Dramatically improving the agility, flexibility, performance and efficiency of data centers ultimately leads to business success” in his Seeding the Cloud: The Future of Data Management post of 12/11/2009:

Doing more with less is a familiar refrain for IT professionals, and today's challenging business environment has only increased the pressure on managers to achieve efficiencies, maximize performance and improve responsiveness of the data center. More and more frequently, IT is turning to virtualization to accomplish its mission-critical goals.

The hot new trend in cloud computing is a natural extension of this drive toward virtualization. In the case of the public cloud, IT can add processing power and infrastructure as needed, and in the case of the private cloud, IT can improve the utilization of existing infrastructure. In other words, cloud computing platforms offer IT the opportunity to increase efficiencies and become more agile, transforming the data center into an environment that delivers greater benefits to end-users.

This tremendous potential of cloud computing can be seen by examining how organizations currently manage, analyze and mobilize data. Cloud has given IT organizations the opportunity to fundamentally shift the way data is created, processed and shared.

In order to take advantage of cloud computing for data management, IT must familiarize itself with the latest issues and trends. This requires a greater understanding of each of the three prominent cloud models - private, public and hybrid - along with proper evaluation of the criteria around a cloud strategy. …

The fact that Irfan Khan is CTO of Sybase lends additional credence to his clairvoyance.

JP Morgenthal asserts The Big News This Year Is Cloud Computing in a “semi-humorous look at 2009, the year in review for IT, and insight into next year's top IT topic:”

The big news in IT this year has to be Cloud Computing.

As pundits and vendors raced to gain mindshare as a means of priming the economy-dulled sales pipeline, IT buyers were caught trying to figure out, "is this the next great frontier, or am I going to look like an ass for recommending yet another useless acquisition?"

Interestingly, cost is not the major driver behind these offerings, but instead access to compute resources on demand that don't require up-front capital expenditure. Now, they only have to figure out how to actually make it so their applications can work in the Enterprise, but shift to the Cloud for additional resources when needed.

This should lead to some interesting Fail Blog stories next year since most companies barely have the right mix of resources to keep the lights on let alone actually architect (Webster actually lists this as a verb now, so if jiggy is a valid word in the dictionary, I guess I have to accept architect as a verb now) a distributed application capable of crossing compute boundaries. I will be watching a number of fronts, such as application virtualization and Open Virtualization Format (OVF) to see if these do anything to help with this issue. …

JP Morgenthal works as a Senior Principal Architect with QinetiQ North America's Mission Systems Group.

Mary Jo Foley quotes analyst Mark Anderson’s claim “There will be a Cloud Catastrophe in 2010 that limits Cloud growth by raising security issues and restricting enterprise trust.  CIOs will see the cloud as the doorstep for industrial espionage” in her Analyst: 'It is game over for Microsoft in consumer' article of 12/11/2009 for ZDNet’s All About Microsoft blog:

Microsoft started out its life as a consumer/developer-focused company. The company subsequently switched strategies and became a largely enterprise-focused vendor. These days, consumer is king for Microsoft — at least as far as corporate strategy and where its ad dollars go.

But what if Chief Software Architect Ray Ozzie and other leaders at Microsoft are wrong and integrating the consumer and business worlds doesn’t really matter? One very influential market watcher, Mark Anderson, author of the Strategic News Service newsletter, is betting that instead of a melding, there will be an increasing chasm between the consumer and business market.

Anderson, whose word carries a lot of weight with corporate execs (including those from Microsoft), venture capitalists and other movers and shakers, held his annual predictions dinner in New York City on December 10. His list of 2010 tech predictions, which included a number of things that won’t surprise some tech watchers, Anderson acknowledged, were pretty dire for Microsoft. …

Here are the first two of Anderson’s ten predictions:

1. 2010 will be The year of Platform Wars: netbooks, cell phones, pads, Cloud standards.  Clouds will tend to support the consumer world (Picnik, Amazon), enterprises will continue to build out their own data centers, and Netbook sector growth rates continue to post very large numbers.

2. 2010 will be The year of Operating System Wars: Windows 7 flavors, Mac OS, Linux flavors, Symbian, Android, Chrome OS, Nokia Maemo 5. The winners, in order in unit sales: W7, Mac OS, Android. W7, ironically, by failure of imagination and by its PC-centric platform, actively clears space for others to take over the OS via mobile platforms. …

Clearly, Anderson doesn’t share the enthusiasm of other industry analysts and reporters for public clouds. Mary Jo writes, “Anderson doesn’t believe that Ozzie’s ‘beautiful world’ of three screens and the cloud will ever come to pass.”

The VAR Guy claims Windows Server Meets Windows Azure: Smart Move in this 12/11/2009 post:

The VAR Guy has a hunch: Microsoft’s best hope for future growth is Windows Azure, the company’s cloud-based operating system. So Microsoft’s decision to merge Windows Server and Windows Azure into a single organization — called the Server & Cloud Division (SCD) — was a smart one. Here’s why.

In the 1990s, Microsoft worked overtime to recruit skeptical channel partners and software ISVs — Oracle, SAP, Computer Associates, IBM, Lotus and more — onto the Windows NT Server platform. As NT gained application support, Microsoft’s business fortunes rose. Dramatically.

Now, Microsoft must repeat that entire process with Windows Azure. Specifically, Microsoft needs to inspire Windows Server ISVs to leap into the Windows Azure cloud. …

Bob Evans reports “With CIOs looking to the cloud to help rekindle growth and CEOs dazzled by the economic promise, Riverbed is very bullish on cloud computing” in his Riverbed Sees Cloud Computing Boom In 2010 article of 12/11/2009 for InformationWeek’s Global CIO column: 

Few tech companies managed to grow in 2009, so it's interesting to take a closer look at those that did in hopes of understanding how they avoided the general tendency toward shrinkage. For Riverbed, whose 2009 revenue is up 18% year over year including the company's first $100-million quarter, two things stand out: first, the rise of cloud computing; and second, speed is once again cool. [Emphasis added.]

"From 2004 through 2007, we sold speed," said senior vice president of marketing and business development Eric Woolford in an interview at Riverbed headquarters this week. Then came the global recession and for many companies speed became a luxury they couldn't afford, so for most of 2008 and 2009, Riverbed has sold efficiency as customers looked to consolidate data centers, remote offices, branches, servers, staff, and everything that wasn't nailed down along with some stuff that was.

Mary Hayes Weier chimes in on the Windows Azure + Server merger with her Why Microsoft Azure Should Matter To Businesses article of 12/10/2009 for InformationWeek’s Plug into the Cloud section:

I'll make a prediction that 2010 will be the most exciting year yet for cloud computing, and that's partly related to Microsoft's ramp-up of Azure. Anyone attending Salesforce.com's Dreamforce conference last month understands the potential of cloud platforms for rapid application development. And this is all coming together at a time when IT shops need to quickly and cheaply turn out innovative new applications for the business.

To recap, Microsoft on Wednesday created a new commercial unit, the Server & Cloud division, as it prepares to launch the Windows Azure cloud services platform on Jan. 1. It also announced a three-year partnership with storage and virtualization vendor NetApp to help companies develop private clouds if they don't want their clouds hosted by Microsoft.

To understand Microsoft's potential in this area, look what's happening at Salesforce.com. The company drew more than 15,000 people to Dreamforce last month, largely due to growing interest in Force.com. Starry-eyed entrepreneurs were there hoping to make money off of apps they've built on Force.com, which are hosted in Salesforce.com's data centers. Demand for this type of thing is growing. I talked to several CIOs who were delighted—bordering on ecstatic—that their IT teams could so quickly develop apps on Force.com.

So here's the deal: The cloud vendor (Salesforce.com) hosts the servers, and already provides the core application logic. Using their programming tools, developers quickly write applications that run on the platform. Then they pay the vendor a monthly fee to use the app for as long as they want to. One example is RehabCare, which built an iPhone app for patient admissions on Force.com within several days.

Microsoft Azure appears to be very similar. The servers, the maintenance, and the core application logic are already there, hosted by Microsoft. If Microsoft does this right, it could be a great alternative to, say, spending months on .Net development of an app that has become obsolete to the business by the time it's finished. And Microsoft's partners, if they're wise, will understand the opportunity here. …

<Return to section navigation list> 

Cloud Security and Governance

• Chris Hoff (@Beaker)’s Cloud Computing Public Service Announcement of 12/11/2009 is brief and to the point:

If your security practices suck in the physical realm, you’ll be delighted by the surprising lack of change when you move to Cloud.

• Arik Hesseldahl warns “Mark Anderson predicts a big remote-computing service disaster, lost momentum for Microsoft, and growth in consumer payments for online news content” in the subtitle for his 12/11/2009 Forecast for 2010: The Coming Cloud 'Catastrophe' article for Business Week:

Cloud computing enthusiasts be warned. Next year, computing services handled remotely and delivered via the Internet may undergo some kind of "catastrophe" that alerts companies and consumers to the risks of relying on the so-called cloud, says Mark Anderson, chief executive of Strategic News Service, an industry newsletter circulated to senior executives at technology companies including Intel (INTC), Dell (DELL), and Microsoft (MSFT).

A growing number of businesses and individuals are handing storage and various other tasks to outside providers, from photographers archiving pictures with Yahoo!'s (YHOO) Flickr to companies turning over complicated computing operations to Amazon (AMZN). Tech prognosticator Anderson suggests that the tendency could backfire in some high-profile way in the coming year. "It could either be a service-outage-type catastrophe or a security-based catastrophe," he says. "In either case, it will be big enough. It will be the kind of disaster that makes you say, if you're a [Chief Information Officer]: 'That's why I didn't get involved with the cloud.'" …

Anderson is particularly bearish when it comes to the cloud. "My hunch is that there will never really be a secure cloud," he says. Businesses will view cloud services more suspiciously and consumers will refuse to use them for anything important, he says.

Hesseldahl quotes Chris Hoff:

Cloud computing experts note that high-profile security breaches have already occurred. "Clouds don't make applications fail-safe," says Chris Hoff, director of cloud and virtualization services at Cisco Systems (CSCO). He points to Magnolia, the social bookmarking service that crashed and lost all its data earlier this year. "There will be other events like these in 2010, as there were in 2009 and 2008," Hoff says.

See my entry for Mary Jo Foley’s Analyst: 'It is game over for Microsoft in consumer' post in this post’s Windows Azure Infrastructure section for more details on Anderson’s prognosis. @Beaker coined the term “cloudifornication”; is Anderson engaging in “cloudinostication”?

Brad Anderson’s Microsoft Acquires Opalis Software post of 12/11/2009 to the Microsoft System Center blog begins:

Great news for enterprise, government and service provider customers who use System Center. I’m excited to announce that Microsoft has acquired Opalis Software, a Toronto-based private company that is a leader in IT process automation software. Opalis will become a wholly-owned subsidiary of Microsoft.

I believe this acquisition is a pivotal piece to deliver on our dynamic datacenter initiative, which I’ve spoken about before on this blog. This deal brings together the deep datacenter automation expertise of Opalis with the integrated physical and virtualized datacenter management capabilities of Microsoft System Center. I believe Opalis’ software together with the System Center suite will improve the efficiency of IT staff and operations, and customers will gain greater process consistency. Opalis’ software captures the IT processes, in a documented and repeatable way, which can be run over and over again. These capabilities will be added to Microsoft System Center to help customers automate complex IT processes, increase cost savings and shorten timeframes for IT service delivery across physical, virtual and cloud computing environments.

So if you don’t know Opalis products, you should get to know them. Their products include:

  • Prepackaged workflows and automation expertise for the common tasks in managing a datacenter. Examples include Run Book Automation, virtual machine lifecycle management, and incident ticket management.
  • An automation platform that lets customers create and execute workflows across datacenter management tools.
  • Deep integration capabilities to orchestrate tasks across server infrastructure and systems management products, including those from BMC, CA, IBM, HP, Microsoft, and Symantec.

So what does this acquisition mean for Opalis and Microsoft customers? Opalis will continue to fulfill its existing commitments to customers and continue to deliver support as we transition Opalis to the Microsoft systems and processes. What’s interesting is that a large majority of Opalis customers use System Center today. We intend to make RBA capabilities a core part of System Center going forward. You can get more details of the transition at Microsoft Pathways. This site will provide the most current information on the transition and provide answers to frequently asked questions. Also, be sure to read today’s blog post from Todd DeLaughter, CEO of Opalis: http://www.opalis.com/Blog.asp.

Patrick O’Rourke added an identically named post of the same date to the Virtualization Team Blog with more details on the Opalis product line.

The Opalis acquisition meshes nicely with Microsoft’s newly announced intentions to deliver private as well as public Windows Azure Platforms.

Steve Clayton’s Microsoft acquires Opalis post of 12/11/2009 describes Opalis’ product line in a nutshell:

Opalis v6.0 automates the integration and orchestration of cloud actions such as:

  • Cloud Bursting – automate public cloud provisioning to handle peak loads and prevent SLA violations
  • Cloud Cover – automate failover to public or private clouds
  • Private Cloud Operation – create and manage service driven, flexible capacity with automation
  • Sophisticated triggering – subscribe to external events to trigger workflow processes that add, reduce or fail-over to cloud resources according to policies and SLAs

Lori MacVittie warns about “a reduction in the focus on XML and security” in her The XML Security Relay Race post of 12/11/2009:

A recent tweet about a free, Linux-based XML Security suite reminded me that we do not opine on the subject of XML security and its importance enough. SOA has certainly been dethroned as the technology darling du jour by cloud computing and virtualization and with that forced abdication has unfortunately also come a reduction in the focus on XML and security.

That’s particularly disturbing when you recognize that what’s replaced SOA – primarily WOA and RESTful APIs – exchange data primarily via one of two formats: XML and JSON. Whether you prefer one over the other is a debate I’d rather not have (again) at the moment. It is enough to note that XML is widely used by developers to exchange data between applications and between clients and servers, and that it is the core data format for Web Services used to manage many cloud computing environments, such as Amazon’s AWS.

It’s important, then, to remember that many – nay, most – of the security risks associated with SOA were actually due to its reliance on XML. SOA is an architecture and aside from certain standards prescribing the use of data formats – SOAP, WSDL, UDDI – it carried with it very few tangible security risks. XML, on the other hand, carries with it a wide variety of security risks that have not suddenly “gone away” simply because it’s being used to implement APIs in both RESTful and SOAPy environments. …

Jon Brodkin asserts “A project code-named Sydney is described as a security structure for cloud environments” and quotes Azure senior architect Hasan Alkhatib at the Xconomy Forum:Cloud in his Microsoft talks cloud computing security and plans to offer private cloud software article of 12/10/2009 for NetworkWorld:

With Microsoft's Azure cloud computing platform set to go live on New Year's Day, the company is looking ahead to later in 2010 when it will unveil a new security structure for multi-tenant cloud environments as well as private cloud software based on the same technology used to build Azure.

Hasan Alkhatib, the Azure senior architect, described the Microsoft security project code-named "Sydney" Thursday at an Xconomy forum on cloud computing held at Microsoft's New England R&D Center in Cambridge, Mass. …

In addition to embedding greater security into the public cloud, Alkhatib said Microsoft is planning to help customers build private cloud networks within their own datacenters, using the same software Azure is based on.

"Every customer says 'where can we get a private cloud?'" Alkhatib said. "We're building them. Within a short period of time private clouds will be available with the same technology we've used to build Windows Azure."

However, Alkhatib said he thinks private clouds lack most of the benefits of public clouds, and focused most of his talk on the Azure services that will be offered publicly over the Web.

Project Sydney, unveiled last month at Microsoft's Professional Developers Conference, addresses security in virtualized, multi-tenant environments in which customers are typically sharing data center resources. …

<Return to section navigation list> 

Cloud Computing Events

David Chou reports FREE Windows Azure Training in Irvine on December 15-17 in this 12/10/2009 post:

Metro Instructor Led Training Windows Azure

Abstract: The Microsoft Developer & Platform [Evangelism team] is pleased to announce a Metro Instructor Led Training event focused on developers covering the Windows Azure Platform, for Microsoft partners and customers. Delivered through workshop-style presentations and hands-on lab exercises, the workshop will focus on three key services of the Windows Azure Platform – Windows Azure, SQL Azure and .NET Services.

This event is targeted at partners and customers who have projects that they are potentially looking to deploy to the Windows Azure Platform.

Attendee Prerequisites

This workshop is aimed at developers, architects and systems designers with at least 6 months practical experience using Visual Studio 2008 and C#.

Agenda Topics

  • An introduction to Windows Azure
  • Working with Windows Azure storage
  • An introduction to SQL Azure
  • Building applications with SQL Azure
  • An introduction to .NET Services
  • Building solutions using the .NET Service Bus

Fees: This event is free of charge. However, delegates are responsible for booking and paying for their own travel and accommodation. Both breakfast and lunch will be provided on session days.

Note: Participation in the Windows Azure platform Metro event requires an NDA. The products and technologies cannot be discussed outside of this NDA.

Date:  12/15 to 12/17/2009 9:00 AM-5:00 PM

Location: Microsoft Technology Center, Irvine, California       

Click here to register. Space is limited to two attendees per company.

Questions: Please email metroevt@microsoft.com

David is an Architect in the Developer & Platform Evangelism organization at Microsoft, focused on collaborating with enterprises and organizations in many areas such as cloud computing, SOA, Web, RIA, distributed systems, security, etc., and supporting decision makers on defining evolutionary strategies in architecture.

The requirement for signing an NDA to attend an Azure training session is intriguing. There’s no mention of an NDA on the signup page.

The AppFabric Team reports Scheduled Maintenance Notification - Windows Azure platform AppFabric (Dec 17th, 2009):

Windows Azure platform AppFabric will be undergoing planned maintenance on December 17th, 2009, starting at 9AM PST, and ending at 12PM PST due to a major software upgrade. We apologize in advance for any inconvenience. [Emphasis added.]

When: START: December 17th, 2009, 9AM PST; END: December 17th, 2009, 12PM PST

Impact Alert: Windows Azure platform AppFabric Service Bus and Access Control will be unavailable during this period.

<Return to section navigation list> 

Other Cloud Computing Platforms and Services

B. Guptill, M. Koenig and B. McNee wrote this SAP Influencer Summit Reveals Cloudy Strategy, Path, and Challenges Research Alert for Saugatuck Research on 12/11/2009 (site registration required):

On December 8th and 9th, Saugatuck participated with almost 300 other industry thought leaders in SAP’s Influencer Summit in Boston, MA.

During the Summit (labeled “The Clear Path Forward”), SAP laid out a coordinated vision and a series of plans and actions designed to drive the legacy enterprise software developer into the reality of Cloud Computing, while attempting to maintain its legacy enterprise software business.

Presentations and discussions articulated evolving strategies for overall business, development, offerings and partner ecosystems related to both the core (enterprise-targeted) Business Suite as well as the portfolio of SMB-targeted offerings.  Key points that Saugatuck took away from the Summit include the following:

  • SAP is clearly in the midst of re-positioning itself …
  • Change to this point has not been easy for SAP …
  • Why is it Happening? …
  • But we must emphasize that SAP embracing a Cloud-driven future is not the same as SAP moving everything to the Cloud …
  • SAP Influencer Summit only further crystallized the increasingly obvious fact that Cloud Computing changes everything for IT vendors …
  • We do believe that SAP has learned first-hand that such fundamental changes are wrenching and expensive, and can take more time and effort than initially planned …

The authors embellish their key takeaway points with considerable detail.

• James Governor reports:

I am on the record as saying SAP Business ByDesign, the company’s SaaS [cloud-based] suite for medium-sized businesses, is going to be a huge success, a somewhat contrarian position. Three pieces of news from the Summit confirmed my thinking:

  • Blades powerful enough to run the system at reasonable cost are now available
  • SAP has decided to get into bed with Microsoft and an army of .NET developers and VARs by tightly integrating ByDesign with Visual Studio.NET for model-driven development and UI work and customisation. Think about it – SAP now sees Microsoft code as part of its Platform as a Service play for ByDesign, extending the Java/Netweaver based back end and creating margin opportunities for firms that want to sell, say, Carpet retail apps, or other micro-verticals.
  • SAP is now finally saying multi-tenancy is solved.

in his SAP: Out with the Old, Shrugging Off The Tag post of 12/10/2009 to the Monkchips blog.

Ray Wang offers a related analysis in his detailed Event Report: 2009 SAP Influencer Summit - SAP Must Put Strategy To Execution In Order To Prove Clarity Of Vision of the same date:

Re-innovation Now At The Heart Of SAP’s Focus And Strategy

SAP has faced a rough two years.  From the continuing market pressure on new license revenue, the false-start launch of Business ByDesign (ByD), management restructuring, and issues with user groups and Enterprise Support, one could kindly say it’s been a brutal period.  Looking forward to a fresh start in 2010, senior executives and key personnel have been hard at work “re-innovating” SAP at both the product and marketing level.  As intended, many of the 275 analysts, bloggers, customers, influencers, and media attendees of this year’s SAP Influencer Summit left Boston with the perception that the company is in the midst of such a transition. However, the clarity of that message and the perception of innovation depended on the topic at hand. …

Ray goes on to list “[f]ive key themes [that] drove most formal and informal conversations throughout the event” and then “[a]ppl[ies] a quick Vendor Scorecard grading system, … a subjective evaluation of SAP’s 2009 efforts to date.” He also includes many useful links to other analysts’ take on SAP’s current status.

• Maureen O’Gara reports Savvis To Peddle Cisco-Based Private Clouds in this 12/11/2009 post:

… Cisco's first big server score, Savvis will make Cisco's industry-alienating Unified Computing System (UCS) the cornerstone of Symphony, the private cloud platform it previously called Project Spirit, and use it as the basis of the virtualized private data centers it hawks to the enterprise.

UCS includes servers, networking, storage access and virtualization managed as a single system. Savvis will also be using Cisco's Unified Service Delivery solution to combine the UCS architecture with Cisco's Nexus switches, ACE and MPLS next-generation network.

The pair has a reference architecture, tested and validated at Cisco, that makes Symphony, which is still in beta and not due out until spring, into Infrastructure-as-a-Service (IaaS) that Savvis can sell to the enterprise.

Savvis is one of the first service providers to deliver Cisco-based private clouds. The companies will co-market.

Savvis runs 28 data centers globally and claims 4,000 customers, including 40% of the Fortune 100. …

Apparently, Savvis will compete primarily with Amazon Web Services. See Peter Burrows’ Cisco story for Business Week below.

• Apparent NetworksPerformance Advisory: Amazon Web Services Outage 12/9/09 provides additional detail on the 44-minute outage:

Apparent Networks' Cloud Performance Center (www.appparentnetworks.com/cpc) recently confirmed that Amazon Web Services experienced a connectivity loss, which caused an Amazon Elastic Compute Cloud services outage in a single availability zone in its Northern Virginia data center.  The outage occurred on December 9, 2009, beginning at approximately 3:34AM EST and lasting until 4:19AM EST, about 44 minutes. Amazon later confirmed the connectivity loss was due to a power failure.

During that time, access to systems in the Northern Virginia Data Center was unavailable.  Businesses utilizing Amazon Elastic Compute Cloud services from this datacenter were affected. …

Peter Burrows writes “The CEO says he's making headway in efforts to challenge Dell, HP, and other computer companies in the $40 billion market” to set the stage for his 12/11/2009 Cisco's Chambers: Tough Talk on Data Centers article for Business Week’s “Computers” section:

Cisco Systems Chief Executive John Chambers had pointed words for journalists skeptical of his company's foray into equipment for big corporate data centers, an area of technology traditionally dominated by computer makers. "We've taken on big peers in the past," Chambers said over a bento-box lunch for members of the media at Cisco's financial analyst conference on Dec. 8.

He was referring to Cisco's move early this decade into corporate telephony, a market where it now has a 33% share. "We didn't know anything about that market when we got started," he said. By contrast, data centers are run by big companies, which have been relying on Cisco (CSCO) for networking gear for 25 years. "That's our home," Chambers noted during the break in a series of presentations for analysts.

Tough talk for a company that's just nine months into an ambitious—and some say risky—incursion into this $40 billion market. The first step was to enter the cutthroat market for servers, the powerful computers that handle Web traffic and hefty IT jobs. On Mar. 16, Cisco unveiled its first server, the centerpiece of a "unified computing" vision that placed the networking company in competition with computer giants Dell (DELL), Hewlett-Packard (HPQ), and IBM (IBM), all longtime partners that have resold billions of dollars' worth of Cisco gear each year. Cisco's servers began shipping in August.

The article goes on to analyze Cisco’s Unified Computing System (UCS) in several private clouds.

<Return to section navigation list> 
