
4.5 Microsoft talking about Cloud OS and private clouds: starting with Ray Ozzie in November, 2009

Part of: Microsoft Cloud OS vision, delivery and ecosystem rollout

1. The Microsoft way
2. Microsoft Cloud OS vision
3. Microsoft Cloud OS delivery and ecosystem rollout
4. Microsoft products for the Cloud OS (separate post)

4.1 Windows Server 2012 R2 & System Center 2012 R2
4.2 Unlock Insights from any Data – SQL Server 2014
4.3 Unlock Insights from any Data / Big Data – Microsoft SQL Server Parallel Data Warehouse (PDW) and Windows Azure HDInsights
4.4 Empower people-centric IT – Microsoft Virtual Desktop Infrastructure (VDI)
4.5 Microsoft talking about Cloud OS and private clouds: starting with Ray Ozzie in November, 2009

4.5.1 Tiny excerpts from official executive and/or corporate communications
4.5.2 More official communications in detail from executives and/or corporate

4.5 Microsoft talking about Cloud OS and private clouds: starting with Ray Ozzie in November, 2009

4.5.1 Tiny excerpts from official executive and/or corporate communications

From: Ray Ozzie & Bob Muglia: PDC 2009 [speech transcripts, Nov 17, 2009]

RAY OZZIE:

Windows Azure, which we introduced right here on this stage last year, is our cloud computing operating environment, designed from the outset to holistically manage extremely large pools of computation, storage and networking, all as a single, dynamic, seamless whole, as a service. It’s a cloud OS designed for the future, but made familiar for today.

Microsoft’s New Leader of Server and Tools: ‘Our Mission Is to Cloud-Optimize Every Business’ [feature article for the press, June 22, 2011]

Satya Nadella shares his thoughts on trends in the technology industry and Microsoft’s unique position providing infrastructure to move the industry forward.

… “As the industry moves more and more towards the public cloud – which will take time – we’ll move from the private cloud ‘datacenter OS’ that represents thousands of processing cores to a ‘public cloud OS’ that will need to understand a million cores. Our customers will want a vendor who is both battle-tested in the operating system and in the cloud scale services. Microsoft will be that vendor.”

“You can’t head-fake your way into running a public cloud service,” he notes. “You have to live it.” …

alias “MS private cloud PR”: Microsoft Brings the Cloud Down to Earth for Enterprises [press release, Jan 17, 2012]

System Center 2012 is a true “private cloud builder.”

  • All Together Now: Private Cloud Simplicity and Best Economics
  • The Microsoft Private Cloud: Built for the Future. Ready Now

From: Meet the Team That Puts ‘Amazing Power’ at People’s Fingertips [Microsoft feature article for the press, Feb 14, 2012]

Members of the Windows Server team speak with Microsoft News Center [MNC] about their groundbreaking work in moving customers to the cloud—and what else they find fascinating.

  • MNC: How is your work going to change the world?
  • MNC: What’s next for you at Microsoft?

Windows Server “8” beta available now! [Windows Server Blog, March 1, 2012]

Bill Laing
Corporate Vice President, Server and Cloud

Microsoft Ushers in the ‘Era of the Cloud OS’ [press release, June 11, 2012]

Company shares updates to cloud platform and developer tools.

  • Connecting With the Cloud
  • Enabling Developers With Cloud Tools

Welcome to the Era of the Cloud OS for Infrastructure! [The Official Microsoft Blog, June 11, 2012]

Posted by Satya Nadella
President, Server & Tools Business, Microsoft

From: Satya Nadella, Scott Guthrie and Jason Zander: TechEd 2012 Day 1 Keynote [speech transcripts, June 11, 2012]

SATYA NADELLA:

What we are going to discuss over the next 90 minutes is the modern datacenter and the modern application framework that make up the cloud operating system, the basic underpinnings for this new era of connected devices and continuous services.

Microsoft Announces New Cloud Opportunities for Partners [press release, July 10, 2012]

New guidance, training and programs for Windows Server 2012 and Windows Azure unveiled at Worldwide Partner Conference.

In addition, the new program announced on stage, Switch to Hyper-V, will allow partners to grow their virtualization, private and hybrid cloud computing practices while also helping customers improve IT agility at a lower cost with Microsoft’s cloud infrastructure.

In addition, in a keynote that further reinforced how Microsoft is working with its partners to transform businesses throughout the world, Microsoft Business Solutions President Kirill Tatarinov highlighted the incredible opportunity in the year ahead for partners focused on selling business solutions based on Microsoft Dynamics.

A New Era Together: Partners and the Microsoft Cloud OS [The Official Microsoft Blog, July 10, 2012]

Posted by Takeshi Numoto
Corporate Vice President, Server & Tools Business, Microsoft

  • Partner Opportunity with the Cloud OS

From: Satya Nadella: Worldwide Partner Conference 2012 Day 2 Keynote [speech transcript, July 10, 2012]

The thing that I want to talk today about is the back end and how the back end is changing in the next era. And we refer to this as the cloud operating system or the cloud OS.

Windows Server 2012 Powers the Cloud OS [press release, Sept 4, 2012]

New server is built from the cloud up for the modern datacenter.

  • Enabling the Modern Datacenter
  • Customers Find Success With Windows Server 2012

Satya Nadella: Windows Server 2012 Launch Keynote [speech transcript, Sept 4, 2012]

For more than 50 years information technology has powered global innovation. And today, IT is in the midst of radical change as cloud computing transforms the landscape.

How can your organization take advantage of the new opportunities? Imagine data centers without boundaries, capacity on-demand. Imagine information crossing the globe seamlessly and securely, a modern platform for the world’s applications.

At Microsoft we unlock the full range of possibilities. We call it the Cloud OS and it’s here now.

Microsoft Reaches Agreement to Acquire StorSimple [press release, Oct 16, 2012]

Microsoft to acquire leader in Cloud-integrated Storage.

From: Satya Nadella, Scott Guthrie and Jason Zander: Build Day 2 [speech transcripts, Oct 31, 2012]

On the client side, we’ve talked about how we’ve reimagined Windows from the developer platform to the user experience. Again, in support of the kind of applications that you’re building for the devices today. These fluid, touch-first applications that also take advantage of capabilities in Windows RT to be able to truly bring to life all the new application capabilities.

Very similarly on the back end, we’re reimagining Windows for cloud services. It’s a pretty concrete thing for us. We refer to this as the Cloud OS. At the hardware level, for example, the core of any operating system is to think about the hardware abstraction. And the hardware abstraction is going through a pretty radical change. At the atomic level, we’re bringing compute, storage and network together, and then scaling it to a datacenter on a multidatacenter scale. So this is no longer about a single server operating system, but it’s about building distributed, virtualized infrastructure that includes storage, compute, and network and spans, if you will, across the datacenters.

alias “OS Moment PR”: Microsoft Advances the Cloud OS With New Management Solutions [press release, Jan 15, 2013]

New offerings deliver on the commitment to help customers and partners deliver cloud services and manage connected devices.

  • Transforming the Datacenter
  • Hosting Service Providers and the Cloud OS
  • Unified PC and Device Management

What is the Cloud OS? [The Official Microsoft Blog, Jan 15, 2013]

post from Michael Park, Corporate Vice President of Marketing in the Server & Tools Business at Microsoft

The Cloud OS: New solutions available today advance Microsoft’s vision [C&E News Bytes Blog, Jan 15, 2013]

Solutions announced today include:

  • General Availability of System Center 2012 Service Pack 1: …
  • Windows Intune: …
  • Windows Azure services for Windows Server: …
  • Global Service Monitor: …
  • System Center Advisor: …

Transform Your Datacenter with System Center 2012 SP1 [Server & Cloud Blog, Jan 15, 2013]

Mike Schutz
General Manager, Windows Server and Management Product Marketing

  • Windows Server 2012 and SQL Server 2012 Support
  • Software Defined Networking (SDN)
  • DevOps with Global Service Monitor and Visual Studio
  • Hybrid Cloud Management

Delivering Unified Device Management with Windows Intune and System Center 2012 Configuration Manager SP1 [Windows Intune blog, Jan 15, 2013]

Mike Schutz

General Manager, Windows Server and Management Product Marketing

This blog post highlights new device management capabilities in Windows Intune and System Center 2012 Configuration Manager SP1.

  • Windows Intune addresses new challenges IT departments face when managing devices, including: …
  • Configuration Manager 2012 SP1 enhancements, including: …
  • Endpoint Protection 2012 SP1 enhancements, including: …

Modern Lifecycle on the Cloud OS [Brian Harry’s blog, Jan 15, 2013]

Brian Harry
Microsoft Technical Fellow,
Product Unit Manager for Team Foundation Server.

Our approach to Enterprise DevOps is anchored in Visual Studio 2012 and System Center 2012. The wave of Cloud OS announcements today integrates these with a bunch of new application lifecycle management capabilities. These include:

  • Global Service Monitor (GSM)
  • Lab Management & Windows 2012
  • Incident integration

Michael Park and Mike Schutz: Cloud OS Announcement [speech transcripts, Jan. 15, 2013]

Corporate vice president of Server & Tools Marketing Michael Park is going to lead off with a brief overview of the Cloud OS, and then our general manager Mike Schutz here in Server & Tools will talk through the new offerings and how they fit into the Cloud OS story for customers and partners.

alias “Hybrid Cloud PR”: Microsoft unleashes fall wave of enterprise cloud solutions [press release, Oct 7, 2013]

New Windows Server, System Center, Visual Studio, Windows Azure, Windows Intune, SQL Server, and Dynamics solutions will accelerate cloud benefits for customers.

  • Hybrid infrastructure and modern applications
  • Enabling enterprise cloud adoption
  • Data platform and insights
  • People and devices in the cloud
  • Software as a service business solutions

New Windows Server 2012 R2 Innovations – Download Now [Windows Server Blog, Aug 6, 2013]

Windows Server 2012 R2 is in preview right now and ready for your evaluation. We have been rolling out detailed information on our Cloud OS vision through Brad Anderson’s What’s New in 2012 R2 blog series. That will continue but we thought you would like a short consolidated list for consideration. Here are some key innovations in Windows Server 2012 R2.

  • Storage transformation – Delivers breakthrough performance at a fraction of the cost
  • Software defined networking – Provides new levels of agility and flexibility
  • Virtualization and live migration – Provides an integrated and high-performance virtualization platform
  • Access & Information Protection – Empowering your users to be productive while maintaining control and security of corporate information with Windows Server 2012 R2
  • Java application monitoring – Enables deep application insight into Java applications.

This is by no means a comprehensive list of new features and benefits, but we just wanted to give you some information on the key focus areas.

Announcing the General Availability of Windows Server 2012 R2: The Heart of Cloud OS [Windows Server Blog, Oct 18, 2013]

For years now, Microsoft has been building and operating some of the largest cloud applications in the world. The expertise culled from these experiences along with our established history of delivering market-leading enterprise operating systems, platforms, and applications has led us to develop a new approach for the modern era: the Microsoft Cloud OS.

Delivered as an enterprise-class, simple, and cost-effective server and cloud platform, Windows Server 2012 R2 delivers significant value across seven key capabilities:

  • Server virtualization.
  • Storage.
  • Networking.
  • Server management and automation.
  • Web and application platform.
  • Access and information protection.
  • Virtual Desktop Infrastructure.

To compete in the global economy and keep up with the pace of innovation, IT organizations must improve their agility, their efficiency, and their ability to better manage costs while enabling their business and end users to stay continuously productive.

alias “Cloud OS Network PR”: Leading cloud service providers around the globe bet on Microsoft [press release, Dec 12, 2013]

Cloud OS Network partners provide customers with consistent cloud platform.

  • Hybrid benefits for customers
  • Cloud service provider opportunity
  • Worldwide reach


4.5.2 More official communications in detail from executives and/or corporate

From: Ray Ozzie & Bob Muglia: PDC 2009 [speech transcripts, Nov 17, 2009]

RAY OZZIE:
Earlier, when I talked about the three screens and a cloud environment, I talked fairly abstractly about this cloud computing back-end. Sometimes I might have referred to this back-end as a server or sometimes as a service. And for customers it really doesn’t matter, and that’s entirely the point, because our software plus services strategy is centered on the notion of technology convergence and skills leverage across both.
Windows Azure, which we introduced right here on this stage last year, is our cloud computing operating environment, designed from the outset to holistically manage extremely large pools of computation, storage and networking, all as a single, dynamic, seamless whole, as a service. It’s a cloud OS designed for the future, but made familiar for today.
Windows Azure at its core is Windows. It’s Windows Server. You should think of it as a vast, homogeneous array of Windows Server hardware and virtualized Windows Server instances, and all these servers are under the control of a sophisticated, highly parallel management system called the Azure Fabric Controller, which you can kind of think of as an extension of System Center’s management capabilities in the enterprise.
With Windows Azure, Windows Server, and System Center, there’s one coherent model of managing this infrastructure as a service across Microsoft’s public cloud to private cloud to clouds of our partners who host.
To most developers, to developers like you, Windows Azure appears as a model based extension to Visual Studio, enabling you to build apps that leverage your skills in SQL, IIS, ASP.NET, and .NET Framework.
Alternatively, and of course it’s your choice, you might leverage your skills by using MySQL and PHP within Azure, or you might instead take advantage of our new Azure tools for Java and Eclipse.
Reaching all developers is incredibly important to us, and, working closely with the community, developing for Windows Azure this year has come a long, long way.
It was only one year ago at PDC ’08 that we launched Azure by inviting you as PDC participants to our Community Technology Preview. We committed to spending the year engaged with you, listening and learning, and reshaping Azure before we took it live.

Microsoft’s New Leader of Server and Tools: ‘Our Mission Is to Cloud-Optimize Every Business’ [feature article for the press, June 22, 2011]

Satya Nadella, the new president of Microsoft’s Server and Tools Business (STB), recently told Microsoft employees that “Microsoft has always stood for democratizing access to computing platforms. We did it with PC-based server computing, the biggest democratizing force ever. We have a similar opportunity now with cloud computing that will make it possible for companies of all sizes, and countries of all GDPs, to really take advantage of latest technology to improve productivity and people’s lives.”

Nadella added that his top priority as STB president is to cultivate a vibrant engineering community armed with the best tools around. That community in turn will lead the company through a computing shift every bit as transformational as the rise of the PC.

As the industry moves toward what he calls the “post-virtualization era,” Nadella reflects on the industry trends that are driving the shift. First, the notion of a modern operating system is shifting from software running on a single, physical server, to software running across an entire datacenter of servers. Services traditionally managed by a machine – storage, networking, compute – are no longer bound to a particular machine. This notion of an “elastic” infrastructure can have significant business benefits for customers. Moreover, he says, the data itself is becoming a platform developers can build on that leads to a whole new set of innovative application scenarios.

Delivering companies new value through the cloud will be at the center of everything STB does moving forward: “Our strategy in a nutshell is to cloud-optimize every business,” he said. That means offering businesses on-demand, scalable infrastructure and the ability to tap massive amounts of data for new business insight.

Nadella, a Microsoft veteran since 1992, was appointed to his new role in February. As president of STB, he is tasked with leading Microsoft’s enterprise transformation into the cloud and providing the technology roadmap and vision for the future of business computing.

Nadella said his vision for STB has been shaped by the previous stops on his Microsoft journey. Up until a few months ago, he was senior vice president of R&D for Microsoft’s Online Services Division (OSD). There he oversaw the technical vision and engineering of some of the biggest Web services in the world, such as Bing, MSN, and adCenter. Those online operations illuminated the sheer scale of infrastructure needed to run them; Bing alone is powered by 250,000 servers, which manage upwards of 150 petabytes (1 petabyte = 1 quadrillion bytes).

You can’t head-fake your way into running a public cloud service. You have to live it.

– Satya Nadella, President, Server & Tools Business, Microsoft

The last time Nadella had thought about computing at that scale was in the abstract at graduate school. His boss at the time, OSD President Qi Lu, told him to embrace the new perspective.

“Qi would stress to me, ‘Look, as long as you don’t get Internet scale in its full-glory detail, you just don’t get the systems you need to build going forward,’” he said.

During four and a half years at OSD, Nadella absorbed the lesson. He said his time at OSD prompted him to relearn infrastructure – something he wants to help Microsoft’s server business to do as it presses on into cloud computing. Massive systems infrastructure is required to handle workloads like Bing or Microsoft adCenter, which runs 20,000 simultaneous auctions each time a search query happens. That scale has shaped his thinking about the back-end infrastructure Microsoft must build going forward.

“As the industry moves more and more towards the public cloud – which will take time – we’ll move from the private cloud ‘datacenter OS’ that represents thousands of processing cores to a ‘public cloud OS’ that will need to understand a million cores. Our customers will want a vendor who is both battle-tested in the operating system and in the cloud scale services. Microsoft will be that vendor.”

“You can’t head-fake your way into running a public cloud service,” he notes. “You have to live it.”

Nadella doesn’t have to think twice when asked about his plans for STB’s future. “We have the leading server operating system share and the most widely used database, professional developer tools, and mission critical developer framework in the industry. But we can’t be complacent – we will continue to grow our existing business, but the cloud will shape the future of the industry, and we aim to be the industry leader.”

alias “MS private cloud PR”: Microsoft Brings the Cloud Down to Earth for Enterprises [press release, Jan 17, 2012]

System Center 2012 is a true “private cloud builder.”

In an online broadcast today from Microsoft Corp. headquarters, Satya Nadella, president of Microsoft Server and Tools Business, laid out how Microsoft’s private cloud solution will help businesses move faster, save money and better compete in 2012. He highlighted how companies, such as webcast participants Lufthansa Systems, T. Rowe Price and Unilever, can use Microsoft System Center 2012 to build and operate private clouds for the delivery of business applications across both private and public cloud platforms. System Center 2012 is available today in a Release Candidate as a single, integrated private cloud management solution for the first time.

“IT leaders tell me that private cloud computing promises to help them focus on innovation over maintenance, to streamline costs and to respond to the need for IT speed,” Nadella said. “We are delivering on that promise today. With System Center 2012, customers can move beyond the industry hype and speculation, and progress into the here and now of private cloud.”

All Together Now: Private Cloud Simplicity and Best Economics

New advances in System Center 2012 demonstrate Microsoft’s commitment to easing the acquisition, deployment and economics of private cloud computing.

“A private cloud is our answer to corralling our server infrastructure into a single entity we can use to more rapidly deliver services that really matter to our business,” said Peter Daniels, vice president of IT at T. Rowe Price. “System Center 2012 is truly a game changer.”

System Center 2012 integrates eight separate component products into one unified solution, streamlining installation and reducing the time it takes to deploy from days down to hours. The number of product versions has also been simplified, so customers will be able to choose between the Standard and Datacenter editions of the product, based on their virtualization requirements. And because System Center 2012 Datacenter edition licensing covers unlimited virtual machines, customers can continually grow their private clouds without additional licensing costs for virtualizing their infrastructure and applications.

The Microsoft Private Cloud: Built for the Future. Ready Now

Lufthansa Systems and Unilever are also relying on System Center 2012 and the Microsoft private cloud.

“We are making the move to cloud computing across our company, and after looking at our options, Microsoft offers the right solutions for us,” said Holger Berndt, head of Microsoft Servers at Lufthansa Systems. “With the integrated approach and technology, we can use the people and skills we have in place now to build the private cloud services we need to meet the complex IT requirements of our customers. Microsoft brings it all together, including the clear path to public cloud on Windows Azure.”

“Our private cloud will help us meet our goal of doubling Unilever’s business without increasing our environmental footprint,” said Mike Royle, enterprise services IT director at Unilever. “Working with Avanade, we are betting on System Center 2012 as the management platform to extend our investments in virtualization toward private cloud, to automate processes, and to ensure the reliability of our infrastructure and application services.”

More information is available at the Microsoft Server and Cloud Platform website, including the on-demand broadcast, links to the Microsoft private cloud evaluation software and more. The conversation on Twitter can be followed at #MSFTprivatecloud.

From: Meet the Team That Puts ‘Amazing Power’ at People’s Fingertips [Microsoft feature article for the press, Feb 14, 2012]

Members of the Windows Server team speak with Microsoft News Center about their groundbreaking work in moving customers to the cloud—and what else they find fascinating.

  • Betsy Speare, a principal program manager lead in the Windows Server Manageability team
  • Erin Chapple, a partner group program manager in the Server and Cloud Division
  • Jeffrey Snover, a Distinguished Engineer who is also the lead architect for Windows Server

What Chapple, Speare, Jeffrey Snover …, and the rest of their team are working on right now is the next version of Windows Server, code-named “Windows Server 8” [Windows Server 2012], which will provide better management capabilities, increased security and significant cost savings. Windows Server 8 will also help many Microsoft customers move more of their business to the cloud.

“Windows Server 8 really sets us up to enable the little guy to get ahead,” says Speare, whose responsibilities include overseeing Group Policy, the most widely used management tool in the world. “That’s what the cloud does; it puts this amazing power at everyone’s fingertips. With this release, we’re building the platform for that. When people who aren’t deeply technical have the capability to create solutions because the power is right there, it will be amazing to see what happens.”

Microsoft News Center (MNC) recently sat down with Chapple, Speare and Snover, to talk about Windows Server and life in general.

MNC: How is your work going to change the world?

Snover: Servers really are changing the world. Literally. Look at all these mobile phones. The reason why they exist is because of servers at the back end. In the past, when you had to run entire applications on your client device, the client had to be a big monster machine or you couldn’t do stuff. Now that most of the processing is done in a data center, you can get a great experience on a very small device.

Windows Server 8 is the biggest, most transformational server release we’ve ever had. It’s not just the great advances we’ve made in storage, networking and virtualization. What’s most transformational is the change of identity. In the past, we always viewed Windows Server as an operating system for a single server. With Windows Server 8, we now see it as a cloud operating system, which is to say an OS for lots of servers and all the devices that connect them. That means we’re able to give customers a far more coherent experience at lower cost and lower effort on their part.

Chapple: One of the key things we work on is a technology called PowerShell. As you think about what’s happening in the world today, with the proliferation of servers, devices and services, our customers need a way to manage all those components that is efficient, one-to-many, repeatable and consistent. PowerShell is our answer to that.

Customers tell us they feel overwhelmed by the number of things they have to do to manage their environment. PowerShell can help them gain control over their environment and get them out of that world of chaos.
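The one-to-many, repeatable management model Chapple describes is, concretely, what PowerShell remoting provides: a single command fanned out across many machines at once. A minimal sketch (the server and service names below are hypothetical placeholders, not from the article):

```powershell
# Query a service's state on many servers at once via PowerShell remoting.
# Server names are illustrative placeholders.
Invoke-Command -ComputerName web01, web02, web03 -ScriptBlock {
    Get-Service -Name W3SVC | Select-Object Status
}

# The same one-to-many pattern makes changes repeatable and consistent:
Invoke-Command -ComputerName web01, web02, web03 -ScriptBlock {
    Restart-Service -Name W3SVC
}
```

Because the script block is identical on every target, the result is the consistent, repeatable administration the interview is describing, instead of hand-configuring each server.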

Snover: By enabling the move to the cloud, servers are even transforming the way we do science. In the past, science was driven by hypotheses. Someone would think about the world, generate a hypothesis, and then run a set of experiments to validate or invalidate it. But with the large data centers we have today, we can take an entirely new approach. We’re now able to configure these servers to throw massive computing power at a problem, and to reverse engineer a hypothesis based on what the data is telling us. This allows us to solve bigger problems than we’ve ever been able to solve before.

MNC: What’s next for you at Microsoft?

Snover: What’s next for me is figuring out how we take this vision of a cloud OS and break it down into discrete steps. That’s really a 10-year-plus vision. It’s a dramatic increase in the scope of what we want to do. So how do we break that down and ensure that Windows has a smooth transition between where we are and where we need to be?

Bill Gates once said, “Vision is cheap.” At the time, I thought he was a bit of a jerk for saying that. But I then realized that he was right. Vision is cheap. The hard part is figuring out how to get from here to there. There have been many projects with grand visions that have run themselves onto the rocks because no one could break them down into a step-by-step approach. That’s what my job is.

Chapple: I was fortunate enough to take my sabbatical last fall and travel the world for three months, which gave me a chance to clear my head, recharge, and figure out what I want to do. It gave me this great perspective.

I feel like I’m at the end of one journey and the beginning of another. I’ve been working in manageability for the last five years or so, and when I started in manageability it was a four-letter word. People were like, “I don’t want to think about how I make my product manageable; I want to just build great features.” With the move to the cloud and the move to services, the manageability of our system has become more of a focal point and an asset. With Windows Server 8, we really have pulled all the pieces together and we’re delivering a great solution.

I am just so proud of the work we’re doing. We’re at this inflection point from a cloud perspective. There’s a great opportunity to think about what we want to do with Windows Server, and how we hope to help people migrate to the cloud. So I’m all in, in terms of figuring out what the next turn of the crank means for Windows Server. I think we have more opportunities than we ever had in the past, and it’s exciting to be part of that.

Windows Server “8” beta available now! [Windows Server Blog, March 1, 2012]

Bill Laing
Corporate Vice President, Server and Cloud

The beta of Windows Server “8” is now available for IT professionals and software developers around the world to download, to evaluate, and to give us feedback on.

In September we introduced Windows Server “8” with a preview to help developers and hardware partners prepare new and existing applications, systems and devices. The response from that community, along with hundreds of customers in our early adopters program, has been incredibly positive. A common theme of feedback has been how broad and deep the new capabilities are.

Now is the time for you, IT professionals in organizations of all sizes, to get your hands on this new release, discover the new capabilities and contribute to the development of what we call the cloud-optimized OS.

I’ll highlight in this post just a few examples of new capabilities that you’ll want to explore.

With the new Hyper-V we are taking virtualization above and beyond to provide a multi-tenant platform for cloud computing. For example, with Hyper-V Network Virtualization you can create virtual networks so different business units, or even multiple customers, can seamlessly share network infrastructure. You will be able to move virtual machines and servers around without losing their network assignments.
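As a rough illustration of the multi-tenant scenario described above, Windows Server 2012 shipped network-virtualization cmdlets that map a tenant VM’s “customer address” onto the shared “provider address” space, letting tenants reuse overlapping IP ranges on the same fabric. All addresses, IDs, and MACs below are illustrative, so verify the cmdlet reference for your build:

```powershell
# Sketch: register a lookup record so tenant VM 10.0.0.5 (customer address)
# is carried over the shared fabric via 192.168.1.10 (provider address).
# Values are illustrative placeholders only.
New-NetVirtualizationLookupRecord -CustomerAddress "10.0.0.5" `
    -ProviderAddress "192.168.1.10" `
    -VirtualSubnetID 5001 `
    -MACAddress "101010101105" `
    -Rule "TranslationMethodEncap"
```

In practice these records are created and kept in sync by System Center Virtual Machine Manager rather than by hand, which is what allows VMs to move without losing their network assignments.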

In Windows Server “8” we are delivering high availability and disaster recovery through software technology on much more cost effective hardware. For example, with File Server Transparent Failover you can now more easily perform hardware or software maintenance of nodes in a File Server cluster by moving file shares between nodes with little interruption to server applications that are storing data on those file shares.

We’re also delivering a tremendous amount of new capabilities for multi-machine management and automation. You will want to explore the dramatic new improvements to Server Manager, as well as the new Windows PowerShell. With 2,300 cmdlets provided out of the box, Windows PowerShell allows you to automate everything you can do manually with the user interface. And, with technologies like IntelliSense, we’ve made it very easy for you to master all of that power.
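For a sense of what that automation looks like in practice, here is a minimal sketch: discovering the available cmdlets, then scripting a task you would otherwise click through in Server Manager (the Web-Server role is just one example):

```powershell
# Discover what's available out of the box: count and browse the cmdlets.
Get-Command -CommandType Cmdlet | Measure-Object
Get-Help Install-WindowsFeature -Examples

# Automate what Server Manager does in the UI, e.g. add the Web Server (IIS)
# role with its management tools (ServerManager module, Windows Server 2012):
Install-WindowsFeature -Name Web-Server -IncludeManagementTools
```

The same cmdlets accept a `-ComputerName` parameter, so the step above can be repeated against remote servers without logging on to each one.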

Additionally, Windows Server “8” provides a powerful server application platform that enables you to develop and host the most demanding of application workloads. For example, with .NET Framework 4.5 you can take advantage of new async language and library support to build server and web applications that scale far beyond what other platforms provide. Our new IIS 8 web server provides better security isolation and resource sandboxing between applications, native support for web sockets, and the ability to host significantly more sites on a server.

This is just a brief taste of the hundreds of features and capabilities you will find in the beta. (My team has written a number of other posts you can read here.) If you have been using and providing feedback on the developer preview of Windows Server “8,” thank you! I can’t wait for more people to start trying out Windows Server “8” and letting us know what they think.

Microsoft Ushers in the ‘Era of the Cloud OS’ [press release, June 11, 2012]

Today at the 20th annual TechEd North America conference, Microsoft Server and Tools Business President Satya Nadella welcomed a sold-out crowd of more than 10,000 to the era of the cloud operating system (OS) for infrastructure. Nadella described how the cloud OS both drives the modern datacenter and enables the development and management of modern applications, and he demonstrated how customers can benefit from this transformation through agility, focus and lower costs. He also announced updates to the company’s developer tools and availability of the next release of Windows Intune, the company’s cloud-based solution for PC and mobile device management and security.

Built on decades of experience gleaned from running massive datacenters at scale, Windows Server 2012 is the cloud-optimized server OS for customers of all sizes, and Windows Azure, updated with new services and features, delivers both infrastructure-as-a-service and platform-as-a-service capabilities. Built to complement each other with consistent development, management and identity, they make it easier to create, migrate, deploy and manage applications across public, private and hybrid clouds.

“The operating system does two things: it looks after the hardware, and it provides a platform for applications. The modern datacenter and modern apps put more pressure than ever on infrastructure to become truly cloud-optimized, and that’s where Microsoft builds on our legacy with the OS to help our customers,” Nadella said. “Microsoft is your partner in the transformation of IT because only Microsoft offers the modern, yet familiar, platform that enables you to connect with the cloud on your terms.”

Connecting With the Cloud

Every customer’s path to the cloud will be unique, which is why Microsoft Corp. is optimizing its technologies and tools so customers can easily connect to public, private or hybrid clouds when they are ready. Customers, such as Aflac Inc., ING Direct and Tribune Co., are already working with Microsoft to expand their datacenters into the cloud to scale on demand and help reduce infrastructure costs.

“Delivering content to thousands of users across multiple devices and platforms requires a level of infrastructure that’s easier to manage and more affordable with the cloud,” said Denise Schuster, senior vice president of Digital Innovations for Tribune Company. “We are currently using various methods to deliver content via cloud services, and we’re moving our digital content to Windows Azure to help us find new ways to better deliver a targeted, more personalized experience for our customer base.”

To ease customers’ transitions to the cloud, Microsoft underscored the release candidate of Windows Server 2012, recent updates to Windows Azure and the release of Windows Intune. Together, these new releases make it even easier for customers to leverage one modern operating system across public, private and hybrid clouds and manage a multitude of devices connected to those clouds.

  • Windows Server 2012 Release Candidate (RC). Released May 31, 2012, the Windows Server 2012 RC is available for customers to download and evaluate today. Advancements in storage, networking and scalability have been drawn from Microsoft’s experience running public cloud services.
  • Windows Azure. Windows Azure is newly updated with preview support for Virtual Machine and Virtual Network, support for Windows and Linux images, and additional support for Java and Python.
  • Windows Intune. The next release of Windows Intune, now available at http://www.microsoft.com/en-us/windows/windowsintune/pc-management.aspx, includes expanded management and security benefits through mobile device management and adds people-centric management capabilities and upgrade rights to the latest version of Windows.

Enabling Developers With Cloud Tools

To streamline and accelerate application development and deployment, Microsoft Team Foundation Service, an application lifecycle management tool hosted on Windows Azure, is now even easier for teams to integrate into their development process. Team Foundation Service features collaboration tools, a code repository, and powerful reporting and traceability tools to help teams more effectively manage software development. Beginning today, a public preview of Team Foundation Service is available at http://tfspreview.com.

To help developers and IT professionals build immersive experiences that scale across devices and the cloud, the company announced that Microsoft LightSwitch, an easy-to-use development tool for quickly building applications, will now render HTML5. Integrating HTML5 into LightSwitch will enable developers using the tool to target any device or platform supporting HTML5.

Those who want to learn more about today’s news or watch the keynotes should visit the TechEd North America 2012 virtual press kit. Those who want additional context on the news should visit Satya Nadella’s post on the Official Microsoft Blog.

Founded in 1975, Microsoft (Nasdaq “MSFT”) is the worldwide leader in software, services and solutions that help people and businesses realize their full potential.

Welcome to the Era of the Cloud OS for Infrastructure! [The Official Microsoft Blog, June 11, 2012]

Posted by Satya Nadella
President, Server & Tools Business, Microsoft

Twenty years ago at Microsoft’s first annual TechEd conference, we gathered to talk about the industry transformation from mainframe to minis to client/server computing. The vehicle for that transformation was the Windows operating system. Today at TechEd, we’re again talking about the industry transformation: the transformation to the cloud. Once again, the Windows operating system is the vehicle for this transformation.

Let me step back. At the most basic level, any operating system has two “jobs”: it needs to manage the underlying hardware, and it needs to provide a platform for applications. The fundamental role of an operating system has not changed, but the scale at which servers are deployed and the type of applications now available or in development are changing massively. On the hardware front, the “unit” of hardware abstraction that a server OS manages has now reached the “datacenter” level. And by that I mean a datacenter ranging from the smallest cluster of a few servers to the very massive footprint of one of Microsoft’s global installations with thousands of servers across multiple geographically distributed datacenters.

In response to the needs of large-scale service providers pushing the limits of technology every day, networking, storage and compute vendors have delivered significant innovations to increase scale and performance and to remove bottlenecks. These industries have all driven this transformation in parallel. Now, we must think beyond a server at a time and instead look at the OS as the driver of the datacenter. Today’s datacenter is a scalable, intelligent, automated environment spanning all of the shared resources, and it is the magic of software that brings this all together to orchestrate the three resources of the datacenter: network, storage and compute. In other words, a cloud OS.

Just as the job of managing hardware has been transformed, the job of running applications has also shifted. We live in an era of many devices, where applications need to span from PCs to phones to tablets with an adaptive backend that can keep it all together. The ways in which we interact with those applications – and by that I mean both through touch and swipe and click AND through “likes”, “follows” and “shares” – push us forward. And the huge amounts of data that feed and enable these modern applications through the cloud need to be managed. Again, the need for a cloud OS.

Microsoft has been at the center of this transformation. As a large-scale service provider, we’ve been experiencing all of these changes real time, in our datacenters and through our services, learning and applying those learnings to what we build even as we work with the industry to push the limits of the technology. Our conversation since that first TechEd, and the focus on the operating system, isn’t new: the OS is still the intelligence that makes it all work. What has changed is the scale, the scope, and the range of the infrastructure OS to deliver against the opportunities the cloud presents. With Windows Server 2012 and Windows Azure, we’ve taken everything we’ve learned from running datacenters and services at global scale and are delivering the next generation of operating systems – the “cloud OS” – to help our customers seize the opportunities of the cloud.

I’m looking forward to talking a lot more about this new era of the cloud OS at TechEd this week and how we are helping customers make the very most of this transition. If you aren’t able to join us live in Orlando this week, I hope you’ll have a chance to view the keynotes online. It’s an exciting time to be in IT!

From: Satya Nadella, Scott Guthrie and Jason Zander: TechEd 2012 Day 1 Keynote [speech transcripts, June 11, 2012]

SATYA NADELLA:
… what we are going to discuss over the next 90 minutes, the modern datacenter, the modern application framework that make up the cloud operating system, the basic underpinnings for this new era of connected devices and continuous services.
When you talk about the modern datacenter, it’s perhaps best to start with what’s happening at the system level, what’s happening at the silicon, what’s happening to a single blade, a single system, and then a cluster.
The fundamental thing that all of us at this point are tracking pretty closely is the notion that storage, compute, and network are co-evolving. I mean, if you think about the compute power, for sure Moore’s Law is still continuing to work in its full glory. It probably is resulting in more core density versus perhaps single-thread performance, but still we’re able to pack amazing amounts of compute power.
Once you have a lot of compute power, there is not much use to it if you can’t get the IOPS that have to go with it. And so the revolution in storage, especially around tiering, is what’s really in full play, because disk speed itself is not something that’s going to get faster, but at the same time SSD and Flash costs are coming down, which gives us a huge opportunity to rethink things. Especially if you look at what’s happening in the database world with in-memory, you can start thinking about applications, application performance and IOPS per dollar in a very different way, in terms of the nexus between CPU utilization and storage access.
But one of the things which is really an artifact of the CPU-to-storage connection is the network. It’s sort of the era of the fast interconnect between storage and compute, and that’s driving a lot of innovation.
And the key to this co-evolution of storage, compute and network is really software control. You really cannot afford to have the control fragmentation, because if you do that then you’re not going to be able to achieve the economic benefits, the agility benefits and the innovation benefits that these new systems at high density can provide.
Now, perhaps you could say, well, that’s something that’s been true all along. After all, Moore’s Law has played this out even in the past, over the last 10 years in particular, as we have gone to these clusters and blades and software-based solutions. But one of the fundamental things that I believe has changed is services at scale, and this is a very big difference for me personally since 1993. When we were building Windows NT, we didn’t have at-scale workloads running in-house on NT. Subsequently we got onto a fantastic virtuous cycle: the fact that we had these workloads in Exchange, in Lync, in SharePoint, in SQL Server ensured that with each release of our server operating system we were able to get the feedback from you and learn continuously from you and make that product better and better and more robust, and that’s testament to all the deployments that we have.
But for the first time now the same kind of cycle of learning is playing out when we talk about Internet scale services. Just think about the depth and breadth of the first-party workloads that Microsoft is running today on a daily basis. We have Xbox LIVE that’s doing some fascinating GPU simulation in the cloud for some of their games. You have Office 365 which is Exchange and SharePoint at scale. You have Dynamics CRM which is a stateful transactional application in the cloud. You have Bing, which is really a big data-applied machine learning application in the cloud. You have things like HealthVault, which is secure transactional consumer applications. So, you have a very broad spectrum. So, we run approximately 200 very diverse workloads across Microsoft.
That diversity is what’s really making us build the right operating systems, the right management stack, the right tools. In fact, perhaps the best way to illustrate it is what happens to us on a daily basis. For example, just in terms of the physical plant we have around 16 major datacenters across the globe, we have around a thousand access points, we have a couple of hundred megawatts of power powering hundreds of thousands of machines. We have terabits of network out of our datacenters. We have petabytes of data. In fact, Bing itself has got approximately 300 petabytes of data. We write something like 1 terabyte of records each day.
Now, you could say all of that is fascinating statistics; what does it really have to do with the infrastructure we build? It’s that we are battle-testing every piece of software. In fact, just last week we upgraded all of the Bing front ends to Windows Server 2012 RC. And so today, Bing is running the release candidate of the next server operating system in full production workload.
That type of feedback, where we are constantly able to take the learning internally, is what’s shaping the host OS, the guest OS, the frameworks, the tools and the performance. And that, we believe, is not something you can easily replicate: you can’t just head-fake it, you can’t just go in and say you’ve built it for scale without having run it yourself, and I think that in the long run that’s going to be one of the biggest distinctions.
Of course, none of this matters if you can’t scale it down, because it’s not as if every deployment of a private cloud or a virtualization instance is going to be at the scale we run. So, the key is for us to take all of that power, all of that learning, and package it up into the smallest of clusters, half a rack, a full rack or what have you, and that’s really our intent.
And when you think about that as the backdrop, the criteria to look at a modern datacenter, there are four key attributes that I would say that one should look at. The first one is the scalability and the elasticity, and you need the elasticity to go with it, especially in the context of a heterogeneous set of workloads when you’re running in particular highly virtualized distributed environments, because you want to get utilization up and without elasticity you’re not going to be able to achieve it with all the amount of scale.
The second one is always up, always on. There’s no point having all of the scale and elasticity without the continuous availability.
Shared resources, building out for multitenancy from the ground up; it could be when your private cloud, the fact that you’re running two departments, two applications, and you want to be able to isolate them.
And then, of course, automation, because you can’t linearly scale your operations with your infrastructure. That means automation, automation, automation, and it is very important that the system provides the hooks for you to achieve that and lower your costs.
So, that’s what really inspired us to build Windows Server 2012. It’s an amazing, amazing release. In fact, as we were preparing for this event, perhaps the biggest struggle we had was that we have a 90-minute keynote and a lot to show; deciding which features of Server 2012 we even get to demo and talk about is perhaps the thing that troubled us the most. Windows Server 2012 has hundreds of features, and I just wanted to highlight a few of them in the context of this notion of a modern datacenter.
When it comes to scalability and elasticity, the performance gains, the sheer capability gains of the host operating system, Hyper-V 3.0, are just stunning. Just one of them, the notion that we now have in one VM the ability to support 64 virtual procs and 1 terabyte of memory, is stunning, because pretty much 99 percent of tier-one SQL workloads can now be virtualized on Hyper-V 3.0. It’s a pretty stunning figure and you’ll see a lot more of that.
Always up, always on is something that again has been built deeply into the system. The feature I love the most is the ability to update the cluster without having to bring down the cluster nodes, and have continuous availability.
Continuous availability of storage, huge gains in that dimension.
Shared resources. Again, multitenancy both with System Center and Windows Server has been built into the foundation. The ability to have network virtualization, storage virtualization to go with server virtualization is what makes it possible for you to have a fully virtualized environment that is sharable.
And if you have these multiple workloads from multiple departments you can isolate them using policy, you can monitor the resource usage using policies and make sure that there isn’t one workload that takes away all of the resources. So, a lot of gains again when it comes to sharing of resources across the virtualized infrastructure.
And lastly, when it comes to automation and self-service we have done a lot in terms of exposing the surface area of PowerShell. It’s actually a pretty amazing release for those of you who are big PowerShell users in terms of the commandlet explosion that we have had so that you can automate pretty much anything that’s there in Windows Server. We have 2,400 commandlets in PowerShell. We have built-in standards based management, and of course with System Center you have a full capable datacenter management suite.
So, to show you some of this in action I wanted to invite up onstage Jeff Woolsey from our Windows Server 2012 team. Jeff? (Applause.)
JEFF WOOLSEY: Thanks, Satya. It’s a pleasure to be here.
How’s everybody doing? (Cheers, applause.) Oh, come on! How’s everybody doing? (Cheers, applause.) Awesome. Welcome to Orlando. It’s a big, big, exciting show.
Well, Windows Server 2012 is about making your business more agile. It’s about making your datacenter more flexible, and providing you the ability to extend your datacenter to the cloud securely on your terms. Quite simply, Windows Server 2012 is about providing the best cloud OS.
Let’s start with scale. With Server 2012 we want to virtualize those workloads considered non-virtualizable, workloads that require dozens of cores, hundreds of gigabytes of memory, are likely SAN attached and with exceptionally high IO requirements.
Well, today, we want to redefine performance, we want to redefine scale. So, today, with Server 2012 and Hyper-V we’ll support up to 320 logical processors per server, up to 4 terabytes of memory per server, and up to 64 virtual processors per VM.
In addition, you can see I’ve got 100 gigabytes of memory allocated to this virtual machine, but we’ll support up to a full terabyte of memory for a VM. And whether this VM has been allocated 10 gigabytes, 100 gigabytes or a full terabyte, it still costs the same.
In terms of virtual storage our virtual disks now support up to 64 terabytes per virtual disk. That’s 32 times anyone else in the industry.
We also support the largest clusters with 64 nodes and up to 4,000 virtual machines in a single cluster.
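Taken at face value, the virtual-disk claim above is easy to sanity-check. A minimal sketch (the competing product is never named in the keynote; the per-disk limit below is merely what the stated multiple implies):

```python
# Back out the competing limit implied by "64 terabytes ... 32 times anyone else".
vhdx_max_tb = 64        # virtual-disk ceiling quoted in the keynote
claimed_multiple = 32   # "32 times anyone else in the industry"

implied_competitor_tb = vhdx_max_tb / claimed_multiple
print(f"implied competing limit: {implied_competitor_tb:.0f} TB per virtual disk")
```

That works out to a 2 TB ceiling per virtual disk for the unnamed competition, which is consistent with common virtual-disk limits of the era.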
Now, if I give you a virtual machine with 64 virtual processors and a terabyte of memory, quite honestly that’s irrelevant if I can’t provide the ability to give you the IO to actually keep those workloads and those resources busy.
So, let’s take a look at Hyper-V IO performance. Now, before I do, let me tell you a little bit about the hardware I’m about to show you. This is an industry-standard four socket server. It’s got 80 logical processors, 256 gigabytes of memory. It has five LSI HBAs attached to 40 SSDs.
Now, you may be thinking, hold on here, why is he using SSDs, why is he not using traditional spinning media? Well, for this next demo we certainly could have used 15k SAS disks. However, we would have needed 4,000 disks in 10 full-sized 42U racks full of disks. So, we decided to opt for SSDs instead.
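The disk-count estimate is a back-of-envelope that can be worked backwards. A sketch (the per-disk rate and per-rack density below are implied by the quoted figures, not stated in the transcript; the 985,000 IOPS number comes from the Iometer result shown a moment later):

```python
# Work backwards from the speaker's quoted figures to the assumptions they imply.
demo_iops = 985_000   # 4k random IOPS the Iometer run delivers
disks_quoted = 4_000  # 15k SAS spindles said to be needed instead of SSDs
racks_quoted = 10     # full-sized 42U racks

print(f"implied per-disk rate: {demo_iops / disks_quoted:.0f} IOPS")   # ~246
print(f"implied density: {disks_quoted // racks_quoted} disks per rack")
```

The implied ~246 random IOPS per spindle is at the optimistic end for a 15k disk (~175–210 is a common planning figure), so if anything the 4,000-disk estimate understates what spinning media would have needed.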
Let me show you. I’m going to switch on over here to Iometer. Iometer is an industry-standard tool and, in fact, the configuration and test that I’m going to run is industry standard: 4k random IOPS, with a queue depth of 32 and 40 concurrent threads. This is the hard stuff, not the easy sequential stuff.
By the way, the guys over at VMware claim that they can deliver up to 300,000 IOPS from a single VM.
Well, let me show you with Windows Server 2012 we’re delivering 985,000 IOPS from a single virtual machine. Let me say that one more time: over three times more IOPS from a single virtual machine. (Cheers, applause.)
And let me be very clear: This is not a Hyper-V limitation. We can go much, much higher. This is as fast as the hardware will go. We couldn’t put any more host bus adaptors in this machine.
So, with support for up to 64 virtual processors, a terabyte of memory, and nearly a million IOPS in a single server, we can run over 99 percent of the world’s SQL Servers.
Now, while we’re talking about storage, by the way, let me talk about some of our other investments there. For example, in Windows Server 2012 we’ve made some huge investments in file-based storage, such as the new scale-out file server. The scale-out file server has an active-active architecture, which inherently means that as I add more nodes I get more scale, but I also get more continuous availability, because I can remove or add nodes without any downtime. It’s an extremely powerful new capability in Server 2012.
And then there’s what we’ve done with SANs. Quite honestly, this is earth-shattering: with offloaded data transfer, or ODX, Windows Server 2012 can leverage the native capabilities of your SAN array.
Let me show you. In this first example I’m going to copy a 10 gigabyte file using non-ODX storage. Now, you can see in this example from a CPU standpoint I’m getting somewhere between 35 and 40 percent CPU utilization. In terms of networking you can see we are fully saturating Ethernet. We’re getting about 78 megabytes per second; not too bad, but in this case the server is performing all of the copying. It’s reading from the source and writing to the destination, reading from the source and writing to the destination.
Well, now on the split screen let me copy the same 10 gigabyte file using ODX-enabled storage.
Now, make sure you don’t look away. I’d hate it if you missed the demo here.
Again this is a 10 gigabyte file, and what are you seeing? You’re seeing that I’m copying and getting over a gigabyte per second. I’m copying a 10 gigabyte file in 10 seconds. (Applause.) Awesome ODX-enabled storage from our partners over at EMC. And by the way, there was no network utilization at all, because this was leveraging the capabilities in the array. When you couple ODX with a bunch of our other enhancements in storage, virtual fiber channel, cluster enhancements for replication and synchronous replication, as well as a swath of other capabilities, quite simply if you own a SAN, Windows Server 2012 is a no-brainer, it’s really that easy.
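The two split-screen copies reduce to a simple throughput calculation. A sketch, assuming decimal units (the demo doesn't say which convention the counters use):

```python
# Compare the copy times implied by the two rates shown in the demo.
GB, MB = 1000**3, 1000**2   # decimal units assumed

file_size = 10 * GB          # the 10 gigabyte file being copied
non_odx_rate = 78 * MB       # ~78 MB/s over saturated gigabit Ethernet
odx_rate = 1 * GB            # "over a gigabyte per second" with ODX

print(f"non-ODX copy: {file_size / non_odx_rate:.0f} s")  # roughly two minutes
print(f"ODX copy:     {file_size / odx_rate:.0f} s")      # the 10 seconds demoed
```

The ~128-second figure for the non-ODX path also explains why the split screen is so dramatic: the array-offloaded copy finishes an order of magnitude sooner.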
Now let’s talk about networking. In Server 2012 we made a huge investment in networking, for example network virtualization. With network virtualization I can have multiple companies, disparate organizations, all sharing the same physical fabric with secure multitenancy. In addition, we have features like Windows NIC teaming that brings LBFO into the box, and literally dozens and hundreds of new capabilities when it comes to Windows Server networking.
In terms of the Hyper-V switch we’ve done a tremendous amount of work in the Hyper-V switch for performance, security, manageability, automation, and one of the things we did was we knew that we couldn’t be all things to all people. So, what we decided to do was also make it open and extensible.
So, for example, I’m going to go here to the virtual switch manager and you can see I’ve got the Cisco Nexus 1000V for Hyper-V running right here.
Now, in this case you can see in the split screen I’ve got a couple VMs over here that are using quite a bit of bandwidth, and my network admins like to keep an eye on their network utilization and want to apply a QoS port policy to this. No problem. I can manage it, because I’m using the Cisco Nexus 1000V, in the same way I manage my other infrastructure. In this case I’m going to use the Nexus 1000V admin tool, and I’m going to modify the port profile to use a QoS port profile. And just like that, I’m applying QoS port profiles on my virtual switches just like I can on my physical switches.
Now, this is just one example of the ecosystem we’re creating with the Hyper-V extensible switch. While you’re here, make sure you check out the tech expo. We’ve got a lot of partners that are plugging into the extensible switch, and, in fact, there’s a lot of excitement in the industry around where networking is going right now, and including a lot of people embracing software-defined networking.
Now let’s talk about automation. One of the best ways to reduce costs and improve efficiency at scale is pervasive automation. With Windows Server 2012 we’re dropping in a V12, world-class automation engine in PowerShell. With over 2,400 PowerShell commandlets, everything you want to do in Server can now be automated. (Cheers.) Got a winner out there.
One of the things we wanted to do is in this next demonstration I wanted to show how we’re coupling PowerShell with site migration capabilities utilizing one of our hotly anticipated features, Hyper-V Replica.
So, in this case I’m going to bring up System Center, and I’m going to start my runbook. Now, what I’ve been doing is using Hyper-V Replica to replicate virtual machines from one site to another. What I want to do now is bring them up on my new site in a systematic, methodical way.
So, first, I’m going to type in my destination host, going to type in my source host, and I’m going to provide the server that’s actually going to do the runbook automation, and I’m going to click start.
Now, while that’s happening, let’s move on over to instances and view the details, where you can get a high-level overview of what’s happening. Through runbook automation, System Center is using PowerShell as the automation engine to make sure that everything is in place to begin the migration of my workloads from one site to the next. It will then bring up my virtual machines in the correct order, with dependencies configured in the runbook automation. All of this very cool, brought to you by Hyper-V Replica and, at its core, PowerShell.
Now, what if you don’t want to just migrate your workload, but what you’d really like to do is extend your datacenter to the cloud using capabilities and capacity from a provider. Well, let me show you how we do that with Windows Server 2012 and System Center 2012 SP1.
You can see here I’ve got a view of my clouds on-premises: dev cloud, infrastructure, preproduction and production environments. But what I’d like to do is connect to my service provider. So, I’m going to go to connections, which is where I broker connections. Here I’m going to click on connect, and you can see I have the option to connect to another VMM server or use SPF, the Service Provider Foundation. This is a powerful new capability that allows me to take on capacity provided to me by my service provider.
Now, in this case I’ve bought capacity from Orlando Hosting, and they have provided me a URL. Of course, I need a certificate for pretty obvious reasons, encryption. Type in my password and click OK.
And what you’re seeing, in a few easy steps, is System Center brokering the connection with Orlando Hosting so that I can consume that capacity and manage it under my control. In fact, if I go back on over here to clouds, you’ll see that Orlando Hosting now appears in my console in the context of my other clouds running on-premises. Very cool stuff here.
So, in just a few moments we’ve flown through literally a whole bunch of technologies and capabilities, but one thing I want to be very clear about, quite honestly I haven’t even scratched the surface of what’s new in Server 2012: with massive scale, massive performance, complete VM mobility, the only virtualization platform that allows you to live migrate servers with nothing but an Ethernet cable, PowerShell automation, offloaded data transfer, and the ability to extend your datacenter to the cloud with System Center. Quite simply, these are just a few of the dozens of reasons why Windows Server 2012 and System Center 2012 is the best way to cloud optimize your business.
Thank you very much. (Applause.)
SATYA NADELLA: So, hopefully you got a quick glimpse of the power in Server 2012. It’s a fantastic release, and I think over the course of this conference you will get a chance for many drilldown sessions on the hundreds of features in Server 2012.
Ever since the beta there has been tremendous traction with our customers. Over 300,000 customers used the product in the first three months after the beta. And since the RC launch, 80,000 customers downloaded the RC in the first week, as I said. Internally we have the RC deployed in production.
We had 150 customers who are part of our TAP program and with whom we work very closely; many of them have already taken the RC and put it into production workloads that we support, so a tremendous amount of progress.
So, let’s roll a video with some of the comments from the customers who have been using Windows Server 2012.
(Video segment.)
SATYA NADELLA: And as Jeff was mentioning, we are building Windows Server 2012 of course to power your datacenters and your private clouds, but we’re also building it with a broader ecosystem in mind. We want to make sure that there is a consistent world of Windows Server across the service provider, Windows Azure, as well as your datacenter, and that’s one of the most important technical and strategic goals for us at Microsoft.
When we say consistency, the key thing is for us to ensure that identity, virtualization, management, and development are consistent across the service provider Windows cloud, your datacenter, and Windows Azure.
And in that context last week, we announced a major set of revamp and features for Windows Azure, and one of them was our infrastructure as a service. With the launch of infrastructure as a service capabilities in Windows Azure you now have virtual machine portability with no changes to format, the ability to take an app and a workload and move it transparently from your own private cloud to Azure, to a service provider, and back with no lock-in is something that you can do.
So, I wanted to give you a feel for some of the new capabilities in Windows Azure and the infrastructure as a service, and to do that I wanted to introduce up onstage Mark Russinovich from our Windows Azure team. Mark? (Cheers, applause.)
MARK RUSSINOVICH: Good morning, everybody.
So, I know most of you like automation, especially with PowerShell, but we wanted to make it so easy to create virtual machines in Windows Azure that even your boss can do it. And so for that I’m going to switch over to the newly designed and Metro-optimized Windows Azure portal.
Now, there’s been an explosion of a certain class of devices and a specific device that I suspect many of you are using. So, we wanted to make sure that this new portal works on all operating systems and all browsers. So, to answer your question that I know you’ve got in your head, yes, it will look great on your Nokia Lumia 900 Windows Phone 7 device.
Now, here you can see all the resources that we can manage in the portal, including virtual machines. We’ve got a consistent experience for creating new resources here you can get with this new button down here. I’m going to show you how to create a new virtual machine and how easy it is. When I select this menu item I’ve got two options. One is quick create, which lets me with a single dialog box pick the most common options for creating a virtual machine in one click. But I’m going to show you some of the more advanced features that we’ve got with this release by picking the gallery option.
And so you can see the list of platform images I can select from, and no, we haven’t been hacked, we’ve actually got Linux up here in Windows Azure. We’ve worked closely with these companies making these distributions to support them on Windows Azure. But for this demonstration, of course, I’m going to pick the best operating system in the list, the one that is so good that it doesn’t even need an icon, and that is Windows Server 2012. (Laughter.)
Press next here, I’ll give it a sample name, a password that will make it happy. I hope I’ve got those matched. And press next.
Now, in this dialog I get asked whether I want to create a standalone virtual machine or to add this virtual machine to an existing virtual machine to create a cloud service that consists of multiple virtual machines. I’m going to go ahead and select standalone virtual machines, give this virtual machine the DNS name that I can access it over the Internet with, and now I pick the storage account into which I want the operating system VHD to be placed, and that’s because we’re using Windows Azure storage underneath to store VM VHDs.
I click to use automatically generated storage account and it will create one for me, or I could pick one that I’ve already got, and then I get this dropdown here that asks me where I want to place this virtual machine. I can pick from any of the number of datacenters that we’ve got Windows Azure running in across the world or I can even pick to deploy it into a virtual network, which is a VPN gateway subnet up in Windows Azure that connects back to corpnet. So, I’m going to pick the VPN network that I’ve already got up there, the corp network, and press next.
And this final dialog lets me pick some of the scale-out and high availability features that we’ve got that I’ll demonstrate actually in a few minutes. So, I’m going to go ahead and skip that slide and just press next to go create that virtual machine.
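The portal walkthrough above boils down to a handful of parameters: an image, a name, credentials, a DNS name, an optional storage account, and either a datacenter region or an existing virtual network. A minimal sketch of that parameter set as a validated request object; the field names here are illustrative assumptions, not the actual Windows Azure Service Management schema:

```python
from dataclasses import dataclass
from typing import Optional

REGIONS = {"West US", "East US", "North Europe"}  # illustrative subset

@dataclass
class VmRequest:
    """Parameters from the portal's gallery flow. Field names are
    illustrative, not the real Windows Azure Service Management schema."""
    image: str
    name: str
    password: str
    dns_name: str
    storage_account: Optional[str] = None  # None -> auto-generated, as in the portal
    location: Optional[str] = None         # a datacenter region...
    virtual_network: Optional[str] = None  # ...or an existing VPN-connected subnet

    def validate(self):
        errors = []
        if len(self.password) < 8:
            errors.append("password too short")
        # The portal asks for exactly one placement target: a region
        # or a virtual network, never both and never neither.
        if bool(self.location) == bool(self.virtual_network):
            errors.append("pick exactly one of: region, virtual network")
        elif self.location and self.location not in REGIONS:
            errors.append("unknown region: " + self.location)
        return errors

req = VmRequest(image="Windows Server 2012", name="demo-vm",
                password="P@ssw0rd123", dns_name="demo-vm.cloudapp.net",
                virtual_network="corp-vnet")
print(req.validate())  # []
```

The "quick create" path in the portal is then just this object with every optional field left at its default.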
Now, as Satya said, one of our goals was to make it easy to migrate virtual machines back and forth. So, let’s say that you’ve got an application running on-premise in your Hyper-V private cloud like this one right here. It’s a simple events manager app that’s built on IIS and SQL Server, and I want to take that application and migrate it up into Windows Azure. Because Windows Azure virtual machines are based on Windows Azure storage, I can simply upload them to Windows Azure storage blobs and then create virtual machines from them.
But what makes migrating virtual machines simpler is System Center 2012 App Controller, which Jeff introduced you to. I’m going to go over to the App Controller dialog here; with it you can manage private clouds, and you can also manage Windows Azure. I’ve got that application running here on my private cloud. If I go to the virtual machine menu entry you can see there it is, events manager local.
I previously made a backup of that virtual machine’s VHDs and stored it in a library here, and when I right-click on this and select migrate I’ll be guided through a simple wizard that will let me push that entire VM with its VHDs up into Windows Azure.
I pick the cloud that I want to deploy to. In this case it will be Windows Azure. Pick the cloud service I want to deploy into. Here I can create a new cloud service or I can add this VM to an existing one. I’ll add it to this one right here, press OK.
Final step, pick some of the options you saw me pick in the portal there with the create virtual machine wizard like the instance size. I’ll pick an extra large. The storage account I want to upload into, so just navigating through my Windows Azure storage accounts. I’ll put it in the migrated VMs container.
And the nice thing about App Controller is this virtual machine actually consists of two VHDs, an operating system VHD and a data disk with a SQL Server on it. App Controller knows that and will automatically migrate both of those VHDs up when I press the deploy button.
But how about the reverse? Let’s say that I’ve got an application running up in Windows Azure and I want to bring it back on-premise, maybe for disaster recovery, maybe for backup, maybe I want to just take a look at it. I’ve got that events manager application. I’ve already migrated it up with app controller. You can see it here in this virtual machine list. When I click on it, here you can see a virtual machine dashboard we’ve got. We’re actually having the infrastructure collect performance information and surface it up in the portal, including CPU usage, network usage, in and out, as well as disk I/Os.
Here in the URL you can see the DNS name assigned to that virtual machine. And just to prove it’s actually the same app let’s log in, and we’ll see the same exact interface we saw, because it is the same virtual machine with the same VHDs.
You can see down here there are the two disks that we migrated up sitting in Windows Azure storage. Because it’s Windows Azure storage, not a separate storage service, it uses the same storage APIs, and that means that off the shelf storage utilities for Windows Azure just happen to work against it.
I’ve got an example utility here, Cloud Explorer, and if I go take a look at the migrated VHDs that I’ve copied up into the cloud here, the data disk and the OS disk, I can simply copy and paste them into a backup folder up in the cloud. Because this is a copy-on-write copy it’s almost instantaneous. And now I can copy and paste those again to download them to my local system, and at that point I can just use Hyper-V to create a virtual machine with those VHDs and get the application back up and running on-premise.
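The near-instant in-cloud copy Mark describes works because a copy-on-write copy initially shares the source blob's pages and only materializes a page when one side writes to it. A toy model of that behavior; this is a conceptual sketch, not the actual Windows Azure storage implementation:

```python
class BlobStore:
    """Toy copy-on-write blob store: a copy shares the source's pages
    until one side writes, and only the written page is duplicated."""

    def __init__(self):
        self._blobs = {}   # blob name -> list of page ids
        self._pages = {}   # page id -> bytes
        self._next = 0

    def put(self, name, pages):
        ids = []
        for data in pages:
            self._pages[self._next] = data
            ids.append(self._next)
            self._next += 1
        self._blobs[name] = ids

    def copy(self, src, dst):
        # Metadata-only copy: no page data moves, so it is near-instant
        # regardless of blob size.
        self._blobs[dst] = list(self._blobs[src])

    def write_page(self, name, index, data):
        # Copy-on-write: give this blob its own page before mutating,
        # leaving any copies pointing at the original page.
        self._pages[self._next] = data
        self._blobs[name][index] = self._next
        self._next += 1

    def read(self, name):
        return [self._pages[p] for p in self._blobs[name]]

store = BlobStore()
store.put("os-disk.vhd", [b"boot", b"data"])
store.copy("os-disk.vhd", "backup/os-disk.vhd")  # instant: shares pages
store.write_page("os-disk.vhd", 1, b"DATA2")     # only the source diverges
print(store.read("backup/os-disk.vhd"))  # [b'boot', b'data']
```

The backup stays frozen at the moment of the copy even though the source keeps changing, which is exactly what makes it useful for download-and-restore on-premise.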
So, that’s a fairly simple application. How about serious enterprise applications like ones that are built on SharePoint and Active Directory? We also have features that support those and we’ve got people that are already building those kinds of applications.
SATYA NADELLA: Thank you, Mark and Mike.
Hopefully you got a good flavor for the capabilities in Windows Azure. It’s the most enterprise-grade public cloud service. Last week we announced a set of features that we have updated as part of our spring wave. We have a fall/spring rhythm with Windows Azure, and we are continuously improving the service. We think that we’re really ready for the mainstream of the enterprise, especially with the coming together of IaaS and PaaS.
In terms of the feature capabilities, all the things that we talked about for Windows Server in terms of the release criteria, so to speak, apply to Windows Azure. The first thing in terms of scalability and elasticity, that’s what it’s really been built for at the core. You can scale the virtual machines, you can scale the Azure website, you can scale the cloud services that you build.
In terms of always up/always on, that is of course the underpinning of the Windows Azure design. The availability set feature that Mark demoed is something you inherit even for the IaaS infrastructure, from within the core underlying storage, network, and compute, which are constructed to be resilient to hardware and network failure. In terms of shared resources, you can make Windows Azure a seamless part of your datacenter; the network virtualization capabilities make that possible. And, of course, you can automate everything. Mark showed a lot of the capabilities in the management portal, but everything is also exposed through PowerShell and APIs, so you can automate it and make it part of your own management suite.
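The availability set behavior Satya refers to amounts to spreading a set's VMs across independent fault domains (racks, power, network paths) so that no single hardware failure takes every instance down. A toy placement sketch; the round-robin policy and the domain count here are illustrative assumptions, not the actual Windows Azure fabric controller algorithm:

```python
def place(vms, fault_domains=2):
    """Round-robin an availability set's VMs across fault domains so
    that a single rack or power failure cannot stop all of them."""
    placement = {}
    for i, vm in enumerate(vms):
        placement[vm] = i % fault_domains
    return placement

p = place(["web-0", "web-1", "web-2", "web-3"])
print(p)  # {'web-0': 0, 'web-1': 1, 'web-2': 0, 'web-3': 1}

# Whichever single fault domain fails, at least one VM survives.
for failed in (0, 1):
    survivors = [vm for vm, fd in p.items() if fd != failed]
    print(failed, survivors)
```

This is why the resilience is "inherited": the application asks for an availability set, and the placement policy, not the application, guarantees the spread.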
So, we think Windows Azure is really ready to take some of those very mission-critical workloads and run them on the public cloud. And hopefully you’ll give that a try while we’re in the early access program for infrastructure as a service.

Microsoft Announces New Cloud Opportunities for Partners [press release, July 10, 2012]

New guidance, training and programs for Windows Server 2012 and Windows Azure unveiled at Worldwide Partner Conference.

During the second day of Microsoft Corp.’s annual Worldwide Partner Conference (WPC), top executives from the company announced new training, tools and other programs that enable partners to deliver compelling new cloud services to their customers. Satya Nadella, president of the Server and Tools Business, announced a community technology preview (CTP) of new technologies that enable hosting service providers to use their Windows Server data centers to deliver capabilities consistent with services running in Windows Azure. In addition, he announced a new program that gives partners guidance, training and software tools to help customers transition from VMware’s virtual infrastructure to Microsoft’s cloud.

“We’ve taken everything that we’ve learned from running data centers and services at a global scale to usher in the new era of the cloud OS,” Nadella said. “Microsoft offers partners modern yet familiar technology to meet customer demand on their path to the cloud.”

With the new CTP, hosting service providers can offer customers turnkey cloud services, including high-scale websites and virtual machine hosting with an extensible self-service portal experience. These capabilities, which run on Windows Server 2012 and Microsoft System Center 2012, will offer hosting providers some of the same experiences and services recently announced by Windows Azure. Go Daddy, the largest global Web hoster, is piloting these new capabilities to deliver new cloud services for customers.

“Customers view Go Daddy as an IT partner with which they can grow,” said Scott Brown, vice president of Product Development – Hosting at Go Daddy. “These new capabilities give customers a seamless path to expanding their online presence. In addition, the improved site performance, scalability and availability all lead to a more enjoyable experience for our customers and their visitors.”

In addition, the new program announced on stage, Switch to Hyper-V, will allow partners to grow their virtualization, private and hybrid cloud computing practices while also helping customers improve IT agility at a lower cost with Microsoft’s cloud infrastructure.

Already, partners are making significant progress in helping their customers with this transition. Microsoft Gold Certified Partner FyrSoft recently helped Iowa-based Pella Corp. migrate nearly 100 percent of its VMware infrastructure — nearly 700 VMware virtual machines — to Hyper-V, moving the company beyond virtualization to a private cloud solution. With the Microsoft private cloud, Pella has evolved its business while reducing IT costs and improving efficiencies. Server and Tools Business Corporate Vice President Takeshi Numoto further details partner opportunities in the era of the cloud OS on a blog published today, and more information can be found here.

In addition, in a keynote that further reinforced how Microsoft is working with its partners to transform businesses throughout the world, Microsoft Business Solutions President Kirill Tatarinov highlighted the incredible opportunity in the year ahead for partners focused on selling business solutions based on Microsoft Dynamics.

“Microsoft brings together technologies in a way that no other company can match,” Tatarinov said. “Microsoft Dynamics takes full advantage of the amazing innovations Microsoft is delivering, and we’re actively supporting our partners in developing and delivering a complete, modern, flexible and cloud-based business solution to grow their businesses. There’s never been a better moment to be a Microsoft Dynamics partner.”

With a renewed focus on building enterprise partnerships, Microsoft announced new global independent software vendors that are choosing or extending their solutions across Microsoft Dynamics. Companies such as Campus Management Corp., Cenium Inc., Cincom Systems Inc., PROS Pricing and Technosoft that are industry leaders in their markets are embracing the Microsoft Dynamics solutions to expand their offerings and in some cases as the core foundation on which to build their unique industry-focused solutions. For instance, global hospitality and hotel solution organization Cenium is extending its Microsoft Dynamics-based business offerings in areas such as property management, procurement, human resources and point of sale; and Campus Management, a leading provider of enterprise software solutions for higher education, is planning to expand global reach by leveraging Microsoft Dynamics AX and providing institutions of any size or complexity more choices when it comes to student information systems and enterprise resource planning solutions.

Other keynotes included the following news and momentum updates from Microsoft senior executives:

  • Thom Gruhler, corporate vice president of Windows Phone Marketing, took the stage to demo Windows Phone 8 and highlight that Windows Phone is now a true extension of the Windows that 1 billion users worldwide know and use today.
  • Laura Ipsen, corporate vice president of Worldwide Public Sector, provided an overview of Microsoft’s National Plan and citizenship efforts, including empowering youth and driving societal change through the proliferation of Microsoft technology.

A New Era Together: Partners and the Microsoft Cloud OS [The Official Microsoft Blog, July 10, 2012]

Posted by Takeshi Numoto
Corporate Vice President, Server & Tools Business, Microsoft

Here at the Worldwide Partner Conference, it’s exciting to hear the buzz about the opportunities that are in store for our partners and their customers in the cloud. Over the past year alone, we have delivered a number of new solutions to empower the transition to the cloud – from the release of System Center 2012 and SQL Server 2012 to new Windows Azure services and the release candidate of Windows Server 2012 – and we will have more new solutions in the future.

As Satya Nadella recently described, we are bringing this all together to usher in the era of the “Cloud OS.” With Windows Azure and Windows Server at its core, the Cloud OS takes our strong legacy of running the highest scale application services and petabyte-sized datacenters to the next generation of computing and delivers a modern platform for the world’s applications. And for our customers, this will ultimately result in lower IT costs, faster innovation and greater agility.

Partner Opportunity with the Cloud OS

Our partners are essential to helping customers realize these benefits. The new opportunities that we announced today are the latest examples of how we are helping our partners seize business opportunities in the cloud. For example, hosting service providers can now use their Windows Server datacenters to deliver some of the same services running in the Windows Azure public cloud. It’s a great example of our strategy to deliver consistent capabilities that span customers’ private clouds on-premises, service provider clouds and Windows Azure.

That consistency is one of our keys to helping partners succeed and profit by betting on Microsoft and the Cloud OS. Partners are able to use the same familiar application development and management tools, as well as common data, identity and virtualization platforms across the Cloud OS. This means that they can carry their current skills, experience and investments forward as well as help their customers do the same.

The Cloud OS strategy reflects our longstanding practice of democratizing technology to fuel partner and customer innovation. I believe it truly sets Microsoft apart. It is exciting to see more and more partners and customers – including the likes of T. Rowe Price, Lufthansa Systems, Munder Capital and ING Direct – choose our solutions over other vendors as they progress to the cloud. As we highlighted today, we now have even more resources in place to help partners more easily migrate customers from another platform to ours.

One great example is how Microsoft Gold Partner FyrSoft leveraged our cloud technology along with new resources to move windows-and-doors manufacturer Pella Corp. from VMware to our private cloud. In doing so, Pella Corp.’s IT organization has gone beyond virtualization, enabling more agile, manageable services to ensure their customers’ windows and doors are built and delivered correctly – all while saving costs and improving the company’s bottom line during challenging economic times.

But as I am learning here at WPC, stories like FyrSoft and Pella’s aren’t unusual. Many of our partners have stories to tell about bringing their innovative solutions and services to life on the Microsoft platform, and they are truly inspiring. I look forward to seeing what Microsoft and its partners together can accomplish in the era of the Cloud OS.

From: Satya Nadella: Worldwide Partner Conference 2012 Day 2 Keynote [speech transcript, July 10, 2012]

Today I want to talk about the future. I want to talk about the future, though, first by reflecting on the past. I joined the company in ’92, that was just at the very, very beginning of the client-server era. We had Windows 3.1 and Windows NT being birthed, if you will. And over the last 20 years, we collectively have achieved a tremendous amount of success with the client-server paradigm.
And we’ve done a lot of innovation over the years, but two things that have sort of really remained constant and specifically were unique to the approach we took, one was to build a broad software-based platform and specifically the operating system that enabled the partners to innovate with their applications and services and capabilities and take it to market, and a partner-led business model.
We think those two things, which were unique to our approach in the past, are going to remain constant as we move to this new era of connected devices and continuous services: a broad platform-based approach when it comes to technology, and a partner-led business model when it comes to go-to-market.
And in this new world, for sure, things are going to change. Yesterday, you heard Tami and Steve talk a lot about our connected devices, how we’ve reimagined Windows in its core from the silicon to the developer platform to the user experience and even the device form factors.
And, today, I want to talk more about what’s going to happen on the back end, especially around continuous services. But as these things change, as the boundaries between categories shift and business models change, I still want to come back to the fact that we will always be partner-led; we will always have a broad OS that enables partners to express their services, their value, and their capabilities to drive more customer value.
So, the thing that I want to talk about today is the back end and how the back end is changing in the next era. We refer to this as the cloud operating system, or the cloud OS. The cloud OS, like any OS, has two elements to it: the first is to abstract away the hardware so that application developers can focus on the application. And that hardware abstraction is now going through a fundamental shift.
There are two pivots to it. The first one is that you now think about the unit of compute as compute, storage, and network together: at an atomic level, you really are thinking about the compute-server-storage capacity as well as the network capacity all together.
And then you’re thinking about this at massive scale. When we say “massive scale” it’s at a data center or multiple data centers. So, when we think about an operating system, we’re talking about a true distributed operating system that spans multiple data centers.
Now, having thought about the resource pool or the hardware abstraction at that scale, you obviously don’t want to make all of that sort of show through to the application developer. You want the application developer to be most productive with abstractions that help them focus on their application logic, on their experience. And that’s what we do with a very rich application platform.
One of the things that is motivating the cloud OS development for us within Microsoft is the first-party applications that we run inside of Microsoft at Internet scale. This was very true even in the last era. When you think about the continuous improvements that we’ve made over the years to Windows Server, they were driven by some of the workloads. Remember, the early days of SQL were a real boon for the folks working on Windows Server, because that pushed us to really get our I/O right; a lot of the capabilities that we needed to get right in the core operating system were driven by the database that was pushing us.
The same thing with Exchange, later on SharePoint, Lync — that was a fantastic virtuous cycle that we had internally that made our operating system better.
The same thing now exists when it comes to Internet-scale properties. We have one of the most diverse sets of workloads operating on Internet scale. You take something like Bing, it’s a big-data, applied-machine-learning-at-scale application. Office 365, that’s teaching us all about collaboration and communication in the cloud. Dynamics and AdCenter, are stateful, transactional systems in the cloud.
So, that diversity, not just the fact that any one of them has scale, but it’s that diversity of workloads is what helps us build a general purpose cloud operating system, and it’s a very, very important virtuous cycle for us.
The opportunity of the cloud operating system is right there. And I think there was a lot of excitement in the audience when Jon talked about people doing cloud development this year and next year, and these numbers speak to it. In fact, it’s probably the most unbounded of opportunities, because if you think about it for a second, the opportunity around infrastructure and cloud infrastructure is unbounded: it scales with the number of devices, and within five years we’re going to have three times as many devices as there are people in the world.
It scales with data. It scales with applications. All of these streams are going through an explosive phase. So, therefore, the need for infrastructure, private or public, is going to be significant. And so therefore anyone who is either in the app-dev business, or in the infrastructure business, is going to see this growth rate over the next multiple years.
So, let’s take a look at the modern data center. We’ve built Windows Azure and Windows Server as one consistent set, as that one distributed operating system that has the following attributes: The first core capability is the ability to scale your resource. We talked about compute, storage, and network; you need to have the capability to take compute, storage, and network and scale it to data center and multi-data-center scale on behalf of a given application or shrink it down. So, that’s why elasticity is very, very core.
Always-up, always-on. You want to build software resilience into the operating system. You really want to build it. At scale, everything breaks. So, that means you need to be resilient to hardware breaking on the network side, on the storage side, on the compute side, and you want to build that right into the software fabric.
When it comes to sharing resources, you have to build multi-tenancy from the ground up. You want to be able to pool your storage, pool your compute, virtualize your network. You need to be able to isolate. Not only do you need to pool the resources, you need to be able to isolate the resources, so two workloads from two departments within an enterprise do not mix up with each other when it comes to resource contention.
And automation and self-service are super important. Everything’s got to have a surface area for management, both through a UI and through APIs, so that you can automate everything and keep the total cost of ownership at reasonable levels for such scale.
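The pooling-plus-isolation requirement above can be sketched as a shared pool with per-tenant quotas: capacity is pooled across workloads, but one tenant's demand can never consume another tenant's reservation. A toy sketch under those assumptions, not how Windows Azure actually allocates resources:

```python
class Pool:
    """Toy multi-tenant resource pool: capacity is shared, but each
    tenant is isolated behind its own quota."""

    def __init__(self, capacity, quotas):
        self.capacity = capacity           # pooled cores across the cluster
        self.quotas = dict(quotas)         # per-tenant isolation limit
        self.used = {t: 0 for t in quotas}

    def allocate(self, tenant, cores):
        # Isolation check first: a tenant cannot exceed its own quota
        # no matter how empty the pool is.
        if self.used[tenant] + cores > self.quotas[tenant]:
            raise RuntimeError(tenant + ": quota exceeded (isolation)")
        # Then the pooled-capacity check across all tenants.
        if sum(self.used.values()) + cores > self.capacity:
            raise RuntimeError("pool exhausted")
        self.used[tenant] += cores
        return True

pool = Pool(capacity=100, quotas={"finance": 60, "hr": 60})
pool.allocate("finance", 60)   # within quota and capacity
try:
    pool.allocate("hr", 45)    # only 40 pooled cores remain
except RuntimeError as e:
    print(e)                   # pool exhausted
```

Note the quotas deliberately overcommit the pool (60 + 60 > 100), a common design choice: isolation bounds each tenant, while pooling lets idle capacity flow to whoever needs it.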
So, to show you some of this innovation across both Windows Server and Windows Azure, I wanted to invite up on stage Jeff Woolsey. Jeff? (Applause.)
JEFF WOOLSEY: Thank you, Satya, it’s a pleasure to be here. Windows Server 2012 was designed to unleash new business opportunities for you. You told us you wanted to help your customers virtualize everything, even those workloads considered non-virtualizable: workloads that require dozens of cores, hundreds of gigabytes of memory, or SAN-attached storage, and with exceptionally high I/O requirements. Think Exchange, SharePoint, massive scale-up SQL Server.
Well, today we’re here to redefine performance and scale with support for up to 320 logical processors per server, up to four terabytes of memory per server, and up to 64 virtual processors per VM. (Applause.)
In addition, you can see this virtual machine has been allocated 100 gigabytes of memory, and will support up to a full terabyte of memory per VM. In addition, whether this VM has been allocated 10 gigabytes, 100 gigabytes, or a full terabyte, it still costs the same for your customer, which means more margin opportunity for you.
In terms of performance, let’s talk about industry-leading, world-class performance. I’m going to bring up a standard industry benchmark, Iometer, and run an industry-standard test: 4K random IOPS. I should also mention at this point that the guys at VMware claim they can deliver a maximum of 300,000 IOPS from a single virtual machine. We’re delivering over 1 million IOPS from a single virtual machine. That’s over three times VMware, and folks, I’m just getting warmed up. (Applause.)
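For a sense of scale, 1 million 4K random IOPS is roughly 3.8 GiB/s of sustained random-access throughput. The quick arithmetic:

```python
iops = 1_000_000        # I/O operations per second claimed in the demo
block = 4 * 1024        # 4 KiB per random I/O
bytes_per_sec = iops * block
gib_per_sec = bytes_per_sec / 2**30
print(round(gib_per_sec, 2))  # 3.81
```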
We’ve made some huge investments in storage with transformational new technologies like offloaded data transfer, or ODX. I’m going to copy a 10-gigabyte file without using ODX. You can see that my network utilization has just completely spiked, and I’m saturating my network.
Well, now let’s do that same copy, this time with ODX storage. Folks, I’m copying a 10-gigabyte file in about 10 seconds. This type of performance is unheard of, and with the slew of other capabilities we’re including in Server 2012, it makes Server 2012 a no-brainer for cloud storage. (Applause.)
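ODX avoids the network spike from the first copy by replacing host-mediated reads and writes with a token exchange: the host obtains a token representing the source data from the array, hands that token back as the write, and the array moves the bytes internally. A toy sketch of that token flow; the real mechanism is the SCSI offloaded-data-transfer commands, and the method names here are illustrative:

```python
import secrets

class StorageArray:
    """Toy ODX-style array: the host exchanges a small token instead of
    streaming the file's bytes through its own CPU and network."""

    def __init__(self):
        self.files = {}    # path -> contents, as stored on the array
        self._tokens = {}  # outstanding tokens -> source path

    def get_token(self, path):
        # "Populate token": the array records which data the token names
        # and returns only a tiny opaque handle to the host.
        token = secrets.token_hex(8)
        self._tokens[token] = path
        return token

    def write_via_token(self, token, dest):
        # "Write using token": the copy happens entirely inside the
        # array; the host never reads or writes the 10 GB payload.
        src = self._tokens.pop(token)
        self.files[dest] = self.files[src]

array = StorageArray()
array.files["vm1.vhdx"] = b"...10 GB of disk image..."
tok = array.get_token("vm1.vhdx")       # a few bytes over the wire
array.write_via_token(tok, "vm2.vhdx")  # array-internal copy
print(array.files["vm2.vhdx"] == array.files["vm1.vhdx"])  # True
```

The host's network utilization stays flat because only the token crosses the wire, which is why the demo's 10-gigabyte copy completes in seconds without saturating the network.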
Now, one of the things our joint customers have told us is, quite simply, they want to live beyond their means. They want a data center without boundaries. Well, with you as their cloud broker, let’s transform the industry together. Here is the brand new Windows Azure portal. You can see I’ve got virtual machines, cloud services, SQL databases.
Let’s go ahead and extend my data center to the cloud and create a new virtual machine. I’m going to do so from the gallery, and you can see I’ve got a number of options from Windows Server and from our partners, including Linux distributions.
I’m going to choose my favorite, Windows Server 2012. I’m going to paste in a name. I’m going to provide a password. You can see I can now choose the size of my virtual machine anywhere from extra small to extra large. I’m going to keep the default. It’s now going to ask me for a DNS name, and then it asks me where do I want to deploy this. Lots of different options from around the world. I’m going to go for western United States. And finally, like that, I am deploying new workloads into the cloud.
Now, one of the things you’ve told us and our customers have told us is they want choice and flexibility when it comes to the cloud. They want to be able to run on-premises as well as service provider offerings as well. Well, take a look at System Center. Here we’ve got on-premises cloud, we’ve got service-provider clouds here with Toronto Hoster, and we’ve got Windows Azure. This is exactly what your customers are looking for — a single consistent management interface for all of their cloud resources.
In fact, let’s go ahead and deploy a new cloud service. A click on services, and click on deploy. And notice the first thing I’m going to configure, the first thing it asks me is: Which cloud would you like to deploy on? Would you like to deploy on-premises in Windows Azure or using my service provider? Which is exactly what I’m going to do.
Finally, I select a template, and just like you saw me do in Azure, I can now deploy my virtual machines using a templatized experience. So, we’re taking what we’ve learned in Azure and bringing that to the Microsoft private cloud.
So, what have we seen? Massive scale, industry-leading performance, the ability to manage all of my clouds from a single, consistent management interface. With Windows Azure, System Center, and Windows Server 2012, we are delivering the ultimate cloud OS. Let’s go transform the data center together. Thank you very much. (Applause.)
SATYA NADELLA: Thanks, Jeff. What you saw there was a boundaryless datacenter powered by Windows. And that’s pretty unique: the notion that you can build your own datacenter and use Windows Azure as well as a service provider cloud, with consistency, is something we can do very uniquely. There are no translations of virtualization formats, and you have a single pane of glass for management; those are capabilities that, combined with your own capabilities, we can deliver in the marketplace in a very, very unique way.
So, I want to talk about three specific announcements today relating to our cloud OS. The first is, Windows Server 2012 is getting ready for prime time. The general availability of Windows Server 2012 is going to be September. It’s going to RTM in August. It’s a huge milestone for us. (Applause.)
We’ve had over half a million downloads of the release candidate. We’ve had around 250 customers who have been part of our TAP program who have worked very closely with us, many of them have already gone to production. In fact, Bing today is powered by Windows Server 2012 already. So, we feel very, very good about the robustness of the operating system and we’re looking forward to September.
The second announcement is a new set of services for Windows Server. We’re announcing today the customer technical preview of three capabilities: High-density website hosting, VM or virtual machine hosting, as well as the management and provisioning capabilities on top of those two for Windows Server, specifically focused on service providers.
We have these features in Windows Azure that were announced very recently. As for our promise, we’re now making it available on Windows Server so that anyone else who wants to provide these capabilities to customers as part of their service can do so.
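The "management and provisioning capabilities" mentioned above are what eventually surfaced for service providers as self-service APIs (the Service Provider Foundation API discussed later in this post). As a rough illustration only, with an entirely hypothetical payload shape rather than the real OData contract, a self-service VM provisioning request might be assembled like this:

```python
import json

def build_vm_request(tenant: str, name: str, cpu_cores: int, memory_gb: int) -> str:
    """Assemble a self-service VM provisioning request.

    The payload shape here is illustrative only -- the real Service
    Provider Foundation API defines its own contract.
    """
    if cpu_cores < 1 or memory_gb < 1:
        raise ValueError("a VM needs at least 1 core and 1 GB of memory")
    request = {
        "tenant": tenant,          # which customer the VM belongs to (multitenancy)
        "vmName": name,
        "cpuCores": cpu_cores,
        "memoryGB": memory_gb,
        "hypervisor": "Hyper-V",   # same format on-premises, hosted, or in Azure
    }
    return json.dumps(request, sort_keys=True)

print(build_vm_request("contoso", "web-01", 2, 4))
```

The point of the sketch is the tenant field: every request is scoped to a customer, which is what lets a hosting service provider run many customers on shared Windows Server infrastructure.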
Lastly, I want to talk about the Hyper-V Switch Program. This is another very exciting program for us. We’ve had some tremendous momentum building up with the number of customers who’ve chosen to move to Hyper-V from VMware, especially in anticipation of what’s coming with Windows Server 2012. We’ve already seen some successes like FyrSoft was able to move Pella, one of their customers, from VMware to Hyper-V. We’ve also had Avanade who moved Unilever from VMware to Hyper-V. So, we’ve got great momentum building.
As part of this program, you’ll get tools, resources, and guidance to take the risk out of these migrations. So, we think this next year is going to be a year for us collectively to push this significantly, and we’re going to put all of our might behind this program.
Now, I want to switch to talk about the application platform because that’s the second part of the modern cloud operating system. And when you talk about the application platform, the consideration set is similar to what we talked about in the data center, but abstracted to the application. So, the first thing is you want to have a very rich, capable application platform or rich application services. Things like media services, things like storage, identity, caching, service bus, all of these things need to be available as part of your application platform.
Data needs to be at the core. Now, SQL, obviously, and SQL Server is very, very key, but also things that are non-SQL format, so what we’re doing with Hadoop is going to be part of it, what we’re doing in the higher layers with BI is going to be part of it because there’s not a single application that’s not driven by data going forward.
You need to have a very dynamic life cycle for both development and management; this dev-ops cycle is going to be revolutionary in terms of the cycle times that people will come to expect for what you can do on the cloud, both private and public.
And lastly, not only is it about building and monitoring and managing applications, it’s about being able to take your applications and make them accessible anywhere on any device for any user based on their identity, because you want to have security and not be compromised at all.
So, the things we have done in VDI, the things we have done now with System Center and Intune to manage these modern devices, and Active Directory are all super important for us to deliver a more people-centric device management capability.
The uniqueness in our approach, again, is very similar to what we talked about in the data center, which is you have this application platform pervasive through the world of Windows across your data center, across the service provider cloud as well as Windows Azure. So, you have a complete application platform, so that means you can move an application between any one of these locations. In fact, you can split tiers. You can have a front end in Azure, you can have the back end inside of your private data center; we see many deployments like that today.
You want flexibility in development; of course we’re going to do a fantastic job with .NET and Visual Studio as the framework and tool set, but we will also have support for other frameworks and tools in Java, PHP, and Node, and those are all going to be first class on top of our platform, both Windows Server and Windows Azure.
Common identity, though, is going to be very, very key. Any time you have distributed computing, with applications themselves running in different places, it becomes very important for identity and access control to be governed in a way that nothing in the enterprise gets compromised in terms of data security or application access.
So, to give you a feel for what this people-centric access and device management is, I wanted to invite up on stage Deb McFadden from our team to give you a feel for some of the capabilities we’re building in for new devices.

Windows Server 2012 Powers the Cloud OS [press release, Sept 4, 2012]

New server is built from the cloud up for the modern datacenter.

Today in a global online launch event Satya Nadella, president of Microsoft Server and Tools Business, announced the general availability of Windows Server 2012. In his keynote speech, Nadella described how Windows Server 2012 is a cornerstone of the Cloud OS, which provides one consistent platform across private, hosted and public clouds.

“The operating system has always been the heartbeat of IT and is now undergoing a renaissance in the new world of continuous cloud services, connected devices and big data,” Nadella said. “Microsoft’s unique legacy in the most widely used operating systems, applications and cloud services positions us to deliver the Cloud OS, based on Windows Server and Windows Azure, helping customers achieve a datacenter without boundaries.”

Enabling the Modern Datacenter

Microsoft built Windows Server 2012 from the cloud up, applying its experience operating global datacenters that rely on hundreds of thousands of servers to deliver more than 200 cloud services. Windows Server 2012 expands the definition of a server operating system, with significant new advancements in virtualization, storage, networking and automation. Hundreds of new features can help customers achieve a transformational leap in the speed, scale and power of their datacenters and applications. In combination with Windows Azure and System Center, Windows Server 2012 empowers customers to manage and deliver applications and services across private, hosted and public clouds.

Customers Find Success With Windows Server 2012

Customers can use their existing skills and investments in systems management, application development, database, identity and virtualization to take advantage of Windows Server 2012 and realize the promise of cloud computing. Many enterprise customers are already seeing tremendous value in early deployments. A survey of 70 early adopter customers from across the globe revealed that they expect, on average, 52 percent reduction in downtime, 41 percent reduction in workload deployment time, and 15 hours of productivity time saved per year, per employee. 91 percent of the companies surveyed expect a reduction in server administration labor, and 88 percent expect reduction in network administration labor.*

Menzies Aviation, an airline passenger and cargo handling company that employs more than 17,000 people, is using Windows Server 2012 to provide identity access management and information access policies to its employees as it rapidly incorporates newly acquired businesses.

“We are very impressed by Windows Server 2012 and Microsoft’s overall solution to help us manage our systems and applications across our private cloud environments as they scale with our business,” said Martin Gallington, senior vice president of IT at Menzies Aviation. “This is a dramatic leap forward, matched by a simple, cost-effective pricing model.”

Equifax is a global information solutions provider that organizes and assimilates data on more than 500 million consumers and 81 million businesses worldwide. It now counts on Windows Server 2012 for improved reliability and uptime of its information services to clients.

“Windows Server 2012 revolutionizes how we can operate our datacenter, allowing us to better meet our commitments,” said Bryan Garcia, chief technology officer at Equifax. “The new high availability technologies help us deliver ‘always-on’ applications, and we’re betting on Hyper-V as a critical component of our private cloud strategy. We are gaining tremendous efficiencies, which translate into more time to innovate for company growth.”

More information about Windows Server 2012 and the Cloud OS is available here. Read Satya Nadella’s post on The Official Microsoft Blog here. The conversation on Twitter can be followed at #WinServer.

Founded in 1975, Microsoft (Nasdaq “MSFT”) is the worldwide leader in software, services and solutions that help people and businesses realize their full potential.

* “Windows Server 2012 Rapid Deployment Program: TCO Study Whitepaper,” Microsoft Corp., June 2012

From: Satya Nadella: Windows Server 2012 Launch Keynote [speech transcript, Sept 4, 2012]

VOICE: For more than 50 years information technology has powered global innovation. And today, IT is in the midst of radical change as cloud computing transforms the landscape.
How can your organization take advantage of the new opportunities? Imagine data centers without boundaries, capacity on-demand. Imagine information crossing the globe seamlessly and securely, a modern platform for the world’s applications.
At Microsoft we unlock the full range of possibilities. We call it the Cloud OS and it’s here now.
SATYA NADELLA: Hello and welcome to the Windows Server 2012 launch. I’m really excited to have a chance to talk about Windows Server, it’s an epic moment for us, it’s been four years in the making, and talk about our broader vision with the Cloud Operating System.
We’re going through a great transformation across the industry, from client-server to the world of connected devices and continuous services.
If you look at the transformation, it’s across the entire ecosystem. You have more and more users connected to the Web and the Internet. The number of devices these users are using is absolutely exploding. If you look at the total number of devices users are expected to be using by calendar year ’15, we expect twice as many devices as there are people on the planet, and this is before adding all of the industrial equipment and sensors that will also be connected.
But the real transformation begins when applications are being delivered to these devices that are all powered by continuous services. And these applications themselves are using data. They’re generating lots and lots of data, as well as reasoning on top of all of this data and powering new features of intelligence and social capabilities in these applications.
The real news, of course, is that all of this transformation is being powered by servers that are delivering these applications, making all of this transformation possible.
So, with that industry transformation in mind, we have really set out to build the Cloud Operating System.
When we talk about the Cloud Operating System we have four key things in mind as we develop our innovations.
The first one is the transformation of the data center. You need to be able to take all of the resources in the data center — storage, network, compute — bring them together, share them, and make sure that great utilization is achieved.
You want to be able to scale this up, and scale with elasticity, so that any given application can have effectively infinite scale on demand.
You want to have always up and always on. In other words, you want to have software power the resilience of your applications.
You want to be able to automate everything in the data center through APIs and self-service.
And we want to enable modern applications on top of all of this infrastructure. You want to have a rich set of new runtime services that enable your social, mobile, as well as big data applications.
You want to have the flexibility in the tooling and the development environments so that you can build these applications quickly.
You want to have a great rapid development lifecycle, a lifecycle that brings together both developers and management professionals to have dev ops lifecycles that are much more in tune with the dynamism that’s part of your business.
You want to have people really empowered to bring their own devices into the enterprise. For example, you want to be able to personalize every experience, every application on any device they may be using anywhere. At the same time, you want to have IT be able to have the control and the governance needed to make sure that there is security in access and all of the devices are well-managed.
And lastly, data is very much first class in this new world of the Cloud Operating System. You want to be able to support any data in any size, so from SQL to NoSQL. You want to be able to connect to the world’s information. You want to be able to blend the data that you have in your enterprise with the world’s information to create new value.
With the work we are doing in SQL as well as Hadoop we want to make sure we have support for any data, any size, anywhere.
Finally, you want not only to have lots of data, but to get real insight from it. So, creating immersive experiences for users around data is the most important thing that the Cloud Operating System needs to provide.
That is the vision that’s really driving us to build a comprehensive Cloud Operating System platform, and deliver it with all of the flexibility choices that you as customers require.
So, that means you can deploy this Cloud Operating System in your data center, you can consume it from the partner data centers, or use it from Windows Azure.
But the consistency that we bring by ensuring the commonality of virtualization, no transformations required of hypervisor formats, management infrastructure, development infrastructure, data itself as well as identity, these commonalities are unique attributes to our Cloud Operating System vision that Microsoft brings to you.
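Nadella’s "no transformations required of hypervisor formats" point can be made concrete with a small sketch. The types and field names below are hypothetical, not a Microsoft API; the idea is simply that one VM definition is handed unchanged to any of the three deployment targets:

```python
from dataclasses import dataclass, asdict

# Hypothetical model of the "no format translation" claim: one VM
# definition, reused verbatim whether it lands in your own data center,
# a service provider's cloud, or Windows Azure.

@dataclass(frozen=True)
class VmDefinition:
    name: str
    vhd_path: str      # same VHD/VHDX virtual disk everywhere
    cpu_cores: int
    memory_gb: int

def deploy(vm: VmDefinition, target: str) -> dict:
    """Target one of the three clouds; the VM definition is untouched."""
    assert target in {"private", "service-provider", "azure"}
    return {"target": target, "vm": asdict(vm)}  # no translation step

vm = VmDefinition("web-01", "web-01.vhdx", 2, 4)
private = deploy(vm, "private")
public = deploy(vm, "azure")
assert private["vm"] == public["vm"]   # identical definition either way
```

Contrast this with moving a VM between different hypervisors, where the disk and configuration would first have to be converted between formats.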
And that is the context with which we bring Windows Server 2012 to the market today. Today, is a momentous day for us because we are announcing the availability of Windows Server 2012. It’s perhaps the biggest release of our server product in our history.
I was here at Microsoft when we launched Windows NT, which ushered in the era of client-server, and we believe that Windows Server 2012 ushers in the era of the Cloud Operating System, as we continue to use the power of software to make sure you can build the applications and the infrastructure needed in this new era.
This particular release has been four years in the making. It’s got many, many features across all of the dimensions we talked about, and we’re really excited to have an opportunity to get this to market. And many customers are already using this operating system in production.
But one of the things that’s really driving a lot of the innovation in Windows Server is the feedback loop we have internally between our first-party Internet-scale properties and the feature innovation in Windows Server.
For example, Bing already has deployed Windows Server 2012 in production. So, that means our hypervisor, our .NET runtimes are all battle tested with Internet scale production services, and those capabilities are things now that you can deploy in your data centers.
And this, of course, extends to Office 365 and their use of Active Directory, Xbox Live and their use of capabilities like the virtual GPU capabilities that are there inside of Windows Server, what we have done with Outlook and Dynamics; all of these services are really consuming capabilities of Windows Server and in turn making Windows Server more robust, more capable.
We’ve had a very comprehensive early access program for Windows Server 2012 as we were building it and developing it. And so many customers had a chance to deploy Windows Server 2012 in their development environments, as well as in production, and many of these customers are already reaping the benefits of Windows Server 2012, and I wanted to have a chance to share some of those experiences with you.
(Begin video segment.)
Equifax: What we’re seeing today with Windows Server 2012 is revolutionary. It’s not just a couple little fixes, it’s a huge release and it’s a huge different experience than what we’ve had before.
Marquette University: With Server 2012 I think Microsoft has pretty much leveled the field when it comes to the hypervisor with virtualization.
EmpireCLS: Hyper-V Replica now allows us to move virtual machines through both our private and public cloud locations seamlessly while they’re live. We’re seeing very significant performance improvement across the globe.
Outsourcery: Our customers can run applications and services and actually achieve more in their business using the power of Windows Server and System Center together.
BMO Capital Markets: I don’t see any workloads known to mankind that cannot be ported to a Microsoft platform at this point.
EmpireCLS: Windows Server 2012 makes my life easier when I don’t have to worry about things after hours, things I know are running, they’re solid.
Equifax: With Windows Server 2012 we don’t need to be concerned in the private development team about our maintenance windows.
Menzies Aviation: Active Directory and Windows Server 2012 together with Hyper-V, virtualizing it, it brought so much more security.
Rackspace: With Windows Server 2012 it’s building a great foundation for the future.
(End video segment.)
SATYA NADELLA: The industry is also ready for Windows Server 2012. We’ve had a comprehensive program to work with all the constituents of the broad ecosystem around Windows Server, from hardware manufacturers who have optimized their hardware for Windows Server 2012 to take advantage of some of the new capabilities around storage, around networking, around compute and management.
We’ve got service providers all over the world already deploying Windows Server 2012. So, that means the service provider cloud is available with Windows Server 2012.
We’ve had ISVs who have certified their applications for Windows Server 2012, and they’re also taking advantage of new capabilities in Windows Server 2012 to exploit the advances there, as well as we have a broad set of partners who are now capable of deploying, helping you upgrade, helping you migrate and take advantage of Windows Server.
So, in conclusion, I wanted to talk about our commitment to the Cloud Operating System.
We are really excited about this new era of the Cloud Operating System, because we believe that it delivers the most comprehensive and consistent platform that is required of modern data centers and modern applications.
We’re unique in the feedback cycle that we have with the Internet-scale properties that are really driving some of the innovations in the Cloud Operating System, and we’re also unique in the fact that we can provide the Cloud Operating System in your data center, in partner data centers, as well as with Windows Azure. And by doing so we believe that you will get the best economics and the best flexibility needed in order to move your data center and your applications forward.
And now you’ll have a chance to hear from Bill Laing, Scott Guthrie, and Brad Anderson on more of the details of the Cloud Operating System and the innovations that are built into Windows Server 2012.
And so I hope you have a chance to look at all of those presentations, as well as take a fresh look at Windows Server 2012 and what it can do for your modern data centers, as well as your modern applications.
Thank you very much.
END

Microsoft Reaches Agreement to Acquire StorSimple [press release, Oct 16, 2012]

Microsoft to acquire leader in Cloud-integrated Storage.

Microsoft Corp. and StorSimple Inc. today announced that Microsoft has reached a definitive agreement to acquire StorSimple, a leader in Cloud-integrated Storage (CiS) solutions. The addition of CiS will advance Microsoft’s Cloud OS vision and help customers more efficiently embrace hybrid cloud computing.

“Customers faced with explosive growth in data are looking to the cloud to help them store, manage and archive that data. But, to be effective, cloud storage needs to integrate with IT’s current investments,” said Michael Park, corporate vice president, Server and Tools Division for Microsoft. “StorSimple’s approach helps customers seamlessly integrate on-premises storage with cloud storage through intelligent automation and management.”

StorSimple solutions combine the data management functions of primary storage, backup, archive and disaster recovery with cloud integration, enabling customers to optimize storage costs, data protection and service agility. With its unique cloud snapshot capability, StorSimple automatically protects and rapidly restores production data using public clouds. Large enterprises across many vertical markets, including retail, oil and gas, manufacturing, consumer goods, healthcare, and financial services, have made their first public cloud deployments using StorSimple.

“Most StorSimple customers are mainstream IT organizations that have chosen Windows Azure as their primary cloud. We are excited to continue to work with Microsoft and bring the combined benefits of StorSimple and Windows Azure to customers around the world,” said Ursheet Parikh, co-founder and CEO, StorSimple.

Terms of the deal were not disclosed.

About StorSimple

StorSimple (www.StorSimple.com) is the leader in cloud-integrated storage for Windows. StorSimple securely and transparently integrates cloud storage for on-premises applications and offers a single appliance that delivers high-performance tiered local and cloud storage, live archiving, cloud-based data protection and disaster recovery. StorSimple has uniquely achieved the most stringent “Certified for Windows Server 2008” and was named the Microsoft BizSpark Partner of the Year 2011. StorSimple appliances were named 2011 Products of the Year in the Storage Systems category by Storage Magazine/SearchStorage.com.

For more information, visit www.StorSimple.com.

From: Satya Nadella, Scott Guthrie and Jason Zander: Build Day 2 [speech transcripts, Oct 31, 2012]

On the client side, we’ve talked about how we’ve reimagined Windows from the developer platform to the user experience. Again, in support of the kind of applications that you’re building for the devices today. These fluid, touch-first applications that also take advantage of capabilities in Windows RT to be able to truly bring to life all the new application capabilities.
Very similarly on the back end, we’re reimagining Windows for cloud services. It’s a pretty concrete thing for us. We refer to this as the Cloud OS. At the hardware level, for example, the core of any operating system is to think about the hardware abstraction. And the hardware abstraction is going through a pretty radical change. At the atomic level, we’re bringing compute, storage and network together, and then scaling it to datacenter and multidatacenter scale. So this is no longer about a single-server operating system; it’s about building distributed, virtualized infrastructure that includes storage, compute, and network and spans, if you will, multiple datacenters.
We want to be able to have the richness at the data tier that supports the richness in your application. You have SQL data, you have NoSQL data, you want to have all kinds of different types of processing capabilities, you want to have a rich data platform, and that’s something that we’re building right into the operating system.
Beyond that, of course, the thing that developers interface with is the app platform, and this is perhaps the place where we’re having the most radical changes. We want to create a very new way for you to interact with data. Your applications are going to reason over large amounts of data, you’re going to build these data-parallel applications that are driving new features in your applications. You’re going to have very rich semantics around identity, social, sharing, that you want to be able to get onto the platform.
And then your middle tiers are much more sophisticated. You’re really going to be able to build middle tiers that scale with the proliferation of devices, can manage the complexities of variety of different applications and the requests you get from your clients. Because that’s the sophistication we want to build into our platform.
Lastly, of course, when you build these applications, you also want to make sure that you can deliver these applications to the devices you’re targeting with personalization and security so that, again, you want to really reimagine things like identity so that both the end user is very happy with the application you produced, but as well as the operators and IT professionals are happy and feel secure in terms of how you’re projecting your applications and the data contained inside the applications to these devices.
So that’s what we refer to as the Cloud OS. It manifests both in Windows Server and Windows Azure for us. And, in fact, it’s not either/or. In fact, many of the application developers will take advantage of both because when you’re building a distributed application, it’s not going to be in one datacenter, it’s not going to be on one cluster. You do need to be able to span the globe on the Web.
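The "data-parallel applications" mentioned in the transcript above follow a simple pattern: partition the data, process the partitions concurrently, then merge the partial results. A minimal, illustrative sketch using only the Python standard library (a real deployment would shard across machines with something like Hadoop, which the transcript mentions):

```python
from concurrent.futures import ThreadPoolExecutor
from collections import Counter

def count_words(chunk):
    """Map step: count word occurrences in one partition of the data."""
    counts = Counter()
    for line in chunk:
        counts.update(line.split())
    return counts

def word_count(lines, partitions=4):
    """Partition the data, count each partition concurrently, merge the results."""
    size = max(1, len(lines) // partitions)
    chunks = [lines[i:i + size] for i in range(0, len(lines), size)]
    total = Counter()
    with ThreadPoolExecutor() as pool:
        for partial in pool.map(count_words, chunks):  # parallel map step
            total += partial                           # reduce (merge) step
    return total

data = ["cloud os", "cloud scale", "os scale cloud"]
print(word_count(data)["cloud"])  # -> 3
```

Threads keep the sketch portable; the same map-and-merge structure is what scales out when each partition lives on a different node.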

alias “OS Moment PR” Microsoft Advances the Cloud OS With New Management Solutions [press release, Jan 15, 2013]

New offerings deliver on the commitment to help customers and partners deliver cloud services and manage connected devices.

Microsoft Corp. today announced the availability of new solutions to help enterprise customers manage hybrid cloud services and connected devices with greater agility and cost-efficiency. System Center 2012 Service Pack 1 (SP1), the enhanced Windows Intune, Windows Azure services for Windows Server and other new offerings deliver against the Microsoft Cloud OS vision to provide customers and partners with the platform to address their top IT challenges.

“With Windows Server and Windows Azure at its core, the Cloud OS provides a consistent platform across customer datacenters, service provider datacenters and the Microsoft public cloud,” said Michael Park, corporate vice president of marketing for Server and Tools, Microsoft. “Powerful management and automation capabilities are key elements of the Cloud OS, taking the heavy lifting out of administration and freeing IT organizations to be more innovative as they embrace hybrid cloud computing and the consumerization of IT.”

Park today wrote about the Cloud OS on The Official Microsoft Blog here.

Transforming the Datacenter

Using System Center 2012 SP1 with Windows Server 2012, customers can shift from managing datacenter components separately to delivering resources as a whole, including networking, storage and compute. Cloud infrastructure capabilities such as multitenancy, software-defined networking and storage virtualization are built in and ready for automated, hybrid cloud environments.

With the updated System Center, customers can centrally manage cloud-based applications and resources running in their datacenters, on a hosted service provider datacenter or on Windows Azure. By integrating service provider cloud capacity and management directly into their operations, enterprises can extend their datacenter capabilities. Administrators can move virtual machines to Windows Azure and manage them from within System Center, based on their needs.

Customers can also use System Center 2012 SP1 to back up their servers to Windows Azure to help protect against data loss and corruption. In addition, SP1 supports Global Service Monitor, a new Windows Azure-based service available for trial evaluation today, which provides Web application performance measurement from a user’s perspective.

Hosting Service Providers and the Cloud OS

Hosting service providers play a key role in the Cloud OS with the opportunity to deliver new solutions, attract more customers and grow revenues. With Windows Server 2012 and System Center 2012 SP1, they can build multitenant, massive-scale cloud services that interoperate with customer datacenter operations. For example, System Center 2012 SP1 delivers a Service Provider Foundation API, which hosting partners can use to give customers self-service management of hosted infrastructure and applications.

Microsoft today released Windows Azure technologies that hosting service providers can run on their own Windows Server 2012 infrastructure for high-scale website and virtual machine hosting services. These capabilities are specifically designed for easy incorporation into hosting service providers’ offerings for deployment to their customer bases.

Unified PC and Device Management

With the new release of the Windows Intune service and System Center 2012 Configuration Manager SP1, enterprise customers can centrally manage a full array of PCs, laptops and mobile devices. With one management console, IT organizations can crack the bring-your-own-device challenge, helping ensure secure and productive employee experiences with applications and data on virtually any device, anywhere.

Working as a unified solution, Windows Intune and System Center Configuration Manager provide a comprehensive approach to better securing and managing the new generation of powerful Windows 8 PCs, Windows RT tablets and Windows Phone 8 smartphones, as well as the diversity of other platforms in today’s modern enterprise.

More information about System Center 2012 SP1 is available at http://www.microsoft.com/systemcenter, and more information about Windows Intune is available at http://www.microsoft.com/intune. Those interested can follow the conversation on Twitter at #CloudOS, @WindowsServer, @WindowsIntune and @MSServerCloud.

What is the Cloud OS? [The Official Microsoft Blog, Jan 15, 2013]

post from Michael Park, Corporate Vice President of Marketing in the Server & Tools Business at Microsoft

We all know change is constant, especially in technology. Managing through change is always a challenge, but over the past 20 years I’ve found it to be one of the most rewarding aspects of my career in the tech industry.

During the past six months I’ve been talking to IT executives and partners about the big changes and trends in enterprise IT, such as the various cloud computing models, the consumerization of IT, the new generation of connected applications and big data. I’ve shared with them our vision of what we call the Cloud OS and the feedback has been very positive. They see it as a differentiated approach from Microsoft that will help them embrace the transformational changes happening now. Today, Microsoft announced several new products and services that deliver against the Cloud OS, so I thought I’d take this opportunity to further explain it to readers of this blog.

At the highest level, the Cloud OS does what a traditional operating system does – manage applications and hardware – but at the scope and scale of cloud computing. The foundations of the Cloud OS are Windows Server and Windows Azure, complemented by the full breadth of our technology solutions, such as SQL Server, System Center and Visual Studio. Together, these technologies provide one consistent platform for infrastructure, apps and data that can span your datacenter, service provider datacenters, and the Microsoft public cloud.

Key to this consistent platform is a set of common technologies and capabilities that extend across those three datacenters.

  • Flexible development allows your organization’s developers to use their choice of tools, languages – Microsoft or open source – and open standards to quickly build apps, connect them with other apps and data, and then deploy on premises, in the cloud or in a hybrid model. Visual Studio and Team Foundation Server enable application lifecycle management, from the idea to deployment of an app.
  • Unified management with System Center and Windows Intune gives your administrators a single pane of glass to manage applications, systems and devices across private, hosted and public clouds.
  • Active Directory and Windows Azure AD provide a powerful base for single identity across clouds to securely extend applications to people and their devices.
  • Integrated and portable virtualization, built into Windows Server, allows your team to virtualize not just servers, but also the network, storage and applications across clouds.
  • Last but not least, a complete data platform powered by SQL allows you to manage petabytes of data, power mission-critical applications and give businesspeople BI solutions with a range of tools – Excel all the way up to Hadoop.

So, what does the Cloud OS mean to IT organizations?

It means your organization can shift to more efficiently managing datacenter resources as a whole, including networking, storage and compute. You will be able to deliver powerful apps that boost employee productivity and delight your customers much, much faster across private, hybrid and public clouds. Further, it means you can manage data, both big and small, to extract the story it has to tell for your business. And you will be able to give employees personalized experiences with apps and data on virtually any device, while maintaining security and compliance.

One of the reasons I believe Microsoft is uniquely positioned to deliver on the promise of the Cloud OS is that our products and services are deeply informed by our first-hand experience in running some of the largest Internet-scale services in the world. Running more than 200 cloud services for over 1 billion customers and 20+ million businesses around the world has taught us – and teaches us in real time – what it takes to architect, build and run applications and services at cloud scale.

We take all the learning from those services into the engines of the Cloud OS – our enterprise products and services – which customers and partners can then use to deliver cloud infrastructure and services of their own. It’s a virtuous cycle of development. Combine that with the established enterprise credibility of our products, such as Windows Server, which runs tens of millions of servers around the world, and you can see why customers can truly bet on Microsoft in the cloud era. Our breadth of experience across private, public and hybrid cloud is unmatched, whereas other vendors tend to specialize in one or another area.

I hope this introduction provides a good sense of what we mean by Cloud OS and the opportunity it presents to enterprise customers and individual IT managers. By embracing Microsoft’s approach, IT professionals can evolve from IT administrator to cloud innovator, and better assure themselves a career path into the future.

The Cloud OS: New solutions available today advance Microsoft’s vision [C&E News Bytes Blog, Jan 15, 2013]

Today Microsoft announced the release of new products and services available that further deliver against the company’s Cloud OS vision. Both customers and partners can capitalize on cloud opportunities with System Center 2012 Service Pack 1 (SP1), the new Windows Intune, and Windows Azure services for Windows Server.

In a blog post, Michael Park, corporate vice president of Server and Tools Marketing, outlines the Microsoft Cloud OS vision to provide customers with one consistent platform for infrastructure, apps and data – spanning customer datacenters, hosting service provider datacenters, and the Microsoft public cloud.

Solutions announced today include:

  • General Availability of System Center 2012 Service Pack 1: This update brings the full range of System Center management to Windows Server 2012 for private and hybrid cloud-based computing. It provides a single tool to manage cloud-based applications and resources running in a private, hosted or public cloud. Learn more on the Server & Cloud blog.
  • Windows Intune: The new Windows Intune cloud-based service and System Center Configuration Manager are a unified PC and mobile device management solution. Together they provide a comprehensive approach to securing and managing the new generation of powerful Windows 8 PCs, Windows RT tablets and Windows Phone 8 smartphones, as well as the diversity of other platforms in today’s modern enterprise, including Android and iOS. More details can be found on the Windows Intune blog.
  • Windows Azure services for Windows Server: These new technologies allow hosting service providers to use their Windows Server datacenters to provide the same high-scale web site and virtual machine hosting capabilities announced this past summer in Windows Azure. These capabilities are specifically designed for easy incorporation into hosting service providers’ offerings for deployment to their customer base. More details are available on the Hosting Insights blog.
  • Global Service Monitor: System Center 2012 SP1 includes support for a new Windows Azure-based service called System Center Global Service Monitor (GSM) to evaluate and boost the performance of Web apps. GSM extends the application monitoring capabilities in System Center 2012 SP1 using Windows Azure locations around the globe, giving a true reflection of end-user application experiences. GSM is now available for trial and will be broadly available in March. Read Brian Harry’s blog to learn more about Microsoft application lifecycle management for “DevOps” solutions.
  • System Center Advisor: This Windows Azure-based management solution enables IT departments to assess server configurations and proactively avoid problems, help to resolve issues faster and reduce downtime. System Center Advisor is now available to all Microsoft customers, not just those with Software Assurance.

More information on today’s news, including the press release, can be found on the Microsoft Virtual Cloud Press Room. You can also follow the conversation on Twitter at @WindowsServer and through #CloudOS.

Transform Your Datacenter with System Center 2012 SP1 [Server & Cloud Blog, Jan 15, 2013]

Mike Schutz
General Manager, Windows Server and Management Product Marketing

Today we announced the final release of System Center 2012 Service Pack 1, which helps deliver on Microsoft’s Cloud OS vision to provide customers with one consistent platform for infrastructure, apps, and data – spanning customer datacenters, hosting service provider datacenters, and the Microsoft public cloud. Together with Windows Server 2012, System Center 2012 SP1 helps customers take advantage of the latest technical advances in storage, networking, virtualization, and management to transform their datacenter into a resilient platform for long-term growth while enabling future expansion into the cloud.

System Center 2012 SP1 is now available for download here.

System Center 2012 uniquely delivers unified infrastructure, application and cloud management capabilities in a single product offering and this blog post highlights new cloud and datacenter capabilities introduced in SP1. For information about Configuration Manager enhancements in SP1 and related Windows Intune capabilities, please read the client management blog post.

Windows Server 2012 and SQL Server 2012 Support
Windows Server 2012 and SQL Server 2012 were both groundbreaking releases, and they provide the infrastructure and application-platform foundation for SP1. With the release of SP1, all System Center 2012 components are now enabled to run in a Windows Server 2012 environment and provide management capabilities for Windows Server 2012 Hyper-V, Windows Server 2012 servers, guest operating systems and applications. With SP1, a single instance of Virtual Machine Manager now supports up to 8,000 VMs on clusters of up to 64 hosts, and customers can easily extend beyond these limits with multiple instances of VMM, enabling datacenter management at large scale. SP1 also now supports the use of SQL Server 2012 as a repository for System Center 2012 components.
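To put those scale limits in concrete terms, here is a quick back-of-the-envelope calculation. This is a minimal Python sketch; the helper names are ours for illustration, not part of any Microsoft tooling.

```python
import math

# Published SP1 scale limits from the post above
MAX_VMS_PER_VMM = 8000       # VMs supported by a single VMM instance
MAX_HOSTS_PER_CLUSTER = 64   # hosts per cluster

def vmm_instances_needed(total_vms: int) -> int:
    """Minimum VMM instances to stay within the 8,000-VM-per-instance limit."""
    return math.ceil(total_vms / MAX_VMS_PER_VMM)

def clusters_needed(total_hosts: int) -> int:
    """Minimum clusters to stay within the 64-host-per-cluster limit."""
    return math.ceil(total_hosts / MAX_HOSTS_PER_CLUSTER)

print(vmm_instances_needed(20000))  # 3 instances for a 20,000-VM estate
print(clusters_needed(200))         # 4 clusters for 200 hosts
```

So an estate of 20,000 VMs on 200 hosts would sit comfortably within three VMM instances and four clusters, which is the "multiple instances of VMM" scale-out path the post describes.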

Software Defined Networking (SDN)
Traditionally, networks have been defined by their physical topology – how the servers, switches, and routers were cabled together and configured. That meant that once you built out your network, changes were costly and complex. SDN addresses these limitations by using software to configure end hosts and physical network elements, dynamically adjusting policies for how traffic flows through the network, and creating virtual network abstractions that support real-time VM placement and migration throughout the datacenter.

Windows Server 2012 Hyper-V Network Virtualization already delivers network flexibility by enabling multi-tenant virtual networks on a shared physical network, entirely defined in software. Each tenant gets a complete virtual network, including multiple virtual subnets and virtual routing, defined in a ‘policy.’ In SP1 we’ve built on this, adding management capabilities to simplify the definition and dynamic re-configuration of entire networks. By applying VM placement decisions and the policy updates together, SP1 provides a high degree of agility, automation and centralized control, essential to the smooth operation of a modern datacenter.
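The core idea behind these software-defined tenant networks is a policy that maps each tenant's own (possibly overlapping) IP addresses onto addresses on the shared physical network, so a VM can migrate without the tenant noticing. The sketch below is a purely illustrative Python model of that mapping; the class and method names are hypothetical and not any Hyper-V or System Center API.

```python
# Toy model of a network virtualization policy: each tenant's customer
# addresses (CA) map to provider addresses (PA) on the shared physical
# network. Illustrative only; not a real Hyper-V API.

class VirtualNetworkPolicy:
    def __init__(self):
        # (tenant_id, customer_ip) -> provider_ip
        self._mappings = {}

    def add_mapping(self, tenant_id, customer_ip, provider_ip):
        self._mappings[(tenant_id, customer_ip)] = provider_ip

    def move_vm(self, tenant_id, customer_ip, new_provider_ip):
        # VM migration: only the provider address changes; the tenant's
        # address is untouched, which is why tenants can keep overlapping
        # IP ranges on one physical network.
        self._mappings[(tenant_id, customer_ip)] = new_provider_ip

    def resolve(self, tenant_id, customer_ip):
        return self._mappings[(tenant_id, customer_ip)]

policy = VirtualNetworkPolicy()
# Two tenants reuse the same 10.0.0.5 address without conflict
policy.add_mapping("contoso", "10.0.0.5", "192.168.1.10")
policy.add_mapping("fabrikam", "10.0.0.5", "192.168.1.11")
policy.move_vm("contoso", "10.0.0.5", "192.168.2.20")  # migrate to a new host
print(policy.resolve("contoso", "10.0.0.5"))   # 192.168.2.20
print(policy.resolve("fabrikam", "10.0.0.5"))  # 192.168.1.11
```

The "policy updates" SP1 applies alongside VM placement decisions amount to keeping exactly this kind of mapping consistent across every host in the datacenter.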

Windows Server 2012 also introduces the Hyper-V Extensible Switch which provides a platform through which our partners can extend SDN policies within the switch. One of the most common use cases for this extensibility is to integrate the virtual switch with the rest of the physical network infrastructure. SP1 manages Hyper-V switch extensions to ensure that as VMs migrate, their destination host is configured with the required switch extensions.

SP1 adds management support for isolated tenant networks, IP Virtualization, switch extensions, and logical switches. Our approach allows partners to enlighten their network software and network equipment to participate in, support, and augment the multi-tenant datacenter brought about by Hyper-V Network Virtualization. We think that this open approach, our focus on standards, and our close partnerships across the industry make our solution particularly compelling for customers. Learn more.

DevOps with Global Service Monitor and Visual Studio
Application development and IT operations are being drawn together in organizations that want to improve SLAs and take new capabilities live faster. SP1 includes support for the new Windows Azure-based service called “Global Service Monitor” (GSM). GSM extends the application monitoring capabilities in System Center 2012 SP1 using Windows Azure points of presence around the globe, giving a true reflection of the end-user experience of your application. Synthetic transactions are scheduled using your on-premises System Center 2012 SP1 Operations Manager console; the GSM service executes the transactions against your web-facing application and reports back the results (availability, performance, functionality) to your on-premises System Center dashboard. You can integrate this perspective with other monitoring data from the same application, taking action as soon as any issues are detected in order to maintain your SLA.
GSM enables a 360-degree monitoring perspective for your web applications and, with Microsoft’s developer and ALM tools, forms part of a broader DevOps solution. Learn more.
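The synthetic-transaction pattern GSM is described as using can be sketched in a few lines: run the same scripted check "from" several points of presence and roll the results up into one availability view. The Python below is purely illustrative; the probe is a stand-in function, not a real GSM or Windows Azure API, and the location names are invented.

```python
# Sketch of the synthetic-transaction pattern: run one scripted check from
# several locations and aggregate availability. Illustrative only.

def probe(location, check):
    """Run one synthetic transaction 'from' a location; returns (ok, latency_ms)."""
    return check(location)

def global_availability(locations, check):
    results = {loc: probe(loc, check) for loc in locations}
    up = sum(1 for ok, _ in results.values() if ok)
    return results, 100.0 * up / len(results)

# Fake check: pretend the app is unreachable from Singapore only
def fake_check(location):
    return (location != "Singapore", 120)

results, availability = global_availability(
    ["Chicago", "Amsterdam", "Singapore", "Sao Paulo"], fake_check)
print(f"{availability:.1f}% of locations see the app as up")  # 75.0%
```

A per-location breakdown like `results` is what lets a dashboard distinguish "the app is down" from "the app is unreachable from one region", which is the point of monitoring from points of presence around the globe.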

Hybrid Cloud Management
SP1 extends the support in System Center 2012 to help integrate off-premises resources into the datacenter while retaining the same ‘single pane of glass’ common management interface:

  • Enterprise Consumption of Hosted Cloud Capacity
    System Center 2012 introduced the App Controller component to enable organizations to optimize resource usage across their private cloud and Windows Azure resources from a single pane of glass. In SP1, we’ve extended App Controller’s capabilities to integrate cloud resources offered by hosting service providers, giving you the ability to manage a wide range of custom and commodity IaaS cloud services centrally from the same management console. SP1 also introduces additional capabilities for multi-tenant and hosting scenarios. Learn more.
  • Windows Azure Virtual Machine Management
    The App Controller component in SP1 integrates with the preview of Windows Azure Virtual Machines enabling you to migrate on-premises Virtual Machines to run in Windows Azure and manage them from your on-premises System Center installation. This new functionality enables a range of workload distribution and remote operations scenarios.
  • Enhanced Backup and Recovery Options
    The Data Protection Manager component in SP1 adds the option to host server backups in the Windows Azure cloud (in supporting countries), helping to protect against data loss and corruption while integrating directly into the existing backup administration interface in System Center. Learn more.
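The backup option described above follows a familiar hybrid pattern: keep a local copy for fast restores and an off-site cloud copy to survive local data loss. Here is a minimal Python sketch of that pattern; it is illustrative only and not the Data Protection Manager or Windows Azure backup API.

```python
# Minimal model of hybrid backup: a fast local copy plus an off-site cloud
# copy. Illustrative only; names are hypothetical.

class HybridBackup:
    def __init__(self):
        self.local = {}   # fast restores, but lost if the datacenter fails
        self.cloud = {}   # slower, off-site copy

    def backup(self, name, data):
        self.local[name] = data
        self.cloud[name] = data  # replicated off-premises

    def restore(self, name):
        # Prefer the local copy; fall back to the cloud copy if it is gone.
        if name in self.local:
            return self.local[name]
        return self.cloud[name]

store = HybridBackup()
store.backup("payroll.db", b"records-v1")
store.local.clear()                 # simulate local loss or corruption
print(store.restore("payroll.db"))  # recovered from the cloud copy
```

The value of integrating this into the existing backup administration interface is that the fallback path is exercised through the same console an administrator already uses for local restores.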

Learn more about System Center 2012 SP1 and start your evaluation.

Delivering Unified Device Management with Windows Intune and System Center 2012 Configuration Manager SP1 [Windows Intune blog, Jan 15, 2013]

Mike Schutz

General Manager, Windows Server and Management Product Marketing

Back in September, we announced our strategy around unified device management, and how the next releases of Windows Intune and System Center 2012 Configuration Manager will deliver on that vision. As part of today’s update to our Cloud OS vision, we’re pleased to announce that System Center 2012 Configuration Manager and Endpoint Protection Service Pack 1, as well as the latest Windows Intune service, are available today.

Together, these releases deliver a unified device management solution for the enterprise, built on a “people-centric” model, where the user is the focus, not the device. IT is able to provide users with access to the corporate resources (applications and data) they need on the devices they choose. Administrators can address the unique challenges created by Bring Your Own Device policies by identifying and managing endpoint devices, including Windows PCs (physical and virtual), tablets, smartphones, Macs, and embedded devices, all through a unified administration console.

This blog post highlights new device management capabilities in Windows Intune and System Center 2012 Configuration Manager SP1. For information about new cloud and datacenter capabilities, please read the blog post located here.

Windows Intune addresses new challenges IT departments face when managing devices, including:

  • Providing management and software distribution across a range of mobile devices and platforms, including Windows RT, Windows Phone 8, Android, and iOS
  • Through integration with Configuration Manager 2012 SP1, IT administrators will be able to manage both corporate- and personally-owned devices with a single console, making it easier to identify and enforce compliance
  • A self-service portal for selecting and installing company apps

With the latest release, the Windows Intune service has expanded to 45 additional countries, taking the total to 87 countries worldwide.

Configuration Manager 2012 SP1 contains several enhancements, including:

  • Support for Windows 8 and Windows Server 2012, including delivery of Windows 8 applications, the ability to limit downloads on 3G and 4G network connections to prevent unwanted data charges, and support for Windows To Go
  • Native management of Windows Embedded devices
  • Support for PowerShell for administrative tasks
  • Windows Azure-based Distribution Points
  • Support for Mac OS X devices and Linux and Unix servers

Endpoint Protection 2012 SP1 contains enhancements, including:

  • Ability to automatically deploy definition updates three times per day
  • Real-time administrative actions to update definitions, scan, and remediate issues quickly
  • Client-side merge of antimalware policies

For more information and to sign up for a free 30-day trial subscription to Windows Intune, click here. SP1 can be downloaded by MSDN and TechNet subscribers as well as through the Volume Licensing Software Center.

Modern Lifecycle on the Cloud OS [Brian Harry’s blog, Jan 15, 2013]

Brian Harry
Microsoft Technical Fellow,
Product Unit Manager for Team Foundation Server.

Microsoft announced a bunch of new releases today, advancing our Cloud OS vision. You can read more on the overall announcements here but I will focus on some new/improved application lifecycle management scenarios that support our DevOps initiatives.

The cloud continues to drive new demands on application development. It enables new ways of conceiving, building and delivering great app experiences. It does this by enabling rapid cycle times from idea –> delivery –> feedback. In order to really realize this virtuous cycle, you need to rethink the way you do development. Among the changes you need to consider is how your development, test and operations teams work together to deliver an experience and to cooperate to respond quickly to both problems and opportunities.

Our approach to Enterprise DevOps is anchored in Visual Studio 2012 and System Center 2012. The wave of Cloud OS announcements today integrates these with a bunch of new application lifecycle management capabilities. These include:

Global Service Monitor (GSM) – A new service offering in the System Center family that allows you to monitor and measure your application from points of presence around the world. GSM uses “Web Tests” you can develop with Visual Studio Ultimate Edition to replay scenarios against your app and measure availability and performance.

Lab Management & Windows 2012 – With the System Center 2012 SP1 release today and Visual Studio 2012 Update 1, you can use Lab Management with Windows 8 hosts to streamline your flow of code from development into the test environment – removing friction from your dev process.

Incident integration – With Operations Manager in System Center 2012 SP1 and TFS/VS 2012, we’ve further streamlined the workflow for managing production incidents. An ops engineer can easily escalate an incident to the dev team for investigation and collaboration from within Ops Manager. The dev can open the System Center diagnostic information inside the VS Intellitrace debugging experience to quickly get at the root cause.

These are just a few of the capabilities we are working on to streamline the modern build, measure, learn development cycle. This is one more installment in our continuing commitment to deliver new value regularly.

Everything you need from the VS/TFS side was included in the Update 1 we shipped at the end of November.

Michael Park and Mike Schutz: Cloud OS Announcement [speech transcripts, Jan. 15, 2013]

JOEL SIDER: Hi, everybody. Thanks for joining us. I’m Joel Sider. I’m a public relations manager here in Server & Tools. We’re going to go ahead and get started.
We’re here today to talk to you about several new Microsoft products and services, all of which demonstrate progress against Microsoft’s Cloud OS vision.
Corporate vice president of Server & Tools Marketing Michael Park is going to lead off with a brief overview of the Cloud OS, and then our general manager Mike Schutz here in Server & Tools will talk through the new offerings and how they fit into the Cloud OS story for customers and partners.
And we’re very fortunate to have with us Alan Bourassa, CIO of EmpireCLS, and Jess Coburn, CEO of hosting service provider Applied Innovations. You’ll hear from them about how they’re using Microsoft technologies in their businesses.
There will be an opportunity for Q&A, for question and answer, with the Lync messaging functionality at the end of the call.
Our press release, Michael’s blog post, and links to additional resources are all available on Microsoft.com/newscenter.
We do ask that you keep your phone muted until the end.
So I think with that, we can get started with Michael Park.
MICHAEL PARK: Thanks, Joel.
Good morning, everyone. I want to thank you again for joining us for this announcement today.
You likely have heard us talk about the Cloud OS over the past few months behind the Windows Server 2012 launch that we did in September, and I think it’s useful that we set the stage today to clarify and define what we’re seeking to accomplish with our strategy for commercial IT as a whole.
At Microsoft we’ve built a very successful franchise on the concept of an OS. At its core, an operating system manages hardware drivers and provides an application platform for PCs, and we’ve been very successful in that domain in the past.
We think that the OS will play a much more important role in the era of the cloud, moving beyond PCs to an OS that can manage the underlying infrastructure and application platform across public clouds, private clouds, and hosting service provider clouds.
Our vision is to create and provide customers with one consistent platform for infrastructure, applications and data, spanning customer datacenters, hosting service provider datacenters and the Microsoft public cloud.
And the Cloud OS is what drives our product strategy and road map to help our customers embrace the transformational trends in IT that we’re all familiar with, such as various cloud computing models, the consumerization of IT, the new generation of connected applications and big data.
So, more specifically, what does Cloud OS mean for IT organizations? IT organizations are going to be able to deliver and manage powerful modern applications that can help IT meet the demands of business in terms of speed, cost-effectiveness and flexibility of these applications.
They’ll be able to give employees personalized experiences with apps and data on virtually any device by tackling the consumerization of IT and bring your own devices with what we call people-centric IT.
Further, it means that IT will manage data, both structured and unstructured, in a way that can help end users unlock insights with greater ease and end-user adoption than ever before.
It also means that IT can shift to more efficiently managing their datacenter resources across not just the servers but across networking, storage and compute, all as a singular resource pool instead of the quagmire of complexity that many of them live in today.
And furthermore, this extends beyond just the private cloud, their own datacenter resources, but out to the partner and the public clouds with a modern hybrid IT infrastructure.
So Microsoft’s approach to this Cloud OS is unique, and it starts with our core platforms, Windows Server and Windows Azure, working together as a consistent platform across the three datacenters we’ve been talking about, with a consistent set of capabilities designed to make it easier for IT and developers to do their jobs.
You know, specifically, we’re talking about differentiation in five key areas. One is to deliver flexible application development tools and languages, not just .NET but also open source in this world of heterogeneous application development.
Second is a single user identity across clouds through Active Directory and Windows Azure Active Directory.
Third is our unified management with System Center for a single control plane to manage IT resources across private, hosted and public clouds.
The fourth area is integrated virtualization built into Windows Server to virtualize not just the servers but also the network, storage and applications with portability across the different clouds.
And last but not least, a data platform powered by SQL to power mission-critical business applications and give end users BI solutions with a wide range of tools ranging from Excel all the way out through Hadoop.
These consistent five capabilities are what make the Microsoft approach to the Cloud OS unique.
Our knowledge is deeply informed by two vantage points. No. 1 is that we’ve got firsthand knowledge running over 200 cloud services for a billion plus customers and over 20 million businesses around the world from our own datacenters around the world. And second, we’ve also learned a lot from our customers by running more than 75 percent of the world’s servers on-premises that support commercial IT infrastructure today.
We take all the learning from these services and servers into how we think about delivering the Cloud OS — Windows Server and Windows Azure and beyond to SQL Server, System Center and Visual Studio, all of which customers and partners can then use to deliver cloud infrastructure and services of their own.
For example, Windows Server 2012 delivers capabilities taken from our public cloud datacenters, things like cluster-aware updates for lower downtime, the use of industry standard storage for resilient failover and support for multitenant, high-density websites. It’s pretty cool stuff, and it’s stuff that we’re learning as we go and sharing that learning in the products we develop.
It’s a virtuous cycle of development and an important reason why customers can really bet on Microsoft in the cloud era. Our breadth of experience across private, public and hybrid cloud is unmatched, whereas other vendors are tending to specialize in one or another area.
JOEL SIDER: Thanks, Michael.
So with that, we’d like to go ahead and bring in Mike Schutz into the conversation. As I said, Mike is a general manager here in Server & Tools for product marketing. He’s going to talk about the new products and services and how they fit into the Cloud OS, including customer datacenters, the role hosting service providers play and also managing the consumerization of IT.
MIKE SCHUTZ: Thanks, Joel.
Welcome, everybody, and thank you for joining us.
Michael talked about one of the core tenets of the Cloud OS, which is helping customers on their own path to transform their datacenters.
By that we mean shifting from managing individual servers with CPUs, disk drives and network adapters to really managing and deploying their datacenter resources as a whole. It’s about managing storage, network and compute holistically as a singular cloud infrastructure, and even extending their datacenters to hosted and public clouds, all with the scale and availability that they need to build and deploy applications to respond to the needs of their unique businesses.
As you know, we delivered Windows Server 2012 and released it last September. Windows Server 2012 represents the foundation of these capabilities, delivering hundreds of new advancements and enhancements in virtualization, in storage, in networking, as well as automation.
Windows Server 2012, as Michael pointed out, is a prime example of how we’re bringing our public cloud lessons and investments to our products that our customers and partners deploy in their own datacenters.
Some examples of that are things that we learned around deploying multitenant services in our public cloud to deliver network virtualization, which represents the foundation for software-defined networking; updating a cluster of servers of up to 64 hosts and 8,000 virtual machines without any service downtime, using cluster-level updating; as well as leveraging innovations in storage to deliver high-scale, resilient storage to power the cloud infrastructure on industry standard hardware.
The customer response and industry response to Windows Server 2012 has just been incredible. For example, a recent Enterprise Strategy Group survey found that 90 percent, nine out of 10 customers, plan to deploy Windows Server 2012 in the next two years. We’re really excited by the response we’re hearing from our customers and partners around Windows Server 2012.
And now we’re announcing the release of System Center 2012 SP1, service pack one. This is an update to System Center 2012 that brings the full range of System Center management capabilities that our customers have grown to know and love to Windows Server 2012 for private and hybrid clouds.
Things like multitenancy, storage virtualization, network virtualization, which provides the foundation for software-defined networking, and all of the great capabilities that Windows Server delivers are now brought to bear with System Center 2012 SP1, including the support for non-Windows operating systems and multi-hypervisor environments, so customers can leverage their existing infrastructure investments and still take advantage of all of the new capabilities that have been delivered.
System Center is a true hybrid cloud management solution. It provides a single tool to manage cloud-based applications and resources, whether they’re running in our customers’ datacenter, a hosted service provider’s datacenter or in a Microsoft datacenter with Windows Azure.
Customers can use System Center to move virtual machines to Windows Azure and manage them within the System Center console that they use today.
They can also use System Center 2012 SP1 to backup servers to Windows Azure in a hybrid environment to protect against data loss and corruption.
Additionally, System Center 2012 SP1 includes support for a new Windows Azure-based service that we’re announcing today called Global Service Monitor or GSM. GSM is a companion service, if you will, to System Center 2012 SP1 that helps customers monitor and boost the performance of Web applications.
GSM extends the application monitoring capabilities that are already in System Center 2012 SP1 with the Operations Manager component, and it uses Windows Azure locations around the globe to give customers a true reflection of the experience end users will have with their Web applications.
GSM is now available for trial, and we’ll make that more broadly available in March.
JOEL SIDER: Thanks, Mike. So at this point, we’re going to bring in Alan Bourassa. He’s the CIO of EmpireCLS. He’s going to tell us a little bit about how he’s using Microsoft technologies, particularly Windows Server 2012 and System Center 2012 SP1, to really transform his company’s IT operations and the business overall. Welcome, Alan.
ALAN BOURASSA: Thank you. Thanks and good morning, everyone.
I’m Alan Bourassa, the CIO of EmpireCLS, and we are the second largest ground transport —
JOEL SIDER: Alan, I think we may have lost you for a minute. Are you there?
ALAN BOURASSA: Hello.
JOEL SIDER: Yes. I think we can hear you now, Alan. Go ahead.
ALAN BOURASSA: OK. I don’t know where I left off.
We’re the second largest ground transportation company in the world, and we operate in more than 650 cities worldwide. And we maintain three world-class datacenters worldwide. We actually just finished up doing the Golden Globes this past weekend.
Over the last several years, we have been transforming our business from just a luxury ground transportation company to a world-class hosting provider, targeting and specializing specifically in the ground transportation industry segment.
We now offer software as a service for our proprietary dispatch and reservation systems that we build and infrastructure as a service to those companies specializing in the ground transportation industry.
We took an existing software asset from our company and developed it and used it in a business model with our intellectual property and capital to make a public cloud offering and to build on that model for our cloud services to maximize our original software investment.
Today, more than 90 percent of our workloads are virtualized, and either run on our private or public cloud offering or on the Microsoft Azure platform.
We have also virtualized our entire mobile workforce, and internal desktops are all provisioned and operated with cloud services.
With this, we’ve reduced our desktop problem resolution cases by more than 75 percent; we’ve reduced our carbon footprint by more than 50 percent; we actually provide a carbon footprint to all of our customers to be as green as possible, which also helps us reduce our electrical and operating costs.
Empire has consolidated its datacenter footprint by more than 50 percent through HP server offerings, combined with Microsoft’s strong virtualized performance improvements and coupled with the announcement today of System Center 2012 SP1 management platform in this release today.
So how has Empire been able to do this while maintaining and lowering our costs?
The answer for us was clearly System Center 2012 and Windows Server 2012, and probably to use an overrated term, you know, the story is better together. We have been a Microsoft customer for the last several years, and before that we were a total UNIX shop. And now with the release today of System Center 2012 SP1, coupled with Windows Server 2012 features and functionality, we have now been able to transform our company from just a luxury ground transportation company to a world-class public cloud hosting provider.
We actually expect over the next several years that more than 50 percent of our total revenue is going to come from this new business transformation and venture.
Without Microsoft’s total cloud vision and solution, we’d never have been able to accomplish this. It has allowed us to expand our business model and become a public hosting provider in the cloud while maintaining and controlling our costs with the System Center 2012 SP1 suite of products. There are many products in this suite, and I’ll mention a few that have had a major impact on our business in relation to resiliency and our cloud offerings, and that have allowed us to expand our business and increase our revenues significantly.
So some of those examples: we’re using Virtual Machine Manager from the suite, in the Hyper-V 3 environment, to manage hundreds of hypervisor workloads through one pane of glass. You know, the saying now in our company is we don’t build servers anymore; we build hypervisors and we build private and public clouds. This translates into many hypervisors and many clouds seen as one. It’s all about simplicity of management: one view into many hypervisors and workloads. This keeps our costs down, and we’ve actually been able to add more than 550 transportation vendors over the last year to this platform, again without increasing costs.
We’re also using Data Protection Manager for all of our backup and recovery protection in this suite in both our private and public cloud and using Windows Azure backup services to augment and protect our data in the public Microsoft cloud offering.
And we’re using System Center Operations Manager to alert us to any issues in our private, public and Microsoft Azure cloud platform services before the users or customers see any changes even in the environment. This has allowed us to maintain an uptime of seven 9s, which until now has been unheard of in the industry.
We’re using System Center Service Manager for our user community and customers to submit tickets for new services, which then get provisioned automatically through System Center Orchestrator, and to report issues about services they need help with.
So clearly it’s a whole suite of products, and I’ve only touched on a few. You know, with Microsoft’s Hyper-V 3 and Windows Server 2012, we’ve moved even our communication platform of traditional PBX systems to Lync 2013, which now handles our voice, video and IM communications, all 100 percent virtualized and in the cloud. We really only dreamed of doing this before; now it’s a reality.
And with Microsoft System Center 2012 and Windows Server 2012 we’re also virtualizing workloads like SQL Server 2012 in a cloud virtualized environment. I personally would have never dreamed of virtualizing this type of workload before either. In fact, I was a staunch opponent for a long time of a SQL workload being virtualized. That dream has now come true with the new releases that you’re hearing about today, and the release recently of Windows Server 2012.
Now, as evidence to support this, we now have more than 550 customers and vendors on our combined private, public and Microsoft Azure cloud services offering, and it’s generating additional revenues for our company that were not possible before.
Microsoft has been able to make the cloud environment more robust while making the management of these complex environments simpler for the IT staff members to manage.
So you might ask at this point, you know, why did EmpireCLS pick Microsoft? In our evaluation of platforms we, of course, looked at many vendors. One that you will always compare against when you’re talking about hypervisors is VMware. In the final analysis, though, VMware was a hypervisor virtualization play only and didn’t have the depth and breadth of a total end-to-end cloud services solution that all companies need. Microsoft, from my point of view, is clearly the innovator in the total cloud solution, and clearly the thought leader and visionary for businesses that want and need total cloud services solutions.
Now, examples of this are that we are seamlessly moving virtualized workloads between our private, our public and Microsoft Azure cloud services offerings. It’s simple, and it’s through one pane of glass. It can’t get simpler than that.
Of course, the discussion would not be complete without mentioning some of the features that are enabling us to be more competitive and helping us transform our business model into a hosting provider offering. So I’ll get a little technical. We’re using some of the new features like software-defined networking, which we’re starting to embrace; extensible network software switches; Hyper-V Replica to maintain uptime and availability across datacenters; live storage and shared-nothing live virtual machine migration, previously unheard of; virtual Fibre Channel switches; scale-out file servers; and software network isolation for our tenants, just to mention a few of the hundreds of new features and functionalities introduced today.
So all in all, for EmpireCLS, Microsoft is, and in our view will continue to be, the thought leader and visionary in the complete cloud services space. EmpireCLS bet on Microsoft cloud technology end-to-end several years ago, and it was the best bet we ever made. We couldn’t be happier being a customer, and even more so being treated by them as a partner in helping us grow our business through their complete and strong cloud services offering, as they lead the cloud transformation that is going to continue to revolutionize and commoditize the cloud services industry.
I want to thank Microsoft and thank everyone for listening today. Thank you.
JOEL SIDER: Thanks so much, Alan.
So we’re going to take it back to Mike Schutz now. He’s going to talk about hosting service providers in the Cloud OS, building on what Alan talked about, and we have some news in that area as well.
MIKE SCHUTZ: Thanks, Joel.
It really is inspiring to be able to work with customers like Alan and EmpireCLS, and watch how some of the technologies and products that we work on can play some small part in the business transformation.
So speaking of business transformation, we’re seeing a huge shift to cloud computing, as well as how hosting service providers play a really key role in that.
And as I mentioned earlier, Windows Server 2012 and System Center 2012 enable hybrid IT across private, hosted, as well as public cloud.
Hosting service providers play a really key role in our Cloud OS strategy and help customers transform their datacenters. In that vein, they represent the really important third cloud that Michael outlined earlier with respect to the Cloud OS.
We already have an enormous ecosystem of hosting service providers that can participate in the Cloud OS. Over 14,000 use Windows Server today, over 8,500 use SQL Server, and over 5,500 use Exchange, and the list goes on.
So, for example, System Center 2012 SP1 delivers an API that we call Service Provider Foundation, which hosting providers can use to integrate with a customer’s on-premises datacenter solution and connect their service portals with the customer’s management console. This is a great capability for enterprise customers because it gives them datacenter elasticity: they can integrate a service provider’s cloud capacity and management directly into their datacenter operations, which provides them more infrastructure scale.
It’s also great for hosting service providers because of what they can offer to clients and how they can grow their businesses by acting as a seamless extension to a customer’s datacenter.
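[Editorial aside] To make the integration Mike describes concrete: Service Provider Foundation exposes an OData-style REST API over the hoster's management infrastructure, so a tenant-side tool can query capacity with ordinary HTTP requests. The sketch below is illustrative only; the endpoint URL, the `StampId` filter, and the response shape are assumptions for the example, not the documented SPF contract.

```python
import json
from urllib.parse import urlencode

# Hypothetical SPF endpoint and tenant ID -- real URLs and fields depend on
# the hosting provider's deployment.
SPF_BASE = "https://spf.example-hoster.com:8090/SC2012R2/VMM/Microsoft.Management.Odata.svc"

def vm_query_url(base, tenant_id, top=10):
    """Build an OData-style query URL for a tenant's virtual machines."""
    params = urlencode({"$filter": f"StampId eq guid'{tenant_id}'", "$top": top})
    return f"{base}/VirtualMachines?{params}"

# A sample OData-shaped JSON payload, invented for illustration.
sample_response = json.dumps({
    "d": {"results": [
        {"Name": "web-01", "Status": "Running"},
        {"Name": "db-01", "Status": "Stopped"},
    ]}
})

def running_vms(payload):
    """Pick out the names of running VMs from an OData-shaped response."""
    data = json.loads(payload)
    return [vm["Name"] for vm in data["d"]["results"] if vm["Status"] == "Running"]

print(vm_query_url(SPF_BASE, "11111111-2222-3333-4444-555555555555"))
print(running_vms(sample_response))  # ['web-01']
```

The point is only the shape of the interaction: management consoles on the tenant side issue the same kind of filtered REST queries against the provider's cloud as against their own datacenter.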
So, with our solution, the story gets even better for hosting partners today with what we call Windows Azure Services on Windows Server. We initially previewed these capabilities last summer in Toronto, Canada, at the Worldwide Partner Conference, and now they’re generally available.
These are high-scale websites and virtual machine hosting capabilities that we originally built for Windows Azure, and we’re now taking those and making them available to our hosting service providers to run on their own Windows Server 2012 and System Center infrastructure.
These are specifically designed for easy incorporation into a service provider’s existing services so that they can use them to differentiate their offerings and customize them without a lot of heavy lifting.
The Windows Azure services on Windows Server are a great example of the virtuous cycle of development that Michael spoke about earlier where we bring our learnings and investments in the public cloud and apply them to our products for customers and partners to deploy in their own datacenters and fulfill part of that three cloud vision that the Cloud OS is based on.
We’re really excited about the opportunities that we have together with our hosting service providers, and the announcements today provide a big step forward in that model.
JOEL SIDER: Great. Yeah, again that was Mike Schutz, general manager here in Server & Tools at Microsoft.
With that, let’s bring Jess in from Applied Innovations. Jess, tell us about Applied Innovations. Tell us how you’re using Windows Server and System Center and the new technologies that Mike talked about.
Jess, have we got you there?
Bear with us.
JESS COBURN: I guess I had myself muted. I apologize for that.
JOEL SIDER: Hey, no worries, appreciate it.
JESS COBURN: I am a technologist, believe it or not.
So Applied Innovations is a 14-year-old Web hosting company based down here in Boca Raton, Florida. Traditionally, we’ve catered to developers, designers and agencies, predominantly within the U.S. Today, we power more than 35,000 domains for 10,000 customers, and we operate over 2,500 server instances, with 90 percent of that running on top of Hyper-V.
Back in 2009, we were one of the first hosts globally to launch Hyper-V and System Center, and at that time we really opted for Hyper-V because of the economics.
Back in 2009, there were really only two choices for us: Hyper-V or VMware. Comparing the two, we went with Hyper-V because, one, of the overall cost, but more important than that was the fact that we could take our Windows Server admins, leverage their experience and expertise, and deliver this new offering with little pain to ourselves. And so that’s the direction we went.
Now, fast forward to today, and it’s four years later, and we’re still deployed on top of Hyper-V.
And there’s a wealth of different solutions out there for us from open source solutions like OpenStack all the way up to our friends at VMware, and we’re still focused on Hyper-V. And the reason for that is really that it’s this notion of one consistent platform and a Cloud OS, and that’s what’s kept us there, and we’re really excited to see that come forward.
So System Center 2012 SP1 and the new technology there are really going to help our business. You know, the cloud is redefining our business and our industry. In the past we were deploying these Web workloads, and that’s been a major focus of our business; now that we’ve started deploying cloud, we’re seeing what our customers host transition from Web to more enterprise-type workloads.
Last year, we acquired another company, and that company focused exclusively on building a channel business of partners that delivered managed services to small and medium businesses. And they’re one of those businesses that deployed these enterprise IT workloads that were traditionally on-prem.
In the last three months, we’ve seen that business grow by 25 percent, and I believe by delivering more services that are exposed in Hyper-V and System Center we’ll be able to grow that further.
So SP1 brings some of these new services to us, right? There’s the ability to leverage Hyper-V Replica and offer disaster recovery services. There’s network virtualization and allowing these customers to move workloads to and from the cloud seamlessly. And then there’s the service provider foundation in SP1 that allows our customers to manage all their services directly within System Center.
Now they’ll be able to leverage the same System Center tools they use today on-prem to manage their on-prem infrastructure and also manage their cloud hosted with us or with Azure from that one pane of glass. So, in one pane of glass, they’ll be able to manage all their infrastructure, and for our managed Web business that’s really key.
But there’s also our shared hosting business, and over the last 14 years shared hosting really hasn’t changed much. But when Microsoft announced Windows Azure Services for Windows Server, that was really a big change in shared hosting. You know, Microsoft has taken a lot of their learnings from Azure and made them available to their partners. And because of this new service, we’ll be able to stand up an elastically scalable cloud hosting environment that will give our shared hosting customers the same benefit of the cloud that previously was only available to our managed cloud customers, and for us that’s really exciting.
So I think it’s pretty safe to say that the cloud has really changed our industry and changed our business, and by Microsoft taking the approach that this isn’t just an OS but a Cloud OS and one consistent platform, they’re really changing it in a way that lets us leverage it and build successful solutions for our partners and our customers in a way that’s easy to understand and easy to deploy, and at the end of the day everybody wins.
JOEL SIDER: Jess, thanks so much.
So we’re going to go back to Mike now to cover the kind of last part of the news and story today.
MIKE SCHUTZ: Thanks, Joel.
And again, as the last part of the announcement, we really want to focus on some of the trends that Michael Park mentioned earlier around the undeniable growth in the number of connected devices, the consumerization of IT and the bring-your-own-device trends that we’re seeing in the market.
We all know this is a huge shift for IT organizations and brings with it a whole host of new challenges as end users bring new types of devices into the workplace and would like to work from those devices.
A core part of our Cloud OS vision is to help IT do what we talk about as a people-centric approach to these trends. What this means is we’d like to put the user first by delivering personalized experiences that give employees the productivity they need and have come to expect on the devices that they choose.
System Center 2012 SP1 plays a key role here of providing consistent management of not just the datacenter resources that I talked about previously but also PCs and devices that are used by employees across the company.
System Center 2012 SP1 and the new Windows Intune cloud-based service that’s now available provide a unified PC and mobile device management solution. It’s a comprehensive approach that lets IT use one management solution to give users access to corporate resources on the devices they choose, and therefore represents a win for both users and IT. This solution provides management and software distribution at enterprise scale of up to 100,000 devices.
Windows Intune is now offered in 87 countries around the world, representing the majority of the world’s population.
This combination of Windows Intune and System Center is really ideal for helping IT secure and manage the new generation of powerful Windows 8 PCs, Windows RT tablets, Windows Phone 8 smartphones, as well as all the diverse other platforms in today’s modern enterprise, including Android and iOS. So this is a really exciting announcement for us to help bring together the management of PCs and devices to help IT and end users at the same time.
JOEL SIDER: Great. Thanks again, Mike.
So that really brings an end to the overview of the presentation and news. We’ll open it up for Q&A. If you do have questions, you’re welcome to communicate that through the instant messaging functionality on Lync.
As I mentioned at the top of the call, there is a press release, along with a blog post by Michael Park about the Cloud OS, and links to a whole set of other information. All that is on Microsoft.com/newscenter. You can also email questions to our PR team at cloudosnewsbriefing@waggeneredstrom.com.
So we’re getting a question from Timothy Prigget-Morgan (ph) asking about the release notes for System Center SP1, as well as Windows Azure services for Windows Server.
The best place to start is Microsoft.com/systemcenter, which will take you to the more detailed documentation if you’re interested in that.
There’s also a more detailed blog post on what’s called the server cloud blog. If you go to the News Center site, you’ll find links to all of this.
Well, good. So I think we’ll go ahead and wrap up. Again, please let us know your questions, take a look at the information up on News Center, and with that we’ll go ahead and close today’s call. Really appreciate everyone tuning in.
END

New Windows Server 2012 R2 Innovations – Download Now [Windows Server Blog, Aug 6, 2013]

Windows Server 2012 R2 is in preview right now and ready for your evaluation. We have been rolling out detailed information on our Cloud OS vision through Brad Anderson’s What’s New in 2012 R2 blog series. That will continue, but we thought you would like a short consolidated list for consideration. Here are some key innovations in Windows Server 2012 R2.

Storage transformation – Delivers breakthrough performance at a fraction of the cost

  • The storage tiering feature of Storage Spaces in Windows Server 2012 R2 automatically tiers data across hard disks and solid state drives based on usage to dramatically increase storage performance and cost efficiency.
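To illustrate the idea behind tiering (not Microsoft's actual algorithm, which operates on sub-file extents with a scheduled optimization pass), here is a minimal heat-based placement sketch: count accesses per extent and keep the hottest extents on the limited SSD tier.

```python
# Illustrative sketch of heat-based storage tiering, all names invented:
# rank extents by access count and place the hottest on SSD, the rest on HDD.
from collections import Counter

def assign_tiers(access_counts, ssd_slots):
    """Place the ssd_slots most-accessed extents on SSD, everything else on HDD."""
    ranked = [extent for extent, _ in Counter(access_counts).most_common()]
    ssd = set(ranked[:ssd_slots])
    return {extent: ("SSD" if extent in ssd else "HDD") for extent in access_counts}

heat = {"ext0": 120, "ext1": 3, "ext2": 87, "ext3": 1}
print(assign_tiers(heat, ssd_slots=2))
# ext0 and ext2 (the two hottest) land on SSD; ext1 and ext3 stay on HDD
```

The real feature adds the parts this sketch omits: tracking heat continuously, moving data in the background, and letting administrators pin files to a tier.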

Software defined networking – Provides new levels of agility and flexibility

  • Network virtualization in Windows Server 2012 R2, along with the management capabilities in System Center 2012 R2 provides the flexibility to place any virtual machine on any node regardless of IP address with isolation.
  • New in-box gateway in Windows Server 2012 R2 extends virtual networks to provide full connectivity to physical networks as well as access to virtual networks over the internet.
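Conceptually, network virtualization achieves this by mapping each tenant's customer addresses (CAs) onto the provider addresses (PAs) of the hosts actually running the VMs, keyed by a virtual subnet ID so tenants can reuse overlapping IP ranges. A toy sketch of that lookup, with all addresses invented for illustration:

```python
# Conceptual sketch of network-virtualization address mapping: each tenant's
# virtual subnet maps a VM's customer address (CA) to the provider address
# (PA) of whichever host currently runs it. Moving a VM just updates the map;
# the CA the tenant sees never changes.
lookup = {
    # (virtual_subnet_id, customer_address) -> provider_address
    (5001, "10.0.0.4"): "192.168.10.21",
    (5001, "10.0.0.5"): "192.168.10.22",
    (6002, "10.0.0.4"): "192.168.10.23",  # same CA, different tenant: isolated
}

def route(vsid, ca):
    """Resolve a tenant-visible address to the physical host carrying it."""
    return lookup[(vsid, ca)]

def migrate(vsid, ca, new_pa):
    """Re-point a CA at a new host without touching the tenant's addressing."""
    lookup[(vsid, ca)] = new_pa

print(route(5001, "10.0.0.4"))   # 192.168.10.21
migrate(5001, "10.0.0.4", "192.168.10.29")
print(route(5001, "10.0.0.4"))   # 192.168.10.29
```

This is why "any virtual machine on any node regardless of IP address" works: placement only rewrites the CA-to-PA map, never the tenant's own IP plan, and the virtual subnet ID keeps tenants with identical address ranges isolated.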

Virtualization and live migration – Provides an integrated and high-performance virtualization platform

  • Cross-version live migration enables virtual machines running on Windows Server 2012 to be migrated to Windows Server 2012 R2 hosts with no downtime.
  • Live migration compression provides dramatic time savings (approximately 50% or greater) by using spare CPU cycles to compress live migration traffic with no special hardware.
  • Live migration with RDMA enables offloading of the process to the NICs (if they support RDMA) for even faster live migrations.
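The claimed compression savings are plausible because VM memory pages tend to be highly redundant (zeroed or repetitive regions). A rough, generic illustration with zlib, making no claim about the codec Hyper-V actually uses:

```python
# Rough illustration of why compressing live-migration traffic helps: VM
# memory pages are often highly redundant, so spending spare CPU cycles on
# compression shrinks the bytes sent over the wire considerably.
import zlib

page = (b"\x00" * 2048 + b"heap data " * 205)[:4096]  # one synthetic 4 KiB "memory page"
memory = page * 256                                    # 1 MiB of such pages

compressed = zlib.compress(memory, 1)  # fast compression level, as a live path would favor
ratio = len(compressed) / len(memory)
print(f"{len(memory)} -> {len(compressed)} bytes ({ratio:.1%} of original)")
assert ratio < 0.5  # redundant pages compress to well under half their size
```

Real VM memory is less uniform than this synthetic buffer, but the trade-off is the same: a cheap, fast compression level buys a large reduction in transfer time whenever spare CPU cycles are available.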

Access & Information Protection – Empowering your users to be productive while maintaining control and security of corporate information with Windows Server 2012 R2

  • Enable users to work on the device of their choice (through BYOD programs or on personal devices) by providing a simple registration process to make the devices known to IT and be taken into account as part of your conditional access policies
  • Deliver policy-based access control to corporate applications and data with consistent experiences across devices
  • Protect corporate information and mitigate risk by managing a single identity for each user across both on-premises and cloud-based applications and enabling multi-factor authentication for additional user validation
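The conditional-access idea in the bullets above can be reduced to a small policy function: device registration gates ordinary corporate access, and multi-factor authentication additionally gates sensitive applications. This is an illustrative simplification, not the actual Windows Server policy engine:

```python
# Simplified sketch of a conditional-access decision (illustrative only):
# registered devices get broader access, and high-sensitivity applications
# additionally require multi-factor authentication.
def allow_access(app_sensitivity, device_registered, mfa_passed):
    if app_sensitivity == "high":
        return device_registered and mfa_passed
    if app_sensitivity == "medium":
        return device_registered
    return True  # low-sensitivity apps: any authenticated user

print(allow_access("high", device_registered=True, mfa_passed=False))    # False
print(allow_access("medium", device_registered=True, mfa_passed=False))  # True
print(allow_access("low", device_registered=False, mfa_passed=False))    # True
```

The design point is that the user's identity and the device's registration state are evaluated together at access time, so BYOD devices can be admitted incrementally rather than all-or-nothing.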

Java application monitoring – Enables deep application insight into Java applications.

  • Provides performance and exception events, as well as alerting, within Operations Manager for Java applications.
  • Supports Tomcat, Java JDK, and other Java web services frameworks.
  • Line-of-code level traceability with performance and exception metrics for .NET and Java application monitoring for more actionable, tool-driven dev-ops collaboration

This is by no means a comprehensive list of new features and benefits; we just wanted to give you some information on the key focus areas. For those of you interested in downloading some of the products and trying them, here are some resources to help you:

Microsoft unleashes fall wave of enterprise cloud solutions [press release, Oct 7, 2013]

New Windows Server, System Center, Visual Studio, Windows Azure, Windows Intune, SQL Server, and Dynamics solutions will accelerate cloud benefits for customers.

Microsoft Corp. on Monday announced a wave of new enterprise products and services to help companies seize the opportunities of cloud computing and overcome today’s top IT challenges. Complementing Office 365 and other services, these new offerings deliver on Microsoft’s enterprise cloud strategy.

Satya Nadella, Cloud and Enterprise executive vice president, said, “As enterprises move to the cloud they are going to bet on vendors that have best-in-class software as a service applications, operate a global public cloud that supports a broad ecosystem of third party services, and deliver multi-cloud mobility through true hybrid solutions. If you look across the vendor landscape, you can see that only Microsoft is truly delivering in all of those areas.” More comments from Nadella can be found on The Official Microsoft Blog.

Hybrid infrastructure and modern applications

To help customers build IT infrastructure that delivers continuous services and applications across clouds, on Oct. 18 Microsoft will release Windows Server 2012 R2 and System Center 2012 R2. Together, these new products empower companies to create datacenters without boundaries using Hyper-V for high-scale virtualization; high-performance storage at dramatically lower costs; built-in, software-defined networking; and hybrid business continuity. The new Windows Azure Pack runs on top of Windows Server and System Center, enabling enterprises and service providers to deliver self-service infrastructure and platforms from their datacenters.

Building on these hybrid cloud platforms, customers can use Visual Studio 2013 and the new .NET 4.5.1, also available Oct. 18, to create modern applications for devices and services. As software development becomes pervasive within every company, the new Visual Studio 2013 Modern Lifecycle Management solution helps enable development teams, businesspeople and IT managers to build and deliver better applications, faster.

Enabling enterprise cloud adoption

Recognizing that most enterprises will take a hybrid approach to cloud, Microsoft wants to help customers utilize their investments in on-premises software solutions toward the adoption of cloud computing. On Nov. 1, Microsoft will offer Enterprise Agreement (EA) customers access to discounted Windows Azure prices, regardless of upfront commitment, without overuse penalties and with the flexibility of annual payments.

As another part of this effort to reduce cloud adoption barriers, Microsoft on Monday announced a strategic partnership with Equinix Inc. Building on recently announced partnerships with AT&T and others, this alliance will provide customers with even more options for private and fast connections to the cloud. Customers will be able to connect their networks with Windows Azure at Equinix exchange locations for greater throughput, availability and security features.

Governments are among the most demanding enterprise customers. To help U.S. federal, state and local government agencies realize the benefits of public cloud computing, Microsoft is introducing its Windows Azure US Government Cloud. This will offer U.S. government customers a dedicated community cloud for data, applications and infrastructure, hosted in the continental U.S. and managed by U.S. personnel. Windows Azure has been granted FedRAMP Joint Authorization Board Provisional Authority to Operate, making it the first public cloud of its kind to achieve this level of government authorization.

Data platform and insights

As part of its vision to help more people unlock actionable insights from big data, Microsoft next week will release a second preview of SQL Server 2014. The new version offers industry-leading in-memory technologies at no additional cost, giving customers 10 times to 30 times performance improvements without application rewrites or new hardware. SQL Server 2014 also works with Windows Azure to give customers built-in cloud backup and disaster recovery.

For big data analytics, later this month Microsoft will release Windows Azure HDInsight Service, an Apache Hadoop-based service that works with SQL Server and widely used business intelligence tools, such as Microsoft Excel and Power BI for Office 365. With Power BI, people can combine private and public data in the cloud for rich visualizations and fast insights.

People and devices in the cloud

The proliferation of cloud applications, data and consumer devices is moving many enterprises to a bring-your-own-device model. The new release of Windows Intune, also available Oct. 18, combines with System Center Configuration Manager to help IT departments give mobile employees security-enhanced access to the applications and data they need on the Windows, iOS and Android devices of their choice. This unified management environment for PCs and mobile devices complements the new access and information protection capabilities in Windows Server 2012 R2.

Further, with Windows Server 2012 R2 Microsoft is introducing the Microsoft Remote Desktop app, available for download in application stores later this month, to provide easy access to PCs and virtual desktops on a variety of devices and platforms, including Windows, Windows RT, iOS, OS X and Android.

Software as a service business solutions

The next major version of the company’s CRM solution, Microsoft Dynamics CRM Online Fall ’13 will be available later this month, helping make customer interactions more personal via contextual information for deeper insights than the previous version, delivered on a variety of devices.* The on-premises version is expected to be available later in the fall for deployment either in-house or hosted by a partner. More information is available here. In addition, Microsoft Dynamics NAV 2013 R2 is now available, offering small and midsize businesses interoperability with Office 365, full multitenant support, and a range of tools designed to support large-scale hosting of the application on Windows Azure.

More information on Monday’s announcements can be found at the Microsoft News Center.

Founded in 1975, Microsoft (Nasdaq “MSFT”) is the worldwide leader in software, services and solutions that help people and businesses realize their full potential.

* Devices include Windows 8 tablets and iPad tablets with Microsoft Dynamics CRM 2013; Windows Phone 8, iPhone and Android phone shortly following the release of Microsoft Dynamics CRM 2013.

Announcing the General Availability of Windows Server 2012 R2: The Heart of Cloud OS [Windows Server Blog, Oct 18, 2013]

For years now, Microsoft has been building and operating some of the largest cloud applications in the world. The expertise culled from these experiences along with our established history of delivering market-leading enterprise operating systems, platforms, and applications has led us to develop a new approach for the modern era: the Microsoft Cloud OS.

The Cloud OS vision combines Microsoft knowledge and experiences with today’s trends and technology innovations to deliver a modern platform of products and services that helps organizations transform their current server environment into a highly elastic, scalable, and reliable cloud infrastructure. Utilizing the software that powers the Cloud OS vision, organizations can quickly and flexibly build and manage modern applications across platforms, locations, and devices, unlock insights from volumes of existing and new data, and support end-user productivity wherever and on whatever device they choose.

At the heart of the Cloud OS is Windows Server 2012 R2. Delivering on the promise of a modern datacenter, modern applications, and people-centric IT, Windows Server 2012 R2 provides a best-in-class server experience that cost-effectively cloud-optimizes your business. When you optimize your business for the cloud with Windows Server 2012 R2, you take advantage of your existing skill sets and technology investments. You also gain all the Microsoft experience behind building and operating private and public clouds, right in the box. Delivered as an enterprise-class, simple, and cost-effective server and cloud platform, Windows Server 2012 R2 delivers significant value around seven key capabilities:


Server virtualization. Windows Server Hyper-V offers a scalable and feature-rich virtualization platform that helps organizations of all sizes realize considerable cost savings and operational efficiencies. With Windows Server 2012 R2, server virtualization with Hyper-V pulls ahead of the competition by offering industry-leading size and scale that makes it the platform of choice for running your mission critical workloads. Using Windows Server 2012 R2, you can take advantage of new hardware technology, while still utilizing the servers you already have. This functionality enables you to virtualize today and be ready for the future tomorrow.

Whether you are looking to expand virtual machine mobility, increase virtual machine availability, handle multi-tenant environments, gain bigger scale, or gain more flexibility, Windows Server 2012 R2 with Hyper-V gives you the platform and tools you need to increase business agility with confidence. Plus, you can also benefit from workload portability as you extend your on-premises datacenter into a service provider cloud or Windows Azure.

Storage. With the increase in new applications, the explosion of data, and growing end-user expectations for continuous services, there has come a significant increase in storage demands. Windows Server 2012 R2 offers a wide variety of storage features and capabilities to address the storage challenges faced by organizations. Whether you intend to use cost-effective, industry-standard hardware for the bulk of your workloads or Storage Area Networks for the most demanding ones, Windows Server 2012 R2 provides you with a rich set of features that can help you maximize the returns from all of your storage investments.

Microsoft designed Windows Server 2012 R2 with a strong focus on storage capabilities, including improvements in the provisioning, accessing, and managing of storage and the transfer of data across the network that resides on that storage. The end result is a storage solution that delivers the efficiency, performance, resiliency, availability, and versatility you need at every level.

Networking. New developments such as private- and public-cloud computing, mobile workforces, and widely dispersed assets have transformed the business landscape and altered how we manage networking and network assets. Still, the main goal remains the same: keep all networking components connected to ensure smooth data transmission and reliable access by users and customers to the services they need, when they need them.

Windows Server 2012 R2 makes it as straightforward to manage an entire network as a single server, giving you the reliability and scalability of multiple servers at a lower cost. Automatic rerouting around storage, server, and network failures enables file services to remain online with minimal noticeable downtime. In addition, Windows Server 2012 R2 provides the foundation for software-defined networking, out-of-the box, enabling seamless connectivity across public, private, and hybrid cloud implementations.

Whatever your organization’s needs, from administering network assets to managing an extensive private and public cloud network infrastructure, Windows Server 2012 R2 offers you solutions to today’s changing business landscape. These capabilities help reduce networking complexity while lowering costs, simplifying management tasks, and delivering services reliably and efficiently. With Windows Server 2012 R2 you can automate and consolidate networking processes and resources, more easily connect private clouds with public cloud services, and more easily connect users to IT resources and services across physical boundaries.

Server management and automation. Datacenter infrastructure has become more and more complex. Multiple industry standards are confusing hardware vendors. Customers are looking for guidance on how to best automate their datacenter while adopting a standards-based management approach supporting their multi-vendor investments. Windows Server 2012 R2 enables IT professionals to offer an integrated platform to automate and manage the increasing datacenter ecosystem. Features within Windows Server 2012 R2 enable you to manage many servers and the devices connecting them, whether they are physical or virtual, on-premises or in the cloud.

Web and application platform. Chances are your organization already uses or is planning to use a combination of on-premises and off-premises IT resources and tools for building a hybrid environment. To protect your existing investment in on-premises applications as you begin to migrate to the cloud, you need a scalable application and web platform that enables you to manage your applications and websites in a unified way.

Windows Server 2012 R2 builds on the tradition of the Windows Server family as a proven application platform, with thousands of applications already built and deployed and a community of millions of knowledgeable and skilled developers already in place. The capabilities included in Windows Server 2012 R2 offer your organization even greater application flexibility, helping you build and deploy applications either on-premises, in the cloud, or both at once, with hybrid solutions that can work in both environments.

As your organization plans for and moves to a hybrid or cloud-based environment, Windows Server 2012 R2 provides the tools you need to build, provision, and manage multi-tenant environments while still supporting your large enterprise or the many customers hosted within your service provider infrastructure.

Access and information protection. Information exists almost everywhere in your organization: on servers, laptops, desktops, removable devices, and in emails. Users need to be able to access this information from anywhere, share it where appropriate, and achieve maximum productivity with the assets they have. To further complicate matters, the move to cloud computing necessitates being able to secure enterprise applications that no longer live in your datacenter.

Microsoft assists you in supporting consumerization of IT and in retaining effective management, security, and compliance capabilities. The enterprise tools and technologies that Microsoft provides can help with key enterprise tasks such as identifying non-corporate devices, delivering applications and data to those devices with the best possible user experience, and establishing and enforcing policies on devices based on the end user’s role within the organization. Microsoft enterprise tools and technologies can help IT staff to maintain a high level of security across all device types, whether the devices are corporate or personal assets, and establish security measures that protect their organization’s systems, data, and network.

To address these information needs and challenges, organizations have to make fundamental shifts in how they approach identity and security. Windows Server 2012 R2 helps you accommodate these changes through exciting new remote access options, significant improvements to Active Directory and Active Directory Federation Services, policy-based information access and auditing with Dynamic Access Control, and new scenarios that help customers provide access to corporate resources for users on their own devices. With these new capabilities, you can better manage and protect data access, simplify deployment and management of your identity infrastructure, and provide more secure access to data from virtually anywhere, across both well-managed on-premises devices and new consumer-oriented form factors.

Virtual Desktop Infrastructure. Most IT departments currently face the challenge of enabling worker productivity on a growing number of mobile devices in the workplace. Virtual Desktop Infrastructure (VDI) helps you accommodate these new devices by enabling them to access a centralized instance of the Windows desktop in the datacenter. By virtualizing these desktop resources, you can alleviate device compatibility and security issues while still delivering a consistent, familiar experience that enhances end-user productivity. With Windows Server 2012 R2, Microsoft makes it easier and more cost-effective to deploy and deliver virtual desktop resources across workers’ devices.

VDI technologies in Windows Server 2012 R2 offer easy access to a rich, full-fidelity Windows environment running in the datacenter, from virtually any device. Through Hyper-V and Remote Desktop Services (RDS), Microsoft offers three flexible VDI deployment options in a single solution: Pooled Desktops, Personal Desktops, and Remote Desktop Sessions (formerly Terminal Services). With Windows Server 2012 R2, you get a complete VDI toolset for delivering flexible access to data and applications from virtually anywhere on popular devices, while also helping to maintain security and compliance.

To compete in the global economy and keep up with the pace of innovation, IT organizations must improve their agility, their efficiency, and their ability to better manage costs while enabling their business and end users to stay continuously productive.

Microsoft has gained expertise from years of building and operating some of the largest cloud applications in the world. We’ve combined this expertise with our experiences in delivering market-leading enterprise operating systems, platforms, and applications to develop a platform for infrastructure, applications, and data: the Cloud OS.

The Microsoft Cloud OS delivers a modern platform of products and services that helps enterprise IT teams transform their current environment to a highly elastic, scalable, and reliable infrastructure. With Cloud OS, organizations can quickly and flexibly build and manage modern applications across platforms, locations, and devices, unlock insights from volumes of existing and new data, and support user productivity wherever and on whatever device they choose. Microsoft uniquely delivers the Cloud OS as a consistent and comprehensive set of capabilities that span on-premises, service provider, and Windows Azure datacenters, enabling enterprises to improve scale, elasticity, and availability of IT services.

At the heart of Cloud OS is Windows Server 2012 R2, which delivers upon the promises of a modern datacenter, modern applications, and people-centric IT. Whether you are an enterprise building out your own private cloud environment or a service provider offering large-scale cloud services, Windows Server 2012 R2 offers an enterprise-class, simple and cost-effective solution that’s application-focused and user centric. With Windows Server 2012 R2, you can utilize the capacity of your datacenter, deliver best-in-class performance for your Microsoft workloads, and receive affordable, multi-node business continuity scenarios with high service uptime and at-scale disaster recovery.

We hope that you are as excited as we are to get started today!

Leading cloud service providers around the globe bet on Microsoft [press release, Dec 12, 2013]

Cloud OS Network partners provide customers with consistent cloud platform.

On Thursday, Microsoft Corp. introduced the Cloud OS Network, a worldwide consortium of more than 25 cloud service providers delivering services built on the Microsoft Cloud Platform: Windows Server with Hyper-V, System Center and the Windows Azure Pack. These organizations support Microsoft’s Cloud OS vision of a consistent platform that spans customer datacenters, Windows Azure and service provider clouds. Service providers in the Cloud OS Network offer Microsoft-validated, cloud-based infrastructure and application solutions designed to meet customer needs.

“This network of leading service providers will help our customers create datacenters without boundaries for apps, data and device management,” said Takeshi Numoto, Microsoft corporate vice president of Cloud & Enterprise Marketing. “That translates into greater diversity of solutions, more flexibility and lower operational costs for customers, allowing them to focus on their core business rather than managing datacenters.” More comments from Numoto can be found on the Official Microsoft Blog.

Hybrid benefits for customers

Every organization has different needs and different IT requirements for addressing those needs. With the Cloud OS Network, customers now have even more choice in deploying hybrid solutions on the Microsoft Cloud Platform — either in their datacenter, in Windows Azure or, now, through a network of service providers. Customers also benefit from uniquely tailored, fully managed services within their local market, as well as a high degree of technical consistency across environments, which prevents vendor lock-in and enables flexibility. As a result, customers can focus on increasing efficiencies, improving employee productivity and lowering operational costs. Customers interested in the Cloud OS Network and the services offered by these partners can find additional information here.

Cloud service provider opportunity

As cloud adoption accelerates, service providers are focused on delivering value-added services to meet customer demand for hybrid cloud solutions. By joining Microsoft in the Cloud OS Network, leading cloud service providers can quickly and cost-effectively develop new services, attract new customers and increase revenues. With the Microsoft Cloud Platform, service providers have access to the capabilities of and best practices from Windows Azure.

“CSC continues to expand our strategic partnership with Microsoft to increase value to our clients and bring next-gen solutions to market,” said Marc Silvester, vice president of Offerings Management at CSC. “CSC and Microsoft continue to partner on offerings that leverage the tremendous growth in apps, devices and data that are driven by the rise of cloud computing. As these technologies play an ever-increasing role in business, CSC and Microsoft are working together to drive more efficiency and value through Microsoft’s Cloud OS vision. CSC is proud to be part of Microsoft’s Cloud OS Network.”

Watch what service providers have to say about Microsoft’s new Cloud OS Network here.

Worldwide reach

Organizations within the Cloud OS Network cover more than 90 active markets around the world, serve more than 3 million customers every day and operate over 2.4 million servers in more than 425 datacenters.

image14

Partners in the Cloud OS Network include, among others, Alog, Aruba S.p.A., Capgemini, Capita IT Services, CGI, CSC, Dimension Data, DorukNet, Fujitsu Finland Oy., Fujitsu Ltd., iWeb, Lenovo, NTTX, Outsourcery, OVH.com, Revera, SingTel, Sogeti, TeleComputing, Tieto, Triple C Cloud Computing, T-Systems, VTC Digilink and Wortmann AG.

T-Systems offers enterprise customers services based on the Microsoft Cloud Platform [press release, Dec 12, 2013]

  • T-Systems joins Microsoft’s Cloud OS Network
  • Expansion of cloud portfolio in secure T-Systems data centers
  • Data privacy and compliance in accordance with German law

T-Systems and Microsoft Corporation are combining their expertise in cloud computing to enable T-Systems to offer hybrid cloud services. Building on their long-standing alliance, the two companies are now joining their efforts to provide hybrid cloud services to large customers in Germany and the European market. Deutsche Telekom (DT) already offers SME customers cloud-based Microsoft products, and this relationship extends the collaboration. Going forward, customers will be able to use Microsoft products via the DT subsidiary’s secure data centers.

A hybrid cloud approach – a combination of the private and public cloud – will allow customers to take advantage of cost-effective Microsoft infrastructure and move data from the public cloud seamlessly into the highly protected T-Systems private cloud, or vice versa. Customers can decide at any time which data they especially want to protect in a private cloud. T-Systems’ data centers in Germany are subject to the strict German regulations for data privacy and compliance. The combination of the different cloud offerings ensures that business customers get the highest levels of security and availability, as well as full flexibility.

Infrastructure, collaboration, big data

The Deutsche Telekom subsidiary brings to the relationship its years of expertise in operating data centers and IT services, as well as its competence in provisioning dynamic ICT services. T-Systems has also expanded its portfolio as a provider of cloud services and can therefore offer customers the services they need for their business. “By signing this agreement, we are raising the cooperation with Microsoft to a new level,” said Hagen Rickmann, Director of Sales at T-Systems. “Together we can offer our customers attractive products and the best of different computing worlds.” Under this agreement, the parties will focus their joint work in three areas: infrastructure, business collaboration, and big data.

“With the Cloud OS Platform, Microsoft offers a comprehensive technology across customer datacenters, service provider clouds and Windows Azure, based on Windows Server 2012 R2 with Hyper-V, System Center 2012 R2 and the Windows Azure Pack. This platform can enable the future of hybrid cloud scenarios. Especially with T-Systems, which is strong in the enterprise space, customers in Germany and Europe can take advantage of cloud services that adhere to German security and compliance laws,” explained Kai Göttmann, Director Server, Tools and Cloud at Microsoft.

About Deutsche Telekom
Deutsche Telekom is one of the world’s leading integrated telecommunications companies with 140 million mobile customers, over 31 million fixed-network lines and more than 17 million broadband lines (as of September 30, 2013). The Group provides products and services for the fixed network, mobile communications, the Internet and IPTV for consumers, and ICT solutions for business customers and corporate customers. Deutsche Telekom is present in around 50 countries and has 230,000 employees worldwide. The Group generated revenues of EUR 58.2 billion in the 2012 financial year – more than half of it outside Germany (as of December 31, 2012).
About T-Systems
Drawing on a global infrastructure of data centers and networks, T-Systems operates information and communication technology (ICT) systems for multinational corporations and public sector institutions. T-Systems provides integrated solutions for the networked future of business and society. The company’s some 52,700 employees combine industry expertise and ICT innovations to add significant value to customers’ core business all over the world. T-Systems generated revenue of around EUR 10 billion in the 2012 financial year.

ALOG adopts Microsoft Cloud OS [ALOG Notícias, Dec 13, 2013]

Alog Data Centers do Brasil, one of the main providers of Information Technology (IT) infrastructure in the country, announced a new partnership with Microsoft Corporation to offer full cloud computing solutions. Alog is the first datacenter services company in Brazil to join the Cloud OS Network, Microsoft’s new initiative formed exclusively of strategic cloud providers that deliver services built on Microsoft’s cloud platform (Windows Server 2012 R2 with Hyper-V, System Center 2012 R2 and the Windows Azure Pack). The main objective of the program is to give clients, in a consistent fashion and at enterprise level, the benefits of Microsoft’s hybrid cloud platform, with the flexibility of full services that meet each company’s specific core business.

Apart from accessing the most modern cloud platform, direct contact with Microsoft’s team of engineers will allow Alog to develop cloud computing solutions tailored to each client’s particularities. “Since we are going to work on development together with Microsoft, we will safely deliver an even more mature and robust offering, with the levels of service that our clients seek,” explains Rodrigo Guerrero, Alog’s commercial director. “The possibility of providing hybrid solutions that connect cloud environments to the datacenters clients already use lets them work more intelligently and realize the benefits in a complete and effective manner,” he adds.

“A cloud network operating system based on Microsoft’s Cloud OS vision allows Alog’s clients to accelerate, and add flexibility to, the implementation of solutions on hybrid platforms. It thus becomes possible to optimize the combination of existing infrastructure with capacity contracted from the service provider, adapting cloud environments to better meet the needs of each business sector,” explains André Echeverria, Microsoft’s general manager of the corporate cloud division.

The first “post-Ballmer” offering launched: with Power BI for Office 365 everyone can analyze, visualize and share data in the cloud

… and everything you could know about Satya Nadella’s solution strategy so far (from Microsoft’s Cloud & Enterprise organization):

  1. Power BI as the lead business solution and the Microsoft’s visionary Data Platform solution built for it
  2. Microsoft’s vision of the unified platform for modern businesses

Keep in mind as well: Susan Hauser [CVP, Enterprise and Partner Group, Microsoft] interviews Microsoft CEO Satya Nadella [Microsoft, Feb 4, 2014; published on the Microsoft YouTube channel, Feb 5, 2014]: “Satya Nadella is a strong advocate for customers and partners, and a proven leader with strong technical and engineering expertise. Nadella addressed customers and partners for the first time as CEO during a Customer and Partner Webcast event.”

[Contributor Profile: Susan Hauser, Corporate Vice President, Enterprise and Partner Group, Microsoft]

As a teaser, Q: [6:43] How do you think about consumer and business, and how do you see them benefiting each other?

A: You know, when we think about our product innovation, we don’t necessarily compartmentalize by consumer and business; we think about the user. In many of these cases, what needs to happen is seamless experiences. Those certainly have to have a strong notion of identity and security, so IT control, where it’s needed, still matters a lot, and that’s something that, again, we will uniquely bring to market. But it starts with the user. The user obviously is going to have a life at home and a life at work. So how do we bridge that as more and more of what they do is digitally mediated? I want to be able to connect with my friends and family. I also want to be able to participate in the social network at work, and I don’t want the two things to be confused, but I don’t want to pick three different tools for the one thing I want to do seamlessly across my work and life. That’s what we are centered on. When we think about what we are doing in communications, in productivity or in social communications, those are all the places where we really want to bridge the consumer and business markets, because that’s how we believe end users actually work. [8:01]

More information:
Satya Nadella’s (?the next Microsoft CEO?) next ten years’ vision of “digitizing everything”, the Microsoft opportunities and challenges he sees with that, and the case of Big Data [‘Experiencing the Cloud’, Dec 13, 2013] … as one of the crucial issues for that (in addition to the cloud, mobility and the Internet of Things), via the current tipping point as per Microsoft, and the upcoming revolution in that as per Intel … IMHO, it is exactly in Big Data that Microsoft’s innovations have come to a point at which its technology has the best chance of becoming dominant and subsequently defining the standard for the IT industry, resulting in “winner-take-all” economies of scale and scope. Whatever Intel adds to that in terms of “technologies for the next Big Data revolution” will only strengthen the innovative position Microsoft has already achieved. For this reason I will include the upcoming Intel innovations for Big Data here as well.
Microsoft reorg for delivering/supporting high-value experiences/activities [‘Experiencing the Cloud’, July 11, 2013]
Microsoft partners empowered with ‘cloud first’, high-value and next-gen experiences for big data, enterprise social, and mobility on wide variety of Windows devices and Windows Server + Windows Azure + Visual Studio as the platform [‘Experiencing the Cloud’, July 11, 2013]
Will, with disappearing old guard, Satya Nadella break up the Microsoft behemoth soon enough, if any? [‘Experiencing the Cloud’, Feb 5, 2014]
John W. Thompson, Chairman of the Board of Microsoft: the least recognized person in the radical two-men shakeup of the uppermost leadership [‘Experiencing the Cloud’, Feb 6, 2014]
Modern Applications: The People Story for Business [MSCloudOS YouTube channel, Feb 11, 2014]

We’ve positioned the animation to tell a story that will appeal to non-technical customers (i.e., business decision makers) and augment the product and technical stories we have developed. Think of it as an opening gambit for the kind of conversation we want to have with them. This is a “people story” about modern apps for business, aimed at taking the business angle and reinforcing our strong business-app story. Learn More: http://www.microsoft.com/en-us/server-cloud/cloud-os/modern-business-apps.aspx

– THE BIG PICTURE: Microsoft Cloud OS Overview [MSCloudOS YouTube channel, Jan 21, 2014]

Hello!

image

My name is Gavriella Schuster and I’m the general manager of the US Server and Cloud business. Today I’d like to talk to you about Microsoft’s vision of the unified platform for modern businesses and how what we call the Cloud OS can help you transform your business in a world demanding continuous, always-on services at broad scale, accessed by a multitude of devices.

image

You are at the center of one of the largest IT transformations this industry has ever seen. There is no question that big shifts are happening in IT today, driven by the trends of mobility and devices, applications, Big Data and the cloud.

The proliferation of devices and the integration of technology have changed the way people live and work, and have opened the door for a multitude of new applications designed to meet every need. These applications are social, they are mobile, and they need to be scalable, which means many will have a cloud back end.

These devices and applications produce a huge amount of data. In fact, the world’s data is doubling every two to three years, and more than ninety percent of the world’s data was created in just the last couple of years. These trends are forcing IT to answer new and different questions.

image

How can you enable a mobile workforce to work from anywhere, on any device? How can you evolve your applications to meet these new demands? How can you help the business make faster and better decisions? And how do you ensure your infrastructure can and will scale to meet the demand?

Microsoft’s answer is the Cloud OS. The Cloud OS is Microsoft’s hybrid cloud solution, comprising Windows Server, Windows Azure, System Center, Windows Intune and SQL Server. With shared planning, development, engineering and support across these technologies, we bring a comprehensive solution to support your business across a number of fronts, from infrastructure to data to applications and devices.

image

When it comes to mobility and devices, we empower people-centric IT. Our solutions enable you to deliver a consistent, great user experience from anywhere, no matter the device, along with the means to manage and protect it all.

Nearly every customer echoes the importance of enabling a bring-your-own-device environment as a direct driver of productivity.

Aston Martin, for instance, the luxury car manufacturer, was challenged with managing over 700 remote devices (laptops, desktops, smartphones) across 145 dealerships in 41 countries. With Windows Intune and System Center Configuration Manager, Aston Martin can now proactively manage these devices via a single cloud-based console before employee productivity is affected. And if an employee’s device is stolen, IT can remotely wipe that device to protect corporate data.

At the application level, we enable modern business applications, so that you can quickly extend your applications with new capabilities, deploy them to multiple devices wherever your applications live, and move them wherever you want.

In regard to data, it’s all about Big Data, small data and all data. The Cloud OS will help you unlock insights from any data, making it easier for everyone to access it and perform analytics with tools they already use, like SharePoint and Excel, on any data of any size, from anywhere.

We have democratized access to this data so that the many, not the few, can uncover insights to power your business.

And lastly, at the core of the Cloud OS, powering mobility, applications and data, is your infrastructure. Our goal is to help you transform your datacenter: to take you from managing each server individually to a single, well-managed, elastic and scalable environment that powers all your compute, networking and storage needs.

We call this concept a datacenter without boundaries: you get a consistent experience that takes you from the datacenter to the cloud and back if you wish, so that you have access to resources on demand and the ability to move workloads around with maximum flexibility. This gives you easy on, easy off, with no cloud lock-in.

image

What makes our Cloud OS vision different is this hybrid design at the core. You benefit from a common and consistent approach to development, management, identity, security, virtualization and data, spanning from on-premises to the cloud: your private cloud, a service provider cloud, and Windows Azure, Microsoft’s enterprise public cloud.

This is powerful for a number of reasons.

  • One, we deliver a flexible development environment in which developers can code and deploy anywhere, across Ruby, Java, PHP, Python or .NET. And you get complete workload mobility to move these applications across clouds.
  • Two, with System Center you get a single unified management solution to manage all your physical and virtual infrastructure resources across clouds from a single pane of glass.
  • Common identity is the third element of our consistent platform. With federated Active Directory and multi-factor authentication you get a common identity across clouds, so your employees can enjoy a seamless single sign-on experience.
  • Integrated virtualization is the fourth area. We go beyond traditional server virtualization, where only compute is virtualized, and extend it to other areas, like storage and networking, that are costly in your environment today.
  • Lastly, being able to have a complete data platform, where your data can reside anywhere across these three clouds, is a huge value proposition as well. You can tap into all that data wherever you need it, anytime.

So far I have shared the core benefits Microsoft can deliver with this hybrid cloud approach.


One question I hear frequently from customers is: “This is great. Can you tell me the best use cases to get started with Azure?”

Well, Azure can support a number of your infrastructure-as-a-service (IaaS) and platform-as-a-service (PaaS) needs. There are a few simple areas I encourage you to look at first.

image

Let’s start with storage.

With today’s enormous growth in data, everyone is looking for smarter, more cost-effective ways to manage and store it. Windows Azure provides scalable cloud storage and backup for any data, big or small. Azure is very cost-effective because you pay only for what you use, at a cost lower than many on-premises SAN or NAS solutions. Additionally, we offer a hybrid cloud storage option with our StorSimple appliance, which lets you access frequently used data locally while tiering less-used data to the cloud. Your data is deduplicated, compressed and encrypted, which means it is smaller and therefore more cost-effective to store and protect.
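As an aside, the space savings that deduplication plus compression can yield are easy to sketch with a toy Python example. The fixed-size chunking and the sample data below are purely illustrative; this is not how the StorSimple appliance actually works internally:

```python
import hashlib
import zlib

CHUNK = 4096  # illustrative fixed-size chunks; real appliances use smarter chunking


def dedup_and_compress(data):
    """Return (raw_size, stored_size) after naive chunk dedup + zlib compression."""
    seen = set()
    stored = 0
    for i in range(0, len(data), CHUNK):
        chunk = data[i:i + CHUNK]
        digest = hashlib.sha256(chunk).digest()
        if digest not in seen:  # store each unique chunk only once
            seen.add(digest)
            stored += len(zlib.compress(chunk))
    return len(data), stored


# A backup stream with many duplicate blocks, as backup data typically has
raw, stored = dedup_and_compress((b"A" * CHUNK + b"B" * CHUNK) * 50)
print(raw, stored)  # stored is a tiny fraction of raw
```

With only two unique chunks in 400 KB of input, almost nothing needs to be stored; real-world ratios are far more modest but follow the same mechanics.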

One customer example is Steelcase Corporation, an office furniture supplier. They backed up their SharePoint data with StorSimple on Azure, reducing their storage costs by 46 percent and their restore times by 87 percent.

Another area to consider for Azure is your development and testing environment. You can easily and quickly self-provision as many virtual machines as you need for application development and testing in the cloud, without waiting for hardware procurement or internal processes. We offer complete virtual machine mobility, so you can decide whether to deploy that application in production on Windows Azure, on-premises in your datacenter, or with a hosting provider. The choice is yours: deploy easily to whichever location with a few keystrokes.

And if you’re looking to upgrade to the latest version of SharePoint or SQL Server, Azure is a perfect option for testing in the cloud with no impact on your production environment. You can roll out on-premises or in the cloud when you are ready.

On the topic of SQL Server, backing up your on-premises SQL Server or Oracle databases is a must to help reduce your downtime and minimize data loss. With Azure you can create a low-cost SQL Server 2012 or 2014 database replica without having to manage a separate datacenter or use expensive co-location facilities, and you get geo-redundancy and encryption.

Backing up your database to Windows Azure Storage can save you up to 60 percent compared to on-premises SAN or tape solutions, thanks to our compression technology.
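The reason compression pays off so well for database backups is that dumps are full of repeated structure. A minimal sketch with Python's `zlib` (the synthetic dump below is illustrative; actual savings depend entirely on your data):

```python
import zlib

# A synthetic, highly repetitive "database dump"; real savings vary with the data.
dump = b"INSERT INTO orders VALUES (1, 'widget', 9.99);\n" * 10000

compressed = zlib.compress(dump, 9)  # level 9 = maximum compression
saved_pct = 100 * (1 - len(compressed) / len(dump))
print("%.0f%% smaller after compression" % saved_pct)
```

Repetitive SQL text like this compresses dramatically; less regular data (or already-compressed blobs) will see far smaller gains.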

And our last scenario for you to consider is identity. Managing identity across both public cloud and on-premises applications gives you the security you want and a great user experience. With Windows Azure Active Directory you can create new identities in the cloud or connect to an existing on-premises Active Directory to federate and manage access to your cloud applications. More importantly, you can synchronize on-premises identities with Windows Azure Active Directory and enable single sign-on for your users to access your cloud applications.
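The core idea behind federated single sign-on can be illustrated with a toy sketch: one identity provider signs a claim, and every application that trusts the provider can verify it locally. This is NOT the actual Azure AD protocol (which uses standards such as SAML, WS-Federation and OAuth); the shared key and names below are hypothetical:

```python
import base64
import hashlib
import hmac
import json

SECRET = b"shared-identity-provider-key"  # hypothetical key the apps trust


def issue_token(user):
    """The identity provider authenticates the user once and signs a claim."""
    claim = base64.urlsafe_b64encode(
        json.dumps({"sub": user, "idp": "contoso-ad"}).encode()
    ).decode()
    sig = hmac.new(SECRET, claim.encode(), hashlib.sha256).hexdigest()
    return claim + "." + sig


def verify_token(token):
    """Any application trusting the provider verifies the signature locally."""
    claim, sig = token.rsplit(".", 1)
    expected = hmac.new(SECRET, claim.encode(), hashlib.sha256).hexdigest()
    if hmac.compare_digest(sig, expected):
        return json.loads(base64.urlsafe_b64decode(claim))["sub"]
    return None  # tampered or untrusted token


token = issue_token("alice@contoso.com")
print(verify_token(token))  # the same token is accepted by every federated app
```

The user signs in once with the identity provider; each application only needs the trust relationship, not the user's password, which is the essence of the single sign-on experience described above.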


I hope I have provided you with a good overview of Microsoft’s hybrid cloud approach with the Cloud OS.

We deliver global services at scale—like Bing, Skype and Xbox—from our own data centers, so you can trust that our solutions are battle-tested to meet the needs of your business.

And it’s not just battle-tested by us, but also by our customers. You heard a number of examples today of enterprises and organizations already benefiting from the Cloud OS vision. There are many, many more. This is a look at a small sampling.

image

We’re excited to see how each of you will transform IT and your businesses by taking advantage of our investments and solutions that are bringing the Cloud OS to life. So whether you’re testing the cloud for the first time or going all in, we have the platform and tools to help you every step of the way. Windows Azure and Windows Server support hybrid IT scenarios, so you can flex to the cloud when you want while still using your existing IT assets.

image

To get started today, visit our Microsoft Cloud OS home page [Jan 20, 2014] to learn more and try out our solutions.

Thank you for joining me.

Descriptors/tags:

Power BI as the lead business solution, Microsoft’s visionary Data Platform solution, unified platform for modern businesses, Microsoft Cloud OS, Cloud OS, mobility, apps, Big Data, cloud, Microsoft hybrid cloud solution, Windows Server, Windows Azure, System Center, Windows Intune, SQL Server, datacenter without boundaries, hybrid design, Microsoft Cloud OS vision, flexible development, unified management, common identity, integrated virtualization, complete data platform, storage, SharePoint, SQL, identity, self-service business intelligence solution, self-service analytics, self-service BI, analysis, visualization, collaboration, business intelligence models, Power BI for Office 365, Office 365, insights from data, data insights, Data Management Gateway, Power BI Sites, Power BI Mobile App, Mobile BI, natural language query, Q&A of Power BI, Microsoft Cloud & Enterprise Group, Microsoft’s Data Platform Vision, Power BI Jumpstart, autonomous marketing, Aston Martin, Microsoft’s Cloud OS home on YouTube, mobile device management, cloud computing, innovation, hybrid cloud, midmarket, datacenter modernization, consumerization of IT, hybrid cloud strategy, Business Intelligence, innovations, Microsoft Excel, Q&A, high-value activities, high-value experiences, high-value focus, Microsoft, Microsoft strategy, value focus, Active Directory, application development, Azure AD, Cloud first, cloud infrastructure, cloud solutions, enterprise opportunities, PaaS, IaaS, Windows devices, Windows Phone


1. Power BI as the lead business solution and Microsoft’s visionary Data Platform solution built for it

image
A self-service business intelligence solution enables all kinds of business users to find relevant information, pull data from Windows Azure and other sources, and prepare business intelligence models for analysis, visualization and collaboration.
image
February 10: the top message on the Microsoft News Center 

Although it just links to this blog entry (no press release or anything like a big splash):
Power BI for Office 365 empowers everyone to analyze, visualize and share data in the cloud [The Official Microsoft Blog, Feb 10, 2014]

The following post is from Quentin Clark, Corporate Vice President, Data Platform Group.


On Monday we announced that Power BI for Office 365 – our self-service business intelligence solution designed for everyone – is generally available. Power BI empowers all kinds of business users to find relevant information, pull data from Windows Azure and other sources, and prepare compelling business intelligence models for analysis, visualization, and collaboration. 

Modernizing business intelligence

Today business intelligence is used by only a fraction of the people who could derive value from it. What we all need is modernized business intelligence that helps everyone get the information they need to understand their job or personal life better. Not just the type of information gained from an Internet search, but also information from expert sources. Now imagine you could bring together these different information sources, discover relationships between facets of information, create new insights and understand your world better. And that you could get others to see what you see, and enable them to collaborate and build on one another’s ideas. And imagine all of that available on any scale of data and any kinds of computation you might need. Now imagine it’s not just you – but that anyone can access this kind of data-driven discovery and learning.

Power BI brings together many key aspects of the modernization of business intelligence: a public and corporate catalog of data sets and BI models, a way to search for data, a modern app and a Web-first experience, rich interactive visualizations, collaboration capabilities, tools for IT to govern data and models, and a groundbreaking natural language experience for exploring insights. Together, these capabilities will not just change the kinds of insights we can gain from data, but change the reach of those insights as well.

Bringing big data to a billion users

With Power BI, we have the opportunity to bring these types of data insights to a billion people. Office 365 is broadly adopted and growing – one in four of our enterprise customers now has Office 365. By making our business intelligence features part of Office, we ensure the tools are accessible, and through Office 365, we make the tools easy to adopt – not just the ease of using Web applications, but making things like collaboration, security, data discovery and exploration integrated and turnkey. 

I talked earlier about the importance of reach, and one of the ultimate forms of reach we discovered over the course of developing Power BI has been a feature we named Q&A, which allows anyone to type in search terms – just as they would in Bing – and get instantaneous, visual results in the form of interactive charts or graphs.
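The idea behind Q&A, mapping a typed question onto an aggregation over a data model, can be sketched in a few lines. This toy is only a conceptual illustration of keyword-driven querying; it bears no resemblance to how Power BI Q&A is actually implemented:

```python
# Toy illustration of the *concept* behind a natural language query feature:
# recognize keywords in a question and map them onto aggregations over a table.
# Data and matching rules are invented; this is not how Power BI Q&A works.

sales = [
    {"region": "East", "units": 120},
    {"region": "West", "units": 80},
    {"region": "East", "units": 50},
]

def answer(question: str):
    q = question.lower()
    # Filter rows whose region is mentioned in the question (else use all rows).
    rows = [r for r in sales if r["region"].lower() in q] or sales
    values = [r["units"] for r in rows]
    # Pick an aggregation based on recognized keywords.
    if "total" in q or "sum" in q:
        return sum(values)
    if "average" in q or "mean" in q:
        return sum(values) / len(values)
    if "count" in q:
        return len(rows)
    return values

print(answer("total units in East"))  # 170
print(answer("count of sales"))       # 3
```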

Power BI for Office 365 Overview [MSCloudOS YouTube channel, Jan 22, 2014]

Power BI for Office 365: Self-service analytics for all your data. Learn how Power BI can help you discover, analyze and visualize your data while it empowers you to share your insights and collaborate with your colleagues. Ask questions with Q&A, schedule refreshes from on-prem or cloud data sources and access your reports anytime, anywhere. Try Power BI: http://www.microsoft.com/en-us/powerbi/default.aspx#fbid=lVtiyE9CkuC

Realizing value from data

I personally know how significant this all is – as you can imagine, at Microsoft we run our business on our own data platform and on Power BI. In my role as head of our data platform group, I don’t create a lot of models, but I consume a lot of them – everything from the business financials of the SQL Server business and team management to our engineering and services datasets. My mobile business intelligence application for Windows 8 allows me to interact with our daily engineering data. The ability to visualize and interact with data on my large Perceptive Pixel (PPI) screen allows me and my finance and marketing partners to meet in my office and have a deep conversation about the business. Collaboration through Office 365 and SharePoint Online allows me to share perspective with my peers around the company.

Power BI for Office 365 has empowered me to realize deeper value from data. I’m excited to share this power with everyone.

Get Insights from Data [MSCloudOS YouTube channel, Jan 24, 2014]

One-minute video clip explaining the value of Power BI along with Office 365, focusing on how it addresses business’ pain points (once you have your data, how you get insights from it).

Big insights from big data at the World Economic Forum 2014 [Next at Microsoft Blog, Jan 22, 2014]

I’m at the World Economic Forum in Davos this week – where the world’s leaders, thought leaders and innovators gather to discuss the political, social and economic forces that are transforming the world and our lives. The other force that the World Economic Forum calls out in its program (above all else) is technology.

WEF 2014 education data with Power BI for Office 365 [Microsoft YouTube channel, Jan 21, 2014]

Education data from the World Economic Forum Global Competitive Index — visualized using Power BI for Office 365

Microsoft’s Vision Center sits directly across from the congress hall where all of these forces are being discussed, and inside the center we’re showing how our technologies are helping turn data into insight. As part of their work, the World Economic Forum produces a large volume of data and indices covering 148 countries. When I saw this data set in an Excel spreadsheet I knew it was ripe for transformation using Power BI for Office 365. As you can see in the video above, we’ve taken all of that data and are helping to deliver insight from it using Power View, Power Map and our Q&A technology. When you see health data below over a time period, mapped country by country, it really brings the data alive. When you can compare educational data across regions, countries and by type of education, once again the data comes alive. The real treat for me has been using Q&A to ask questions of the data, much as you would ask questions of a data scientist.

WEF 2014 healthcare data with Power BI for Office 365 [Microsoft YouTube channel, Jan 21, 2014]

Healthcare data from the World Economic Forum Global Competitive Index — visualized using Power BI for Office 365

If you’ve not had a chance to see Power BI in action, I’d encourage you to take up a trial of Office 365 and download the Power BI tools from PowerBI.com – it puts data-driven decision making in the hands of anyone, and I believe it will help deliver insights that answer some of the big questions at Davos this week and in the future.

Source: World Economic Forum, Global Competitiveness Report series (various editions)

Find and Combine Data [MSCloudOS YouTube channel, Jan 24, 2014]

One-minute video clip explaining the value of Power BI along with Office 365, focusing on how it addresses business’ pain points (finding and combining data within the SMB).

Microsoft Releases Power BI for Office 365 [C&E News Bytes Blog*, Feb 10, 2014]

Today, Microsoft announced the general availability of Power BI for Office 365, a cloud-based business intelligence service that gives people a powerful new way to work with data in the tools they use every day, Excel and Office 365. Power BI for Office 365 brings together Microsoft’s strengths in cloud, productivity and business intelligence to enable people to easily analyze and visualize data in Excel, discover valuable insights, and share and collaborate on those insights from anywhere with Office 365.

Power BI for Office 365 with Excel allows business users to easily create reports and discover insights in Excel and share and collaborate on those insights in Office 365. Excel includes powerful data modeling and visualization capabilities that enable customers to easily discover, access, and combine their data. Customers also have the ability to create rich 3D geospatial visualizations in Excel.

With Office 365, customers have access to cloud-based capabilities to share visualizations and reports with their colleagues in real time and on mobile devices, interact with their data in new ways to gain faster insights and manage their work more effectively. These key cloud-based capabilities include:

  • A Data Management Gateway, which enables IT to build connections to on-premises data sources and schedule refreshes. Business users always have the most up-to-date reports, whether on their desktop or on their device.
    [From the preview in Oct’13 here:] Through the Data Management Gateway, IT can enable on-premises data access for all reports published into Power BI so that users have the latest data. IT can also enable enterprise data search across their organization, making it easier for users to discover the data they need. The system also monitors data usage across the organization, providing IT with the information they need to understand and manage the system overall.
  • [Power] BI Sites, dedicated workspaces optimized for BI projects, which allow business users to quickly find and share data and reports with colleagues and collaborate over BI results.
    [From the preview in Oct’13 here:] Power BI for Office 365 enables users to quickly create Power BI Sites, BI workspaces for users to share and view larger workbooks of up to 250MB, refresh report data, maintain data views for others and track who is accessing them, and easily find the answers they need with natural language query. Users can also stay connected to their reports in Office 365 from any device with HTML5 support for Power View reports and through a new Power BI mobile app for Windows.
  • Real-time access to BI Sites and data no matter where a user is located via mobile devices. Customers can access their data through the browser in HTML5 or through a touch-optimized mobile application, available in the Windows Store.
    [From the preview in Oct’13 here:] The Power BI Mobile App is a new visualization app for Office, available in the Windows Store, that helps visualize graphs and data residing in an Excel workbook. The user is able to navigate through the data with multiple views and the ability to zoom in and out at different levels. This app was first available for Windows 8, Windows RT, and Surface devices through the Windows Store, specifically for those customers using the Power BI for Office 365 Preview. It provides touch-optimized access to BI reports and models stored in Office 365.
    Power BI App for Windows 8 and Windows RT now available in Store [“Welcome to the US SMB&D TS2 Team Blog”, Aug 21, 2013]
    Microsoft mobile app helps citizens report crimes more quickly to police in Delhi, India [The Fire Hose Blog, Jan 29, 2014]
  • A natural language query experience called Q&A, which allows users to ask questions of their data and receive immediate answers in the form of an interactive table, chart or graph.

Power BI for Office 365 provides an easy on-ramp for organizations that have bet on Office 365 to begin doing self-service BI today. Several customers have already started realizing the benefits of the service, including Revlon, MediaCom, Carnegie Mellon University and Trek.

For more information, read Quentin Clark, Corporate Vice President of the Data Platform Group’s, post [here you’ve already seen/read above] on the Official Microsoft Blog. Customers can find out more about how to purchase Power BI for Office 365 at powerbi.com.

[*About C&E News Bytes Blog: Here you will find a quick synopsis of all news from Microsoft’s Cloud & Enterprise organization as it is released with links to additional information.]

Share Data Insights [MSCloudOS YouTube channel, Jan 24, 2014]

One-minute video clip explaining the value of Power BI along with Office 365, focusing on how it addresses business’ pain points (once you get your data insights, how you can share it within your SMB and use the data to its fullest potential).

Broncos Road to the Big Game [MSCloudOS YouTube channel, Jan 31, 2014]

Power Map tour of the 2013 Broncos season and their road to the Super Bowl XLVIII.

Seahawks Road to the Big Game [MSCloudOS YouTube channel, Jan 31, 2014]

Power Map tour of the 2013 Seahawks season and their road to the Super Bowl XLVIII

What Drives Microsoft’s Data Platform Vision? [SQL Server Blog, Jan 29, 2014]

FEATURED POST BY:   Quentin Clark, Corporate Vice President, The Data Platform Group, Microsoft Corporation

image
If you follow Microsoft’s data platform work, you have probably observed some changes over the last year or so in our product approach and in how we talk about our products. After the delivery of Microsoft SQL Server 2012 and Office 2013, we ramped up our energy and sharpened our focus on the opportunities of cloud computing. These opportunities stem from technical innovation, the nature of cloud computing, and from an understanding of our customers.

In my role at Microsoft, I lead the team that is responsible for the engineering direction of our data platform technologies.  These technologies help our customers derive important insights from their data and make critical business decisions.  I meet with customers regularly to talk about their businesses and about what’s possible with modern data-intensive applications.  Here and in later posts, I will share some key points from those discussions to provide you with insight into our data platform approach, roadmap, and key technology releases.

Microsoft has made significant investments on the opportunities of cloud computing.  In today’s IT landscape, it’s clear that the enterprise platform business is shifting to embrace the benefits of cloud computing—accessibility to scale, increased agility, diversity of data, lowered TCO and more. This shift will be as significant as the move from the mainframe/mini era to the microprocessor era.  And, due to this shift, the shape and role of data in the enterprise will change as applications evolve to new environments.

Today’s economy is built on the data platform that emerged with the microprocessor era—effectively, transactional SQL databases, relational data warehousing and operational BI.  An entire cycle of business growth was led by the emergence of patterns around Systems of Record, everything from ERP applications to Point of Sale systems.  The shift to cloud computing is bringing with it a new set of application patterns, which I sometimes refer to as Systems of Observation (SoO).  There are several forms of these new application patterns: the Internet of Things (IoT), generally; solutions being built around application and customer analytics; and, consumer personalization scenarios.  And, we are just beginning this journey! 

These new application patterns stem from the power of cloud computing—nearly infinite scale, more powerful data analytics and machine learning, new techniques on more kinds of data, a whole host of new information that impacts modern business, and ubiquitous infrastructure that allows the flow of information like never before.  What is being done today by a small number of large-scale Internet companies to harness the power of available information will become possible to apply to any business problem. 

To provide a framework for how we think applications and the information they generate or manage will change—and how that might affect those of us who develop and use those applications—consider these characteristics:

Data types are diverse.  Applications will generate, consume and manipulate data in many forms: transactional records, structured streamed data, truly unstructured data, etc.  Examples include the rise of JSON, the embracing of Hadoop by enterprises, and the new kinds of information generated by a wide variety of newly connected devices (IoT).

Relevant data is not just from inside the enterprise.  Cross-enterprise data, data from other industries and institutions, and information from the Web are all starting to factor into how businesses and the economy function in a big way.  Consider the small business loan extension that accounts for package shipping information as a criteria; or, companies that now embrace the use of social media signals.

Analytics usage is broadening.  Customer behavior, application telemetry, and business trends are just a few examples of the kinds of data that are being analyzed differently than before.  Deep analytics and automated techniques, like machine learning, are being used more often. And, modern architectures (cloud-scale, in-memory) are enabling new value in real-time, highly-interactive data analysis.

Data by-products are being turned into value.  Data that were once considered by-products of a core business are now valuable across (and outside of) the industries that generate this data; for example, consider the expanding uses of search term data.  Perhaps uniquely, Microsoft has very promising data sets that could impact many different businesses.

With these characteristics in mind, our vision is to provide a great platform and solutions for our customers to realize the new value of information and to empower new experiences with data.  This platform needs to span across the cloud and the enterprise – where so much key information and business processes exist.  We want to deliver Big Data solutions to the masses through the power of SQL Server and related products, Windows Azure data services, and the BI capabilities of Microsoft Office. To do this, we are taking steps to ensure our data platform meets the demands of today’s modern business.

Modern Transaction Processing—The data services that modern applications need are broader now than traditional RDBMS.  Yes, this too needs to become a cloud asset, and our investments in Windows Azure SQL Database reflect that effort.  We recognize that other forms of data storage are essential, including Windows Azure Storage and Tables, and we need to think about new capabilities as we develop applications in cloud-first patterns.  These cloud platform services need to be low friction, easy to incorporate, and operate seamlessly at scale—and have built-in fundamental features like high availability and regulatory compliance.  We also need to incorporate technical shifts like large memory and high-speed low latency networking—in our on-premises and cloud products. 

Modern Data Warehousing—Hadoop brought flexibility to what is typically done with data warehousing: storing and performing operational and ad-hoc analysis across large datasets.  Traditional data warehousing products are scaling up, and the worlds of Hadoop and relational data models are coming together.  Importantly, enterprise data needs broad availability so that business can find and leverage information from everywhere and for every purpose—and this data will live both in the cloud and in the enterprise datacenter.  We are hearing about customers who now compose meaningful insights from data across Windows Azure SQL Database and Windows Azure Storage processed with Windows Azure HDInsight, our Hadoop-based big data solution. Customers are leveraging the same pattern of relational + Hadoop in our Parallel Data Warehouse appliance product in the enterprise. 

Modern Business Intelligence—Making sense of data signals to gain strategic insight for business will become commonplace.  Information will be more discoverable; not just raw datasets, but those facets of the data that can be most relevant—and the kinds of analytics, including machine learning, that can be applied—will be more readily available.  Power BI for Office 365, our new BI solution, enables balance between self-service BI and IT operations—which is a key accelerant for adoption. With Power BI for Office 365, data from Windows Azure, Office, and on-premises data sources comes together in modern, accessible BI experiences. 

Over the coming months, we are going to publish regular posts to encourage discussions about data and insights and the world of modernized data. We will talk more about the trends, the patterns, the technology, and our products, and we’ll explore together how the new world of data is taking shape. I hope you will engage in this conversation with us; tell us what you think; tell us whether you agree with the trends we think we see—and with the implications of those trends for the modern data platform.

If you’d like more information about our data platform technologies, visit www.microsoft.com/bigdata and follow @SQLServer on Twitter for the latest updates.

Getting Trained on Microsoft’s Expanding Data Platform [SQL Server Blog, Feb 6, 2014] 

With data volumes exploding, having the right technology to find insights from your data is critical to long-term success.  Leading organizations are adjusting their strategies to focus on data management and analytics, and we are seeing a consistent increase in organizations adopting the Microsoft data platform to address their growing needs around data.  The trend is clear: CIOs named business intelligence (BI) and analytics their top technology priority in 2012, and again in 2013. Gartner expects this focus to continue during 2014.2

At Microsoft, we have great momentum in the data platform space and we are proud to be recognized by analysts: IDC reports that Microsoft SQL Server continues to be the unit leader and became the #2 database vendor by revenue.1 Microsoft was named a leader in both the Enterprise Data Warehouse and Business Intelligence Waves by Forrester,3,4 and is named a leader in the Operational Database Management Systems Magic Quadrant.5

The market is growing and Microsoft has great momentum in this space, so this is a great time to dig in and learn more about the technology that makes up our data platform through these great new courses in the Microsoft Virtual Academy.

Microsoft’s data platform products

Quentin Clark recently outlined our data platform vision [here you’ve already seen/read above]. This calendar year we will be delivering an unprecedented lineup of new and updated products and services:

  • SQL Server 2014 delivers mission critical analytics and performance by bringing to market new in-memory capabilities built into the core database, speeding up OLTP workloads by 10x (and up to 30x) and data warehousing workloads by up to 100x. SQL Server 2014 also provides the best platform for hybrid cloud scenarios, like cloud backup and cloud disaster recovery, and significantly simplifies the on-ramp to the cloud for our customers with new point-and-click experiences for deploying cloud scenarios in the tools that are already familiar to database administrators (DBAs).
  • Power BI for Office 365 is a new self-service BI solution delivered through Excel and Office 365 which provides users with data analysis and visualization capabilities to identify deeper business insights from their on-premises and cloud data.
  • Windows Azure SQL Database is a fully managed relational database service that offers massive scale-out with global reach, built-in high availability, options for predictable performance, and flexible manageability. Offered in different service tiers to meet basic and high-end needs, SQL Database enables you to rapidly build, extend, and scale relational cloud applications with familiar tools.
  • Windows Azure HDInsight makes Apache Hadoop available as a service in the cloud, and also makes the Map Reduce software framework available in a simpler, more scalable, and cost efficient Windows Azure environment.
  • Parallel Data Warehouse (PDW) is a massively parallel processing data warehousing appliance built for any volume of relational data (with up to 100x performance gains) and provides the simplest way to integrate with Hadoop. With PolyBase, PDW can also seamlessly query relational and non-relational data.
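The MapReduce model that HDInsight exposes via Hadoop is easy to sketch in miniature. A real job distributes the map, shuffle and reduce phases across a cluster and spills intermediate data to disk; the in-process toy below (a classic word count, with invented input) just shows the three phases:

```python
# Minimal in-process sketch of the MapReduce programming model that
# HDInsight exposes via Hadoop. A real job runs map, shuffle and reduce
# across a cluster; here all three phases run in one process.
from collections import defaultdict

def map_phase(documents):
    """Map: emit a (word, 1) pair for every word in every document."""
    for doc in documents:
        for word in doc.split():
            yield word.lower(), 1

def shuffle(pairs):
    """Shuffle: group all emitted values by key."""
    grouped = defaultdict(list)
    for key, value in pairs:
        grouped[key].append(value)
    return grouped

def reduce_phase(grouped):
    """Reduce: sum the grouped values per key."""
    return {key: sum(values) for key, values in grouped.items()}

docs = ["big data big insights", "data platform"]
counts = reduce_phase(shuffle(map_phase(docs)))
print(counts["big"], counts["data"])  # 2 2
```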

In-depth learning through live online technical events

To support the availability of these products, we’re offering live online events that will enable in-depth learning of our data platform offerings. These sessions are available now through the Microsoft Virtual Academy (MVA) and are geared towards IT professionals, developers, database administrators and technical decision makers. In each of these events, you’ll hear the latest information from our engineering and product specialists to help you grow your skills and better understand what differentiates Microsoft’s data offerings.

Here is a brief overview of the sessions that you can register for right now:

Business Intelligence

Faster Insights with Power BI Jumpstart | Register for the live virtual event on February 11

Session Overview: Are you a power Excel user? If you’re trying to make sense of ever-growing piles of data, and you’re into data discovery, visualization, and collaboration, get ready for Power BI. Excel, always great for analyzing data, is now even more powerful with Power BI for Office 365. Join this Jump Start, and learn about the tools you need to provide faster data insights to your organization, including Power Query, Power Map, and natural language querying. This live, demo-rich session provides a full-day drilldown into Power BI features and capabilities, led by the team of Microsoft experts who own them.

Data Management for Modern Business Applications

SQL Server in Windows Azure VM Role Jumpstart | Register for the live virtual event on February 18

Session Overview: If you’re wondering how to use Windows Azure as a hosting environment for your SQL Server virtual machines, join the experts as they walk you through it, with practical, real-world demos. SQL Server in Windows Azure VM is an easy and full-featured way to be up and running in 10 minutes with a database server in the cloud. You use it on demand and pay as you go, and you get the full functionality of your own data center. For short-term test environments, it is a popular choice. SQL Server in Azure VM also includes pre-built data warehouse images and business intelligence features. Don’t miss this chance to learn more about it.

Here’s a snapshot of the great content available to you now, with more to come later on the MVA data platform page:

Data Management for Modern Business Applications

Modern Data Warehouse

For more courses and training, keep tabs on the MVA data platform page and the TechNet virtual labs as well.

Thanks for digging in.

Eron Kelly
General Manager
Data Platform Marketing

———– 

1Market Analysis: Worldwide Relational Database Management Systems 2013–2017 Forecast and 2012 Vendor Shares, IDC report # 241292 by Carl W. Olofson, May 2013
2Business Intelligence and Analytics Will Remain CIO’s Top Technology Priority G00258063 by W. Roy Schulte | Neil Chandler | Gareth Herschel | Douglas Laney | Rita L. Sallam | Joao Tapadinhas | Dan Sommer 25 November 2013
3The Forrester Wave™: Enterprise Data Warehouse, Q4 2013, Forrester Research, Inc.,  December 9, 2013
4The Forrester Wave™: Enterprise Business Intelligence Platforms, Q4 2013, Forrester Research, Inc.,  December 18, 2013
5Gartner, Magic Quadrant for Operational Database Management Systems by Donald Feinberg, Merv Adrian and Nick Heudecker, October 21, 2013.
Disclaimer:
Gartner does not endorse any vendor, product or service depicted in its research publications, and does not advise technology users to select only those vendors with the highest ratings. Gartner research publications consist of the opinions of Gartner’s research organization and should not be construed as statements of fact. Gartner disclaims all warranties, expressed or implied, with respect to this research, including any warranties of merchantability or fitness for a particular purpose.

Free Power BI Training – Microsoft Virtual Academy Jump Start [“A Story of BI, BIG Data and SQL Server in Canada” Blog, Feb 5, 2014]

Whether you’re a power Excel user or you’re just trying to make sense of ever-growing piles of data, we have a great day long, free online training session for you on Power BI for Office 365.

This live, demo rich training will provide sessions covering key Power BI features and capabilities and help you learn about the tools you need to provide faster data insights to your organization. 

Course Outline:

  • Introduction to Power BI
  • Drilldown on Data Discovery Using Power Query
  • The Data Stewardship Experience
  • Building Stellar Data Visualizations Using Power View
  • Building 3D Visualizations Using Power Map
  • Understand Power BI Sites and Mobile BI
  • Working with Natural Language Querying Using Q&A
  • Handling Data Management Gateway
  • Get Your Hands on Power BI

Sign Up for this Microsoft Virtual Academy Jump Start, led by the team of Microsoft experts who own these features.

Live Event Details

  • February 11, 2014
  • 9:00am-5:00pm PST
  • What time is this in my time zone?
  • What: Fast-paced live virtual session
  • Cost: Free
  • Audience: IT Pro
  • Prerequisites: For data analysts, Excel power users, or anyone looking to turn their data into useful business information.
  • Register Now>>

Interview with Marc Reguera, Director of Finance at Microsoft [MSCloudOS YouTube channel, Feb 10, 2014]

Hear directly from Marc Reguera, Director of Finance at Microsoft and BI champion, about how Power BI is changing the way finance works inside Microsoft.

Power BI Webinar Series [MSFT for Work Blog, Jan 22, 2014]

Big data scientists and the finance department haven’t always seen eye to eye in most companies. Now is your chance to embrace big data and free your finance department to focus on the work that adds the most value.

You are invited to join Microsoft Finance Director Marc Reguera and members of the Microsoft finance leadership team to find out what they did to become a more empowered and influential finance organization. The powerful new business intelligence tools they will demonstrate have been under wraps for almost two years and have so far only been used within Microsoft.


Now the tools have been road-tested and are ready for you to try. Grab your chance to learn how the Microsoft new BI tools will help your business not only adapt to the world of big data, but actually thrive in it.

Register for any and all of the webinars you are interested in:

1/23/14: Visualization: See how these powerful new tools have improved Microsoft’s ability to consume big data and develop insights by simplifying the data and using visualization tools. Register here.

1/30/14: Definitions: Get the best practices for creating and aligning behind a common set of data definitions and taxonomies. Learn how to get everyone on the same page. Register here.

2/13/14: Outsourcing: Learn how Microsoft worked with partners to optimize and outsource non-strategic finance tasks so the organization could focus on high-value activities. Register here.

2/20/14: Cloud collaboration: Learn how your organization can focus more time on delivering business insights by using Power BI and Microsoft Office 365. Register here.

3/6/14: Making things easy to comprehend without making them simplistic: See how Microsoft finance teams consume and analyze millions of rows of data and present their analysis in a narrative that’s easy to understand for multiple audiences. Register here.

Taken together, this series of webinars will help your company’s finance department adapt to a world of rapidly shifting paradigms and to an era of big data that, without the right tools, can be overwhelming.

Business Intelligence: “The Eyes and Ears of Your Business” [Microsoft for Work Blog, Jan 30, 2014]

Businesses are collecting more data than ever before, and technology is making that process increasingly easier and more affordable. The challenge for business owners is 1) how to quickly turn that raw data into actionable business insights, and 2) how to give more people within an organization access to those insights on a self-serve basis.
Organizations must have insight into how their operations are performing in order to stay competitive. Companies that successfully manage their big data assets are more profitable than companies not making this investment, says Jason Baick, Senior Product Marketing Manager at Microsoft. Simply put, “[business intelligence] is the eyes and ears of your business,” Baick says.
Release data from the IT department
Data analysis started off as a highly specialized process. “It was always a barrier to self-service information … the treasure trove of the data was locked up in the IT department,” Baick points out. Today there are easy-to-use data visualization tools that offer anyone within an organization access to real-time business insights.
Take the Microsoft Power BI suite, for example, which gives both businesses and individuals an easy-to-use platform for visualizing their data. Given that many businesses already have the infrastructure that Power BI is built on (e.g. Microsoft SharePoint) and a familiarity with its feature set, integration and adoption are simplified. “Your users don’t have an intimidation factor because they already know how to use Excel,” explains Baick. By equipping your employees with these types of tools, you can enable team members to unearth real-time insights, ranging from targeting a prospect at exactly the right time to make the sale, to determining where the company can cut costs, to revealing where it should invest more.

Here’s a rundown of specific Power BI tools and what they can offer your business:

  • Discover and Combine
    • Search and access all your company’s data and public data from one place using Power Query. Give your team the ability to be more efficient while cutting down on the cost of investing in multiple, disparate data tools.
  • Model and Analyze
    • Empower your employees to create analytical models using Power Pivot. Since this is built on familiar software like Excel, you won’t have to worry about the cost of training or having to hire new staff for implementation.
  • Visualize
Power View and Power Map enable your team members to quickly translate big data sets and create easy-to-understand visuals without a huge time investment.
  • Share and Collaborate
    • Seamlessly share and edit workbooks from any device, allowing your employees quick and easy access to important information in real time.
  • Get Answers and Insights
    • The new Q&A feature gives your employees the ability to ask any question of their data without requiring specialized skills to draw out these insights.
  • Access Anywhere
    • Give your staff access to the Power BI tool set from any device, any location. This empowers your employees to access data in real time, which could mean the difference between making and not making a sale.
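The flow the list above describes (discover and combine data, model it, then question it) is specific to the Power BI tools, but the underlying pattern is general. Here is a rough sketch of the same pattern in plain Python; note that Power Query and Power Pivot actually use their own languages (M and DAX), so none of this is Power BI code, and the regions and figures are invented for illustration:

```python
# Hypothetical sketch of the "discover -> combine -> model -> answer"
# pattern described above. NOT Power BI code; plain Python stands in
# to make the pattern concrete. All data is invented.
from collections import defaultdict

# "Discover and Combine": two disparate sources of regional data.
sales = [("East", 120), ("West", 80), ("East", 150), ("West", 95)]
targets = {"East": 250, "West": 200}

# "Model and Analyze": aggregate revenue and join it to the targets.
revenue = defaultdict(int)
for region, amount in sales:
    revenue[region] += amount
model = {r: {"revenue": v, "target": targets[r], "attainment": v / targets[r]}
         for r, v in revenue.items()}

# "Get Answers and Insights": ask a simple question of the combined data.
best = max(model, key=lambda r: model[r]["attainment"])
print(model)
print("Best-performing region:", best)  # East: 270/250 = 1.08 attainment
```

The point of the suite, as described above, is that these steps happen inside tools analysts already know (Excel), rather than in code.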
How are people using Power BI?
Companies like MCH Strategic Data are already employing the Power BI suite to get more out of their data. MCH collects an enormous amount of education and healthcare marketing data for its clients. After 85 years in the business, the company is now able to deliver new and unique insights to clients like never before. One application has been to use tools like Power Map to create data visualizations showing the geographic range of socioeconomic status across various school districts. They’ve also made subsets of their data available and searchable by Power BI users, including datasets on hospitals, school systems, and emergency preparedness services throughout the US.
Building a data-driven organization
Everyone at your company can contribute to uncovering business insights, and it’s important to give them the tools to do so. Using your data in a smart and strategic way enables you to turn it into actionable business insights and to stay ahead of the competition.

The Autonomy of Marketing with Big Data [Microsoft for Work Blog, Jan 15, 2014]

We spoke to Jeff Marcoux, Senior Product Marketing Manager for Dynamics CRM, about how big data and data insights have changed marketing. He outlined three ways that companies can use big data to reimagine their marketing efforts.

He also outlined an all-encompassing rule when using data insights for marketing efforts: it’s not about how much data you have, it’s what you do with it. “Large data makes graphs, but significant data tells a story,” said Marcoux. Learning how to leverage significant big data into actionable insights is the key to unlocking its potential as an asset to your business. Here are Jeff’s three key ways companies can do smart things with their data:

  1. Embrace the idea that autonomous marketing, or marketing that is auto-optimized and auto-customized according to customer insights and machine-generated learning, can reinvigorate marketing campaigns. The key being it’s a more responsive marketing campaign that continuously strengthens and adjusts itself.
  2. Use customer insights to create stronger sales-marketing partnerships by increasing positive brand awareness and generating more accurate information on qualified leads and revenue attribution. In other words, more insight contributing to less finger-pointing and, ultimately, greater partnerships. 
  3. Translate data into business impact by building custom sales kits appropriate for every opportunity and every customer, monitoring the end-to-end customer life cycle, and keeping customers hooked. After all, according to Marcoux, “existing customers are the best sellers.”

Data insights will help drive marketing at the deepest strategic levels, providing actionable insights that can constantly be measured against and refined. Remember, it’s not how much data you’ve got, it’s what you do with it. If your organization has started to use data insights in your marketing efforts, do you have any tips on how to better use data? Sound off in the comments!

Autonomous Marketing: Using data to perfectly personalize marketing efforts [Microsoft for Work Blog, Jan 30, 2014]

Personalization is the gold standard for marketing efforts. If you can connect with a customer on a personal level and demonstrate that you understand your audience, the customer is far more likely to respond to your marketing campaigns. It may seem like a daunting task to crunch that much customer information and automatically adapt it to your marketing efforts, but it doesn’t have to be. Technologies exist that allow you to update campaigns with new data (auto-optimize) and use that updated data to better target your efforts (auto-customize), removing the guesswork from your campaigns. Marketing that is auto-optimized and auto-customized based on customer insights and machine-generated learning—called “autonomous marketing”—is now a tangible reality for many businesses.
Autonomous marketing and big data will be critical in re-imagining a more personalized approach to marketing—and learning to harness this approach will keep your business ahead of the curve as marketing innovators.
Data, data everywhere…
The amount of data available today is overwhelming. Take, for example, a single business—just between the company’s website, Facebook page, and Twitter, there’s a lot to keep track of. All this information needs to be consolidated and combed to figure out which data is significant and what happens next. For many businesses the question becomes: what do I do with my data?
According to Jeff Marcoux, Senior Product Marketing Manager for Dynamics CRM, that data should be fed to an engine that’s automatically optimizing itself. In practice, this “auto-optimizing” capability translates into the ability to make campaign improvements in real-time. The result is a more responsive marketing campaign that continuously strengthens and adjusts itself to help dial in on more precise market segments and figure out what’s working.
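Marcoux doesn’t name the machinery behind such an engine, but one standard technique for this kind of real-time auto-optimization is a multi-armed bandit: route most traffic to the best-performing campaign variant while still exploring the others. A minimal epsilon-greedy sketch (the variant names and response rates here are invented for illustration; the article specifies no algorithm):

```python
# Minimal epsilon-greedy multi-armed bandit: one common technique behind
# "auto-optimizing" campaign engines. All names and numbers are invented.
import random

random.seed(42)

variants = ["subject_a", "subject_b", "subject_c"]
true_rate = {"subject_a": 0.02, "subject_b": 0.05, "subject_c": 0.03}  # hidden from the optimizer

shown = {v: 0 for v in variants}
clicked = {v: 0 for v in variants}
EPSILON = 0.1  # fraction of traffic reserved for exploring other variants

def observed_rate(v):
    return clicked[v] / shown[v] if shown[v] else 0.0

def pick_variant():
    if random.random() < EPSILON:
        return random.choice(variants)       # explore: try any variant
    return max(variants, key=observed_rate)  # exploit: show the current best

for _ in range(20000):
    v = pick_variant()
    shown[v] += 1
    clicked[v] += random.random() < true_rate[v]  # simulated customer response

winner = max(variants, key=observed_rate)
print("winner:", winner, "| impressions:", shown)
```

The loop is the "continuously strengthens and adjusts itself" behavior described above: every impression updates the estimates, so the campaign shifts toward what is working without anyone re-running an analysis by hand.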
Getting personal
The clincher, once you’ve homed in on those market segments, is auto-customizing marketing campaigns down to the individual level. “Customers are already so far down the buying cycle when they get to you (nearly 57%), and getting personal is the only way to land your message and have it resonate with consumers,” said Marcoux. Once that same engine is automatically tailoring marketing efforts based on data insights, you’ll know you’ve crossed over into today’s gold standard for marketing—personalization perfection.
“Autonomous marketing is a beast,” said Marcoux, “once it gets going you just have to pay attention and keep feeding it.” The autonomous marketing beast metabolizes content and, so long as it’s fed plenty of “healthy” (significant) data, it will do its job to improve marketing. In turn, you will gain valuable insight into revenue performance and ROI; this way, you can pinpoint which marketing maneuvers were converted into real business impact.
A healthy beast, a happy business
The success of autonomous marketing relies on two things: the quality of the data it’s fed and whether you take advantage of the insights it offers. A responsive, personalized approach to marketing is where we’re headed—are you doing everything to make sure your business is headed there too?

The Human Side of Autonomous Marketing [Microsoft for Work Blog, Jan 30, 2014]

How do you retain the creative side of marketing when big data and autonomous marketing inevitably change the way marketers work? Data insights enhance the efficacy of your marketing efforts; however, human input is always necessary to decipher big data. Autonomous marketing, used to enable marketers and nail down effective marketing campaigns, is the secret to realizing business impact.
Metrics for the Mind
The application of autonomous marketing is a necessary next step in meeting a new demand, but it doesn’t supplant the need for marketers in the flesh. According to Jeff Marcoux, Senior Product Marketing Manager for Microsoft Dynamics CRM, marketers will never be forced to relinquish their instincts and creativity—their marketing guts—because analytics, data, and insight help fuel creativity.
“The main reason I say that,” said Marcoux, “is because there are always going to be new channels, and marketers have to come up with new ways to use them.” Take, for example, the exodus of college-age students from Facebook (which Marcoux attributes to the fact that their parents are on it) to something more like Snapchat. Although data may shed some insight on the shift, it’s up to marketers to take advantage of it in a creative way (e.g., showing loyal fans a secret menu or product announcement before the rest of the world gets to see it).
Take Colorado University’s online program at its Anschutz Medical Campus, which faced the challenge of remaining competitive and reaching potential students on their own terms. CU used Microsoft Dynamics CRM to identify what potential students liked, the media they consumed, and the social networks they used—processes that would normally take marketers months of research—and automated them so its marketing team could focus on killer campaigns that would engage the potential students they did find. The result? Increased student retention and recruitment.
Coming up with the emotional content that drives a campaign is where the creativity and experience come in. Marcoux sees autonomous marketing as a way to free up marketers to do what they love—create and innovate—and, today, there’s plenty of opportunity to innovate as campaigns become increasingly personalized.
A Mind-Body Approach to Marketing
Customers don’t want to be just a number; they want to be known. “With social media, everything is personal and everything is online,” said Marcoux. “Hooking” modern consumers is a matter of building those personal, emotional relationships—identifying who they are and what their need is, educating them on a solution, and then ultimately providing that solution.
“We’ve seen that personalization come across in emails and social posts, but that’s all been enabled by big data,” said Marcoux. Customers are already so far down the buying cycle when they get to you (nearly 57%), and getting personal is the only way to land your message and have it resonate with consumers these days.
Autonomous marketing powered by data insights helps marketers gather and combine information from many different sources in order to figure out what content is working. This way, marketers can focus on what is actually selling their product rather than getting petrified by what Marcoux calls “analysis paralysis,” or the misinterpretation and incorrect analysis of data.
Ultimately, autonomous marketing is a way to deal with the deluge of social data and other information to help marketers do their job better. Reimagining marketing, according to Marcoux, is a matter of using big data to narrow in on those granular market segmentations and continuing to fine-tune an effective, personalized marketing approach that will hook customers and keep them hooked.


2. Microsoft’s vision of the unified platform for modern businesses

THE BIG PICTURE: Microsoft Cloud OS Overview [MSCloudOS YouTube channel, Jan 21, 2014]

Tune into this bite-size video in which Microsoft General Manager Gavriella Schuster provides an overview of the Microsoft Cloud OS and how the underlying technologies – Windows Server, System Center, Windows Azure, Microsoft SQL Server, and Windows Intune – can help you cloud-optimize your business today. Interested in learning more? Visit our Microsoft Cloud OS homepage: http://www.microsoft.com/en-us/server-cloud/cloud-os/. Ready to try these solutions and experience the benefits first hand? Contact your Microsoft account manager or partner to schedule an Immersion experience today.

Business Insights Newsletter Article | October 2013

MICROSOFT DEFINES THE CLOUD WITH ONE WORD – VALUE

When conversation turns to cloud computing, there is a lot of noise. Press, vendors, analysts, bloggers and others deliver opinions on what a successful cloud strategy entails.
Converging technologies such as Big Data, Mobility, BYOD and Social are transforming how businesses operate and compete, and all of them rely on the cloud as a critical enabler. Cloud itself is considered an emerging megatrend, representing a real opportunity for IT to introduce more efficiency across every operational line of business.
“The modern workforce isn’t just better connected and more mobile than ever before, it’s also more discerning (and demanding) about the hardware and software used on the job. While company leaders around the world are celebrating the increased productivity and accessibility of their workforce, the exponential increase in devices and platforms that the workforce wants to use can stretch a company’s infrastructure (and IT department!) to its limit.”
Brad Anderson, Corporate Vice President, Microsoft
Microsoft believes that cloud is quite simply about a single concept – value. In this article we will share how Microsoft helps you realize the value of cloud, why Windows Server is best suited to take you on the journey, and let you hear how luxury car-maker Aston Martin transformed their business and their IT department by using a Windows Server hybrid strategy.
Your Journey to the Cloud
The true value of cloud is the opportunity for IT to get all the benefits of scale, speed, and agility while still protecting existing investments.
Cloud better enables the introduction of the megatrends of Big Data, Social and Mobile by providing answers that help IT manage risk while delivering quality services and applications quickly, efficiently, and securely. As organizations start their journey to the cloud, they typically grapple with a combination of traditional on-premises and cloud-based solutions; however, these hybrid scenarios have the potential to introduce new complications. Working with multiple versions of conflicting operating systems, management tools and applications is usually counter-productive and results in staff frustration, departmental inefficiencies and poor productivity. To be successful, teams need a way to consistently manage, support and automate the datacenter.
Microsoft Cloud OS Vision Begins with Windows Server 2012 R2
“There are multiple ways for customers to think about how they provision their infrastructure, and we aim to enable an ‘and’ philosophy for our customers so they don’t have to think that it’s an either/or decision. We allow them to take servers and other technology they are running on premises and think about how they might want to move some of it into cloud services, while still having a consistent level of management, identity and security.”
Gavriella Schuster, Microsoft GM US Server Tools
Organizations can begin to realize tremendous value with cloud when they leverage the ability to operate and manage a converged infrastructure that shares a common operating system and set of tools across hybrid environments supporting an assortment of devices, applications and users.
Figure 1: Windows Server Delivers Value with a Unified Hybrid Environment
At the heart of the Microsoft Cloud OS vision is Windows Server 2012 R2. With Windows Server 2012 R2, Microsoft’s experience delivering global-scale cloud services enables organizations of all sizes to take advantage of new features and enhancements across virtualization, storage, networking, virtual desktop infrastructure, access and information protection, and more.
The value of standardizing on Windows Server 2012 R2 as your Cloud OS strategy includes:
Experience Enterprise-class Performance and Scale

  • Take advantage of even better performance and more efficient capacity utilization in your datacenter.
  • Increase the agility of your business with a consistent experience across every environment.
  • Leverage the proven, enterprise-class virtualization and cloud platform that scales to continuously run your largest workloads while enabling robust recovery options to protect against service outages.

Drive Bottom Line Efficiencies with Cost Savings and Automation

  • Enjoy resilient, multi-tenant-aware storage and networking capabilities for a wide range of workloads.
  • Re-deploy your budget to other critical projects with the cost savings delivered through a Windows Server 2012 R2 Cloud OS.
  • Automate a broad set of built-in management tasks.
  • Simplify the deployment of major workloads and increase operational efficiencies.

Unlock Competitive Advantage with Faster Application Deployment

  • Build, deploy and scale applications and web sites quickly, and with more flexibility than ever before.
  • Unlock improved application portability between on-premises environments and public and service provider clouds, in concert with Windows Azure VM and System Center 2012 R2, making it simple to rapidly shift your critical applications virtually anywhere, anytime.
  • Increase the flexibility and elasticity of IT services with the Windows Server 2012 R2 platform for mission-critical applications while protecting existing investments with enhanced support for open standards, open source applications and various development languages.

Empower Users with Better Access Anywhere

  • Windows Server 2012 R2 makes it easier to deploy a virtual desktop infrastructure, making it possible for users to access IT from virtually anywhere, providing them a rich Windows experience while ensuring enhanced data security and compliance.
  • Lower storage costs significantly by supporting a broad range of storage options and VHD de-duplication.
  • Easily manage your users’ identities across the datacenter and into the cloud to help deliver secure access to corporate resources.

In Summary

The datacenter is the hub for everything IT offers to the business: storage, networking and computing capacity. The right Cloud OS strategy enables IT to transform those resources into a datacenter that is capable of handling changing needs and unexpected opportunities. With Windows Server 2012 R2, Microsoft offers a consistent operating system and set of management tools that acts and behaves in exactly the same manner across every setting. Windows Server 2012 R2 delivers the same experience and requires the same skill sets and knowledge to manage and operate in any environment, and it delivers a “future-proof” roadmap with a fully seamless and scalable platform, making organizations agile, nimble and ready.

Highly scalable, Windows Server 2012 is already powering many of the world’s largest datacenters – including Microsoft’s – proving out capabilities at cloud scale and then delivering them for the enterprise. With the latest release of Windows Server 2012 R2, Microsoft is redefining the server category, delivering hundreds of new features and enhancements spanning virtualization, networking, storage, user experience, cloud computing, automation, and more. The goal of Windows Server 2012 R2 is to help organizations transform their IT operations to reduce costs and deliver a whole new level of business value.

Aston Martin Uses Windows Server 2012 to Drive IT Transformation

Behind every luxury sports car produced by Aston Martin is a sophisticated IT infrastructure. The goal of the Aston Martin IT team is to optimize that infrastructure so that it performs as efficiently as the production line it supports. To meet that goal, Aston Martin has standardized on Microsoft technology. The IT team chose the Windows Server 2012 operating system, including Hyper-V technology, to virtualize its data center and build four private clouds to dynamically allocate IT resources to the business as needed. For cloud and data center management, Aston Martin uses Microsoft System Center 2012.

“The IT team’s purpose is to enable Aston Martin to build the most beautiful sports cars in the world. So, from servers, to desktops, to production line PCs, Microsoft technology is behind everything we do.”

Daniel Roach-Rooke, IT Infrastructure Manager, Aston Martin

Watch this short video to learn how the team at Aston Martin envisioned and executed on their strategy.

Watch the Aston Martin video now

Call to Action

With Windows Server Data Center 2012 R2 set to release in November, now is the time to see your Microsoft licensed solution provider for information about software savings.

MSCloudOS YouTube supersite:

Microsoft’s Cloud OS home on YouTube, where you can find the latest products & solutions news, demos, and training videos for Windows Server, SQL Server, System Center, Windows Intune, Microsoft BI, and Windows Azure—the technologies that bring Microsoft’s vision of Cloud OS to life.

Evolving IT in the Era of the Cloud OS [June 3, 2013] Today’s massive technology shifts are creating new demands on IT. Learn how Microsoft hybrid cloud solutions deliver new innovations that can help you solve the challenges you face now.

The Enterprise Cloud Era [June 3, 2013] See Microsoft President Satya Nadella talk about Microsoft’s cloud-first approach.

TechEd North America 2013 Keynote [June 24, 2013] Despite sea changes in cloud computing, device proliferation, and the explosion of data, IT pros and developers still live for one simple thing: to deliver amazing experiences for their customers and end-users. In this keynote, Brad Anderson will unveil a broad set of new capabilities across the full suite of Microsoft Cloud OS products and technologies designed with that simple end goal in mind. Together with enterprise-optimized enhancements to the Windows 8 client, the advances that Brad will showcase in this keynote significantly advance Microsoft’s long-term effort to give you the most advanced and comprehensive set of services, products, and technologies in the industry. Learn how Windows 8 is ready for business, how Windows Azure is changing hybrid and private cloud computing, and how the world of modern application development is evolving. It’s time to embrace the challenges of a world full of risks and opportunities. See what Microsoft is delivering next, including new enterprise enhancements in the upcoming Windows 8.1 update, and learn what it means for your business as well as your career.

TechEd Europe 2013 Keynote [June 26, 2013] In an era of global technological change, IT pros and developers still live for one basic thing: to deliver amazing experiences. In this keynote from TechEd Europe 2013, Corporate Vice President Brad Anderson will detail Microsoft’s strategy to help customers achieve that simple goal by leveraging new innovations in cloud services, device management, application development, data insights, and datacenter evolution. Mr. Anderson will review a broad set of newly announced updates across the full suite of Microsoft Cloud OS products and technologies, including Windows Server, Microsoft System Center, Windows Azure, SQL Server, Visual Studio, and more. It’s time to embrace the challenges of a world full of new opportunities. See what Microsoft is delivering next and learn what it means for your business as well as your career.

Microsoft Keynote Highlights from Oracle OpenWorld 2013 Watch highlights from Microsoft Corporate Vice President Brad Anderson’s keynote address from Oracle OpenWorld 2013 as Brad discusses the Cloud OS vision and how Microsoft and Oracle are working together to bring the power of Oracle’s software to private/public cloud and service providers. This new partnership allows customers using Java, Oracle WebLogic Server and Oracle Database to run this software on Windows Azure and Windows Hyper-V.

Subsites:

SQL Server (YouTube)
Windows Server (YouTube)
System Center & Windows Intune (YouTube)
BI (YouTube)
Case Studies (YouTube)

                  A People-centric Approach to Mobile Device Management [In The Cloud Blog, Jan 29, 2014]

                  The following post is from Brad Anderson, Corporate Vice President, Windows Server & System Center.


                  It’s been a little while since I wrote about the work we are doing around the BYO and Consumerization trends – but this is an area I will be discussing much more often over the next several months.

                  Consumerization is an area that is changing and moving quickly, and I believe the industry is also at an important time where we really need to step back and define what our ultimate destination looks like.

                  I think there is a great deal of agreement across the industry on what we are all trying to accomplish – and this is aligned with Microsoft’s vision. Microsoft’s vision is to enable people to be productive on all the devices they love while helping IT ensure that corporate assets are secure and protected.

                  One particular principle that I am especially passionate about is the idea that the modern, mobile devices which are built to consume cloud services should get their policy and apps delivered from the cloud. Put another way: Modern mobile devices should be managed from a cloud service.

                  One of the reasons I am such a big believer in this is the rapid pace at which new devices and updates to the devices are released. Enabling people across all the devices they love brings with it the need to stay abreast of the changes and updates happening across Windows, iOS, and the myriad of Android devices. By delivering this as a service offering, we can stay on top of this for you. Thus, as changes are needed, we simply update the service and the new capabilities are available for you. This means no longer needing to update your on-premises infrastructure – we take care of all of it for you.

                  System Center Configuration Manager is the undisputed market leader in managing desktops around the world, and now we are delivering many of our MDM/MAM capabilities from the cloud. We have deeply integrated our Intune cloud service with ConfigMgr so organizations can take advantage of managing all of their devices in one familiar control plane using their existing IT skills.  Put simply:  We are giving organizations the choice of using their current ConfigMgr console extended with the Intune service, or doing everything from the cloud using only Intune if they wish to do management without an on-premises infrastructure.

                  On a fairly regular basis I encounter the question of whether cloud-based management is robust enough for enterprise organizations. My response often surprises our partners and customers with just how powerful a cloud-based solution can be. The answer is a resounding, “Heck yes it is robust and secure enough!”

                  Windows Intune and Windows Azure Active Directory put IT leadership in the driver’s seat by allowing an organization to define and manage user identities and access, operate a single administrative console to manage devices, deliver apps, and help protect data.

                  The result is employee satisfaction, a streamlined infrastructure, and a more efficient IT team – all with existing, familiar, on-prem investments extended to the cloud.

                  This holistic approach is central to Microsoft’s strategy to help organizations solve one of the most complex and difficult tasks facing IT teams today: Mobile device management (MDM).

                  As I discussed on the GigaOM Mobilize panel back in October (on the topic of “The Future of Mobile and the Enterprise,” recapped here), it wasn’t that long ago that an IT department worked in a pretty homogenous hardware and software environment – essentially everything was a PC. Today, IT teams are responsible for dozens of form factors and multiple platforms that require specific processes, skills, and maintenance.

                  Helping organizations proactively manage this new generation of IT is what makes me so excited about the advancements and innovation we are delivering as a part of next week’s update to the Windows Intune service. These updates include:

                  • Support for e-mail profiles that can configure a device with the correct e-mail server information and related policies – and it can also remove that profile and related e-mail via a remote wipe.
                  • In addition to our unified deployment mode and integration with System Center Configuration Manager, Windows Intune can now stand alone as a cloud-only MDM solution. This is a big win for organizations that want a cloud-only management solution to manage both their mobile devices and PCs.
                  • There is also support for new data protection settings in iOS 7 – including the “managed open in” capability that protects corporate data by controlling the apps and accounts that can open documents and attachments.
                  • This update also enables broader protection capabilities like remotely locking a lost device, or resetting a device’s PIN if forgotten.

                  Windows Intune offers simple and comprehensive device management, regardless of the platform, for the devices enterprises are already using, with the IT infrastructure they already own.

                  Looking ahead to later this year, we will continue to launch additional updates to the service including the ability to allow/deny apps from running (or accessing certain sites), conditional access to e-mail depending upon the status of the device, app-specific restrictions regarding how apps interact and use data, and bulk enrollment of devices.

                  This functionality is delivered as part of the rapid, easy-to-consume, and ongoing updates that are possible with a cloud-based service.

                  Today’s announcements are just a small example of the broader set of innovations Microsoft has been developing. Our focus on a people-centric approach to solving consumerization challenges has led to a number of product improvements and updates like:

                  The number of factors at work within this Consumerization of IT trend makes it clear that to address it effectively we have to think beyond devices and focus on a broader set of challenges and opportunities.

                  Microsoft is in a unique position to address the holistic needs behind this industry shift with things like public cloud management, private cloud management, identity management, access management, security, and more.

                  For organizations that haven’t already evaluated Microsoft’s device management solutions – now is the time. With the rapid release and innovation cycle offered by a cloud-based service like Intune, keeping your infrastructure optimized, efficient, and secure has never been easier.

                  The Virtuous Cycle of Cloud Computing [In The Cloud Blog, Jan 29, 2014]

                  The following post is from Brad Anderson, Corporate Vice President, Windows Server & System Center.

                  In the Day 1 keynote at the recent re:Invent conference, there was an interesting point made about the virtuous cycle that can occur for the cloud vendor and for customers. As I listened to the keynote, I kept thinking: “They are missing the biggest benefit for the entire industry; if the public cloud vendor has the right strategy and is thinking about how to benefit the largest population possible, then they are completely missing how this virtuous cycle can grow to benefit every organization in the world – even if they are not using the public cloud.”

                  Let me explain a bit more about what I mean.  (And, before I get too much farther along, I want to note that this post ties into the cool news yesterday about our work with the Open Compute Project.)
                  The virtuous cycle of a public cloud looks a lot like the image below. As the usage of the public cloud grows, you need more hardware to meet demand – and for sustained growth you will need a lot of hardware. This need for hardware increases your purchasing power, and you can then negotiate lower prices as you purchase in bulk. As your purchasing power grows and your costs drop, you pass those savings on to your customers by dropping your prices. The lower prices increase demand, and the virtuous cycle continues.
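                  The feedback loop just described can be made concrete with a toy simulation. Every number below (starting price, scale discount, demand response) is a hypothetical illustration, not actual Azure economics; the sketch only shows the shape of the cycle, in which cumulative volume lowers unit price and lower prices raise demand:

```python
# Toy model of the public-cloud "virtuous cycle" described above.
# All numbers (starting price, scale discount, demand response) are
# HYPOTHETICAL illustrations, not actual Azure economics.
price = 100.0        # unit price charged to customers
demand = 1_000.0     # units consumed per period
total_volume = 0.0   # cumulative hardware purchasing volume

for period in range(5):
    total_volume += demand
    # Bulk purchasing power: unit price falls as cumulative volume grows.
    price = 100.0 / (1.0 + total_volume / 10_000.0)
    # Lower prices stimulate more demand, and the cycle continues.
    demand = 1_000.0 * (100.0 / price)
    print(f"period {period}: price={price:.2f}, demand={demand:.0f}")
```

                  Run over five periods, the price falls monotonically while demand grows – exactly the loop the diagram depicts.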
                  For customers using the public cloud, they can see the benefits of this virtuous cycle (the lower prices) – but what about organizations that are also using private and hosted clouds? How can they gain benefits from what is happening?
                  Organizations with multiple clouds can benefit if (and only if!) that public cloud vendor has at the core of its strategy an intention to take everything that it is learning from operating that public cloud and deliver it back for use in datacenters around the world – not just in its own.
                  This is where Microsoft is so unique! Microsoft is the only organization in the world operating a globally available, at-scale public cloud that delivers back everything it is learning for use in datacenters of every customer (and, honestly, every competitor). Our view is the learning that we are getting from the public cloud should be delivered for all the world to benefit.
                  This innovation can be seen in products like Windows Server, System Center, and the Windows Azure Pack, where these public cloud learnings are applied – and these products are the only cloud offerings that are consistent across public, hosted and private clouds, ensuring customers avoid cloud lock-in, maximize workload mobility, and have the flexibility to choose the cloud that best meets their needs.
                  With this in mind, I want to show you how I think the virtuous cycle can and should look – and how it can benefit any organization in the world.
                  First, at the center of this virtuous cycle is incredible innovation. This means innovation in software, innovation in hardware, and innovation in processes. When you are ordering and deploying hundreds of thousands of new servers and xx bytes of storage every year, you have to innovate everywhere or you will literally buckle under the demands and costs of procuring, deploying, operating, and retiring hardware at this scale.
                  Microsoft is addressing this challenge in the most direct and complete way possible: Over the last three years, Microsoft has spent more than $15B building datacenters around the world and filling them with the hardware and capacity demanded by customers of Windows Azure and other Microsoft cloud services.
                  We keep our public cloud costs low by managing our supply chain for this kind of capacity, and, per the cycle, we pass these savings to you. We also carefully track things like the number of days from when we place an order for hardware to the time the order appears on our docks (“order-to-dock”), and then we track the number of hours/days from “dock-to-live” where we literally have customers’ workloads being hosted on that hardware. Throughout this process we set aggressive quarterly targets and we work constantly to consistently drive those numbers down. If we didn’t have a best in class product and performance, it would be impossible to remain profitable at this kind of scale.
                  As you can imagine, after spending billions of dollars on hardware every year, we are highly incented (to put it lightly) to find ways to drive our hardware costs down. The single best way we have found to do this is to use software to do things traditionally handled by hardware. For example, in Windows Azure we are able to deliver highly available, globally available storage at incredibly low prices through software innovations like SDN – all of which is based on low-cost, direct-attached storage. This brings storage economics never before seen in the industry.
                  One example of this is the most common workload hosted in Azure: The “Web” workload. Whether it is Azure acting as the web tier for a hybrid application, or the entire workload being hosted in Azure, the web workload is a part of just about every application. This makes it a great place for innovation. In Azure we pioneered high-density web site hosting where we can literally host 5,000+ web sites on a single Windows Server OS instance. This dramatically reduces our costs, which in turn reduces your costs.
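                  A quick back-of-the-envelope calculation shows why that density matters. The 5,000-sites-per-instance figure comes from the post; the per-instance monthly operating cost is a purely hypothetical placeholder:

```python
# Back-of-the-envelope cost-per-site arithmetic for high-density web hosting.
# The 5,000 sites-per-OS-instance figure comes from the post; the
# $500/month per-instance operating cost is a HYPOTHETICAL placeholder.
INSTANCE_COST_PER_MONTH = 500.0  # hypothetical fully-loaded cost of one OS instance

def cost_per_site(sites_per_instance: int) -> float:
    """Monthly hosting cost attributed to a single web site."""
    return INSTANCE_COST_PER_MONTH / sites_per_instance

# Traditional low-density hosting vs. the high-density model described above.
low_density = cost_per_site(50)     # 50 sites per instance    -> $10.00/site
high_density = cost_per_site(5000)  # 5,000 sites per instance -> $0.10/site
print(f"low density:  ${low_density:.2f}/site/month")
print(f"high density: ${high_density:.2f}/site/month")
```

                  Whatever the real instance cost is, packing 100x more sites onto one OS instance divides the per-site cost by the same factor.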
                  At Microsoft, we think the public cloud’s virtuous cycle can actually get a lot bigger, a lot more functional, and a lot more powerful by integrating service providers and hosted clouds.

                  [image: the expanded virtuous cycle, integrating service providers and hosted clouds]

                  Not only is this expanded virtuous cycle more practical, I’m sure it also looks familiar – much of it is already up and running in your organization.

                  There are some pretty solid examples of innovation that was pioneered in Azure and then brought to the whole industry for use everywhere through Windows Server and System Center:

                  • For highly available, low-cost direct attached storage, in Windows Server 2012 we shipped a set of capabilities we call Storage Spaces. Storage Spaces delivers the value of a SAN on low-cost, direct-attached storage, and it has been widely recognized as one of the most innovative new capabilities in Windows Server – and it was significantly updated in Windows Server 2012 R2.
                  • Service Bus provides a messaging queue solution in the public cloud that can be used by developers for things like a queuing system across clouds and building loosely coupled applications. Check this post for an in-depth review of Service Bus. Service Bus also ships as a component of the Windows Azure Pack – providing value pioneered in the public cloud for use in private and hosted clouds.
                  • Earlier I referenced the ability to host 5,000+ web sites on a single Windows Server OS instance. This has had an obvious economic impact on the costs of Windows Azure, where we host millions of web sites. We proved that capability in Windows Azure, battle-hardened it, and now it ships for customers around the world to use in their datacenters as a part of what we call the Windows Azure Pack (WAP).
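                  The loose coupling that the Service Bus bullet describes can be sketched with Python’s standard-library queue. This shows only the shape of the pattern; it is not the Service Bus SDK, whose actual API differs:

```python
import queue
import threading

# A minimal producer/consumer sketch of the loose-coupling pattern a message
# queue such as Service Bus enables. This uses only Python's stdlib; it is
# NOT the Service Bus API, just the shape of the pattern.
messages: "queue.Queue[str]" = queue.Queue()

def producer() -> None:
    # The producer only knows about the queue, not about any consumer.
    for order_id in ("order-1", "order-2", "order-3"):
        messages.put(order_id)
    messages.put("STOP")  # sentinel to end the demo

processed: list[str] = []

def consumer() -> None:
    # The consumer drains the queue at its own pace, deployed independently.
    while True:
        msg = messages.get()
        if msg == "STOP":
            break
        processed.append(msg)

t = threading.Thread(target=consumer)
t.start()
producer()
t.join()
print(processed)  # the three orders, in the order they were queued
```

                  Because producer and consumer share only the queue, either side can be scaled, replaced, or moved to a different cloud without the other noticing – which is the point of a loosely coupled application.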
                  This is what it looks like when the complete virtuous cycle is in effect.
                  Our efforts haven’t been limited to software, however. Our innovative work with hardware in our datacenters has driven down costs while at the same time increasing the capacity each core and processor can support.
                  Our work with hardware was highlighted yesterday when we announced that we are joining the Open Compute Project and contributing the full design of the server hardware we use in Azure. We refer to this design as the “Microsoft cloud server specification.” The Microsoft cloud server specification provides the blueprints for the datacenter servers we have designed to deliver the world’s most diverse portfolio of cloud services at global scale. These servers are optimized for Windows Server software and can efficiently manage the enormous availability, scalability and efficiency requirements of Windows Azure, our global cloud platform.
                  This design spec offers dramatic improvements over traditional enterprise server designs: We have seen up to 40% server cost savings, 15% power efficiency gains, and a 50% reduction in deployment and service times. We also expect this server design to contribute to our environmental sustainability efforts by reducing network cabling by 1,100 miles and metal by 10,000 tons.

                  This level of contribution is unprecedented in the industry, and it hasn’t gone unnoticed by the media:

                  • Wired: Microsoft Open Sources Its Internet Servers, Steps Into the Future
                  • Forbes: The Worm Has Turned – Microsoft Joins The Open Compute Project

                  These are just a couple examples of innovation that is happening here at Microsoft – innovations in process, hardware and software.

                  At Microsoft, we recognize that the majority of organizations are going to use multiple clouds and will want to take advantage of Hybrid Cloud scenarios. Every organization is going to have their own unique journey to the cloud – and organizations should make decisions about cloud partners that truly enable them with the flexibility to use multiple clouds, constant innovation, and consistency across clouds.

                  This is an area that we focus on every day, and you can read more about it as a part of our ongoing, in-depth series, Success with Hybrid Cloud.

                  Vendor Spotlight: A Microsoft GM On New Midmarket IT Tools [Exchange Events, Vendor Spotlight, April 23, 2013]

                  Mr. MidmarketCIO had the opportunity to sit down with Gavriella Schuster, Microsoft’s general manager of the company’s U.S. server and tools business unit. In this interview, Schuster shares her views on the challenges midmarket businesses face today and Microsoft’s vision to address those challenges with the Cloud OS.

                  MES: Can you share with me a little about Microsoft’s vision of the cloud today and how it can address today’s IT challenges for midmarket customers?
                  Schuster: Customers face many challenges today with the new levels of mobility in their workforce and the new devices that enable mobility. This new level of consumerization has enabled avid use of technology with an always-on connectivity.  There are also many more applications available and an explosion of data to manage. All of these things really challenge customers to reconsider how they provision, secure and enable technology within their organization.
                  There are multiple ways for customers to think about how they provision their infrastructure, and we aim to enable an ‘and’ philosophy for our customers so they don’t have to think that it’s an either/or decision. We allow them to take servers and other technology they are running on premises and think about how they might want to move some of it into cloud services, while still having a consistent level of management, identity and security.
                  Our vision for the ‘Cloud OS’ is to really have the best of both worlds. It’s an easy-on/easy-off usage of the cloud that meets the needs of midmarket organizations and can be an extension of current server environments.
                  MES: Is Microsoft’s ‘Cloud OS’ synonymous with Windows Server 2012? Or does it include other Microsoft technologies?
                  Schuster: Windows Server 2012 is certainly the basis of the Cloud OS because it provides the primary framework for identity, access, security and manageability, and also provides that core virtualization layer. Windows Server 2012 is also the basis of Windows Azure, our public cloud platform, so it gives midmarket CIOs the ability to easily extend their on-premises datacenter to the public cloud using a common set of tools between the two. The other core technology in the Cloud OS is Microsoft System Center 2012 because it gives customers that common level of additional management where they can set policies, provision their workloads, get deep application insights, etc. regardless of where the workload is actually running—on-premises or in the cloud.
                  MES: Where do you recommend customers start with their data-center modernization initiative? Why?
                  Schuster: For most customers, they should start with server virtualization. There is potential for them to get a tremendous amount of efficiency and consolidation of their applications through server consolidation.  They can virtualize upwards of 80 percent of all of the apps that they are running in their environment onto virtualized server environments, particularly in the midmarket. They may even be able to consolidate down to one to four servers and really take care of all of their workloads. Using Hyper-V as that virtualization framework and then using System Center Virtual Machine Manager to deploy that new virtual machine into their environment should be their first step to this approach.
                  MES: What are some of the new capabilities of Windows Server 2012 that go beyond virtualization to solve some common challenges?
                  Schuster: Windows Server 2012 not only helps midmarket organizations virtualize the compute—the virtualized machine itself—but it also helps them to virtualize their network and storage layers, which can be very costly capex investments for customers. It eliminates a lot of the common conflicts involved in managing an on-premise environment like IP and networking address conflicts. It also gives them additional storage so they don’t have to buy expensive SANs.
                  MES: A key trend challenging CIOs is mobility and the consumerization of IT. How does the Microsoft Cloud OS vision help address the security and management challenges around new devices and the need for increased mobility?
                  Schuster: I think it goes back to what I said before—we’ve enabled the ‘and’ so they can think about their governance role. There are a number of ways to address the consumerization of IT, and our primary message is that we think customers should embrace it. We enable them through Active Directory, which enables them to have a single sign-on experience and manage the identity of the user regardless of the environment the user is in (Office 365, Windows Azure, their on-premises environment, etc.)—This eliminates multiple pop-ups where the user has to continually sign in to the service.
                  We also have native functionality in Windows Server 2012 that eliminates the need for a VPN. With DirectAccess, they can now easily deliver access to corporate resources based on the user’s identity.
                  Lastly, they can set policies for the user experience based on the device that they are using—phone, home machine, work machine, etc.—and can manage those mobile devices from the cloud with Windows Intune, without having to do additional on-premises setup.
                  MES: You briefly talked about Windows Azure as part of the Microsoft Cloud OS. What workloads do you recommend customers think about moving to the public cloud first?
                  Schuster: I think the easiest thing for most customers to think about moving to the public cloud first is cloud storage—they can use it for backup, archiving and disaster recovery. Especially as a midmarket customer, the last thing they probably have is a separate site with another set of servers that are replicated and ready to do a transfer if something disastrous were to occur. That’s absolutely something that the cloud is available and ready for. And customers only have to pay for what they use— it’s consumption based. The other areas that they would probably want to use it for are application development and test environments and for business and data analytics.
                  MES: Microsoft has laid out a hybrid cloud strategy, with the same basic underpinnings for both private and public cloud. What’s the benefit to mid-market customers of adopting Microsoft’s hybrid approach and technologies?
                  Schuster: When we talk about a hybrid environment, there are two ways to think about it: One is that it’s a hybrid enterprise, meaning they have some workloads that are sitting on servers inside their organization while others are using some server capacity within a public cloud like Windows Azure; Second is having hybrid applications. One of the advantages of the cloud today is that it enables even the smallest companies to act and look like very large companies. Unlike in the past with on-premise servers, the cloud gives CIOs the capacity and capability to introduce a new service to the market where they don’t have to have a great forecast of what the demand might be. This has really opened up new doors for midmarket IT organizations.
                  MES: How can your ecosystem of partners help midmarket customers today?
                  Schuster: The midmarket IT customer will typically only have a handful of IT pros within their organization, so enabling them to focus on the business and building applications to help power the business vs. managing servers and infrastructure is a real business value to our midmarket customers—and our partner ecosystem is well set up to help them do that.
                  We have done a lot of work to train our partners on how to deliver both our on-premise Windows Server 2012 virtualization environment as well as our Windows Azure cloud environments, and we have services available that help our customers build new applications.

                  John W. Thompson, Chairman of the Board of Microsoft: the least recognized person in the radical two-men shakeup of the uppermost leadership

                  The issue raised in Will, with disappearing old guard, Satya Nadella break up the Microsoft behemoth soon enough, if any? [‘Experiencing the Cloud’, Feb 5, 2013] cannot be understood properly without understanding the significance of the change in the company’s board chairmanship. Otherwise misunderstandings will be rampant, like Microsoft May Be Undermining Nadella By Bringing Back Gates [a Forbes columnist, Feb 5, 2014], despite more reasonable opinions, like A Different Gates Is Returning to Microsoft [a columnist of The New York Times, Feb 5, 2014].

                  The most appropriate report of this other appointment is in John W. Thompson Replaces Bill Gates as Chairman of Microsoft Board [BLACK ENTERPRISE, Feb 6, 2014]:

                  John W. Thompson, CEO of Virtual Instruments and former CEO of Symantec Corp, has been named chairman of Microsoft’s board of directors, according to reports.

                  An industry leader for more than 40 years, he has made phenomenal strides in technology, having served as the only African American leading a major tech company during his time at Symantec. The Florida A&M and MIT alumnus is credited with growing the software giant’s revenues from $632 million to $6.2 billion and leading the growth of its worldwide workforce to more than 17,500 employees.

                  Thompson has served as an independent director on the board of Microsoft and also brings his experience as a former vice-president at IBM to his current post.

                  An early innovator and investor in tech advances in Silicon Valley, Thompson has also been included among Black Enterprise’s “100 Most Powerful African Americans in Corporate America,” and was named “Corporate Executive of the Year” as head of Symantec in 2004.

                  The West Palm Beach, Fla. native was recognized early for his knack for sales and has had a go-getter approach to his advancement. In a recent New York Times article, Thompson shared the following on career and business lessons he’s learned through the years: “First, never take yourself too seriously, or work is boring. Next, people make the difference. You can have great technology, but if it’s not complemented by great people, it won’t go anywhere. Finally, customers buy from people they like. I can always circle back to former customers and suggest they might want to take a look at our products.”

                  Microsoft Chairman John Thompson on CEO Satya Nadella [Microsoft Feb 14, 2014]

                  John Thompson, Chairman of Microsoft Corporation, talks about the appointment of CEO, Satya Nadella.

                  Microsoft Chairman John Thompson addresses Microsoft employees [Microsoft Feb 14, 2014]

                  At a town hall meeting introducing new Microsoft CEO Satya Nadella, Microsoft Chairman John Thompson tells employees that “the best talent we could find for this job was right here inside Microsoft.”

                  See also: John W. Thompson [Wikipedia article]

                  Before the chairman position he was the lead independent director of Microsoft, i.e. the lead director (lead of the group of other independent directors as well) representing the interests of independent shareholders: Director Video Series: John W. Thompson shares insights and experiences as Microsoft Director [originally aired on Microsoft’s Channel 9 on Aug 7, 2013; republished on YouTube EPC Group.net channel on Oct 23, 2013]

                  John W. Thompson recently sat down with Microsoft’s Channel 9 to share insights on his experience as a member of Microsoft’s board of directors, as part of our Director Video Series. During the course of the conversation, Mr. Thompson speaks about his more than 40 years of experience working in the technology industry, his responsibilities as Microsoft’s lead independent director, and Microsoft’s devices and services strategy. He also gives an insider’s view of the company’s board meetings.

                  He and his new role were characterized in the South Florida-raised John Thompson named Microsoft chairman [MiamiHerald.com, Feb 5, 2014] article as:

                  Thompson’s new role, analysts and industry experts said, is a signal that the company hopes to help reinvigorate its sluggish business by injecting some of Silicon Valley’s cutting-edge innovation.

                  “Thompson is a very well and deeply respected guy and his experience, plus his connections with the tech ecosystem in the Valley and elsewhere, will be invaluable for Satya,” said John Connors, Microsoft’s chief financial officer from 1999 to 2005, who now is a managing partner with Ignition Ventures.

                  While a CEO generally handles a corporation’s daily operation, a chairman, which is generally not a full-time job, has considerable responsibility because the board he or she heads oversees the top executives, including the CEO, and has ultimate responsibility for the company’s performance.

                  Industry experts view the leadership change as a determined effort by the Redmond, Wash., company’s board to move more quickly into mobile devices and other growing markets so the company can regain its former stature. Microsoft has been hard-hit by the declining personal-computer market, which depends largely on its software.

                  And for Thompson … being handed the chairmanship represents a crowning achievement in his long and lustrous career.

                  “This is the capstone, the cherry on top,” said technology analyst Charles King. “I don’t know where you go after this. He’s a good man and a very able leader, and he should be exactly what Microsoft needs at this time.”

                  Other analysts said Microsoft’s board is eager to borrow ideas from Silicon Valley and that Thompson is the perfect person to help it do that.

                  “They are going straight to a core Silicon Valley executive to be chairman, and I think part of that speaks to where the technology trend is going in terms of mobile,” said FBR Capital Markets analyst Daniel Ives. Noting that Thompson headed Microsoft’s search for a new CEO, Ives added that it’s likely Thompson will be heavily involved in making sure the company “is ready for this next phase of growth.”

                  Microsoft officials said Thompson was not available for interviews. But in a video on the company’s website, he addressed a stockholder concern that Microsoft hasn’t been performing up to expectations.

                  “As part of my new role,” he said, “one of my key contributions, I hope, will be to engage with shareholders and keep focused on how together we can bring great innovation to the market and drive strong, long-term shareholder value.”

                  Thompson, one of the few African-American CEOs in technology, was raised in West Palm Beach by a teacher and a postal worker. He played clarinet and saxophone in school, earning a band scholarship that sent him to college in Missouri. But he wanted to study business, telling interviewers later that he noticed the most successful adults he knew in the African-American community were small-business owners. He transferred to Florida A&M, graduating in 1971 with a bachelor’s in business.

                  Thompson’s diverse views and competencies are best shown by his recent postings in his “IT in 3D” blog.

                  But he is still an excellent entrepreneur at heart:

                  IT Press Tour ’11 John W. Thompson CEO Virtual Instruments March 31st, 2011 [Philippe Nicolas YouTube channel, May 23, 2011]

                  Interview with John W. Thompson, CEO Virtual Instruments, March 31st, 2011 during the IT Press Tour 2011

                  John Thompson introduces Virtual Instruments [Virtual Instruments YouTube channel, Aug 1, 2011]

                  CEO John W. Thompson introduces Virtual Instruments.

                  And what was achieved under his 3 years of CEO leadership: John W. Thompson [Virtual Instruments YouTube channel, Feb 8, 2013]

                  Introduction to Infrastructure Performance Management (IPM) with John Thompson, CEO

                  Exactly at that time came the news that Forbes Names Virtual Instruments the Third Most Promising Company in America [press release, Feb 6, 2013], recognized for company growth and management team among the Top 100 private companies (when it achieved $41M in revenue with 230 employees).

                  He is excellent technically as well:

                  John W. Thompson, Virtual Instruments – VMworld 2011 – theCUBE  [SiliconANGLE YouTube channel, Sept 12, 2011]

                  John W. Thompson, CEO of Virtual Instruments, met with SiliconANGLE founder John Furrier and Wikibon Co-Founder Dave Vellante at VMworld 2011 to discuss virtualization and what his company does. Thompson opened the interview with the following statement: “What’s clear is that the economics around virtualization are proving to be true.” He went on to say that while VMware was undoubtedly a leader in the x86 environment, the virtualization phenomenon is spreading across all the layers of the stack. The storage layer, the switch layer, and the server layers are all being virtualized. And while the concept of virtualization isn’t new, having it applied to all of those tiers is, in fact, a new trend. It gives customers better economics, but also new challenges to deal with, particularly around performance and availability management.
                  Thompson gave some history as to how Virtual Instruments (VI) evolved and explained what VI’s technology does. He said their focus is on monitoring the end to end transaction performance of a given set of I/O activities from the virtual server through the switch fabric, to the array and back. He said that as environments become more virtualized, there is a more critical need for deep level performance monitoring capability. He said, “Our technology peers in very deeply, and gives you granular latency level performance insight that no other vendor in the industry provides.”
                  Thompson used PayPal as an example of one of VI’s most impressive deployments. PayPal’s head of storage infrastructure said that prior to VI, they would experience one to two outages a year, and since they implemented VI’s technology, they have had no outages at all. Thompson attributed that to ” . . . the insight we give them on where problems might be looming, that they can take corrective actions on before they become an outage.”
                  Thompson emphasized that VI is a monitoring company — they don’t do configuration management or capacity planning. He explained that VI’s tools complement what the customer already has. Thompson said he doesn’t view them trying to expand into those aforementioned spaces either because their focus is already a big enough problem for customers that will provide them with plenty of room for growth.
                  Thompson discussed the announcements VI made at VMworld 2011, which centered on their next-generation software platform, Virtual Wisdom 3.0, as well as their next-generation hardware platform, which consists of an 8Gb Fibre Channel probe that attaches between the switch fabric and the storage array to give customers real-time monitoring capabilities.
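                  To make the idea of granular, latency-level performance insight concrete, here is a minimal Python sketch (purely illustrative; this is not Virtual Instruments' product, API, or method) of percentile-based I/O latency monitoring, where tail latency serves as the kind of early warning Thompson describes:

```python
import random
import time

def measure_io_latency(io_operation):
    """Time a single I/O operation; returns latency in milliseconds."""
    start = time.perf_counter()
    io_operation()
    return (time.perf_counter() - start) * 1000.0

def latency_report(samples_ms, slo_ms=10.0):
    """Summarize a window of per-operation latencies by percentile and
    flag a looming service-level breach before it becomes an outage."""
    samples = sorted(samples_ms)
    p50 = samples[len(samples) // 2]
    p99 = samples[min(len(samples) - 1, int(len(samples) * 0.99))]
    return {
        "p50_ms": p50,
        "p99_ms": p99,
        "max_ms": samples[-1],
        # The tail, not the average, reveals trouble that is still invisible
        # to users but will soon become an outage.
        "slo_breached": p99 > slo_ms,
    }

# Simulated window: mostly fast I/O with a slow tail of 10 outliers.
random.seed(42)
window = [random.uniform(0.5, 3.0) for _ in range(990)] + \
         [random.uniform(15.0, 40.0) for _ in range(10)]
report = latency_report(window, slo_ms=10.0)
```

                  The design point mirrors the PayPal anecdote above: a p99 creeping past its threshold signals where problems might be looming while the median still looks healthy.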

                  BUT THE MOST IMPORTANT THING FOR HIS CURRENT MICROSOFT ROLE IS HIS QUITE WELL PROVEN LEADERSHIP COMPETENCY:
                  Some of Thompson’s top leadership adages from 2003, dating from his time at Symantec:

                  • You Cannot Stop Spending on Innovation
                  • Envisioning the Future
                  • Customers Must Drive Your Business Model
                  • Stick to Core Mission, Focus, and Keep It Simple
                  • Customer Diversity is Essential
                  • You Measure What Matters 

                  John Thompson [Symantec] : You Cannot Stop Spending on Innovation [Stanford Technology Ventures Program, Stanford University, recorded on May 28, 2003; re-published on ecorner YouTube channel, Aug 2, 2011]

                  John Thompson states a company must never stop spending on innovation, even in challenging times. When Symantec’s customers had no way to measure the effectiveness of the security technology they deployed, the company chose, amidst a difficult period, to spend a large percentage of revenue to build a portfolio of tools that would become industry standards. View more clips and share your comments at http://ecorner.stanford.edu/authorMaterialInfo.html?mid=350

                  John Thompson [Symantec] –Envisioning the Future [Stanford Technology Ventures Program, Stanford University, recorded on May 28, 2003; re-published on Entrepreneurship.org YouTube channel, Aug 28, 2013]

                  Thompson talks about how the cheapest form of growth is organic growth. We will be in the market again soon, he adds.

                  John Thompson [Symantec] –Customers Must Drive Your Business Model [Stanford Technology Ventures Program, Stanford University, recorded on May 28, 2003; re-published on Entrepreneurship.org YouTube channel, Aug 28, 2013]

                  Thompson talks about customers being the main driving force behind the business and business model. He gives examples from Symantec about the need to be close to customers. He also focuses on the need to concentrate on the business needs over …

                  John Thompson [Symantec] –Stick to Core Mission, Focus, and Keep It Simple [Stanford Technology Ventures Program, Stanford University, recorded on May 28, 2003; re-published on Entrepreneurship.org YouTube channel, Aug 28, 2013]

                  John Thompson talks about the key driver at Symantec – product focus. To meet this goal, the enterprise was forced to make difficult decisions, including purging products that were not core to their network-centric vision. The result was that …

                  John Thompson [Symantec] –Customer Diversity is Essential [Stanford Technology Ventures Program, Stanford University, recorded on May 28, 2003; re-published on Entrepreneurship.org YouTube channel, Aug 28, 2013]

                  Thompson stresses the need for customer diversity. Software companies that were dependent on Fortune 1000 companies for their business suffered when their niche clients also suffered in the economic downturn. If a company is to survive challenges …

                  John Thompson [Symantec] -Measuring Success: You Measure What Matters [Stanford Technology Ventures Program, Stanford University, recorded on May 28, 2003; re-published on Entrepreneurship.org YouTube channel, Aug 28, 2013]

                  Thompson says that at Symantec, they measure what matters. Every one at Symantec knows what is being measured and managed and how they should behave in the contest. He stresses that in today’s environment, it is not about managing results but …


                  Microsoft Adds New Board Member [press release, Feb. 20, 2012]

                  John W. Thompson, CEO of Virtual Instruments and former CEO of Symantec, to join board.

                  Microsoft Corp. today announced that John W. Thompson, chief executive officer of privately held Virtual Instruments and former chairman and CEO of Symantec Corp., was appointed to the company’s board of directors, returning the board’s size to 10 members.

                  “John has extraordinary technology and business expertise, and we are delighted that he is joining Microsoft’s board of directors,” said Bill Gates, Microsoft chairman.

                  Thompson currently serves as CEO of Virtual Instruments, a privately held company located in San Jose, Calif., whose products are designed to ensure the performance and availability of applications deployed in virtualized and private cloud computing environments. Since 2009, Thompson has been an active investor in early-stage technology companies in Silicon Valley.

                  Thompson served as chairman and CEO of Symantec Corp., helping transform Symantec into a leader in security, storage and systems management solutions. During his 10-year tenure as CEO from 1999 to 2009, Symantec’s revenues grew from $632 million to $6.2 billion, and its worldwide workforce grew to more than 17,500 employees. Thompson stepped down as CEO of Symantec in 2009, and stepped down from Symantec’s board of directors in 2011.

                  Previously, Thompson held a number of leadership positions at IBM, including sales, marketing, software development and general manager of IBM Americas. He was a member of IBM’s Worldwide Management Council.

                  “John brings a wealth of experience, from enterprise customers to individual consumers, as well as the insights that come from running a successful large global software company and a fast-emerging startup. He will be a great addition to our board,” said Steve Ballmer, Microsoft chief executive officer.

                  “I am honored to join the Microsoft board and work with this exceptional team,” Thompson said. “Microsoft has been a leader across the entire information technology landscape for decades, and I look forward to sharing my experiences and contributing to the future direction and growth of this global leader.”

                  Thompson currently serves on the board of United Parcel Service, and he has served on a number of government boards and commissions, including the Financial Crisis Inquiry Commission, the National Infrastructure Advisory Committee, and the Silicon Valley Blue Ribbon Task Force on Aviation Security and Technology. He formerly served on the national board of Teach for America, an organization dedicated to eliminating educational inequities for all children.

                  He received a bachelor of business administration from Florida A&M, and a master’s degree in management from the Sloan Fellows program of the MIT Sloan School of Management.

                  In addition to Thompson, Microsoft’s board of directors consists of Bill Gates, Microsoft chairman; Steve Ballmer, Microsoft CEO; Dina Dublon, former chief financial officer of JPMorgan Chase; Raymond V. Gilmartin, former chairman, president and CEO of Merck & Co. Inc.; Reed Hastings, founder, chairman and CEO of Netflix Inc.; Dr. Maria M. Klawe, president, Harvey Mudd College; David F. Marquardt, general partner at August Capital; Charles H. Noski, vice chairman of Bank of America Corp.; and Dr. Helmut G. W. Panke, former chairman of the board of management at BMW AG.

                  John W. Thompson, Chairman [Microsoft, Feb 4, 2014]

                  John W. Thompson joined the Microsoft Board in February 2012 and became independent chairman of Microsoft Corporation on Feb. 4, 2014.

                  Thompson is the chief executive officer of privately held Virtual Instruments, whose products are designed to ensure the performance and availability of applications deployed in virtualized and private cloud computing environments. Thompson is also the former chairman and CEO of Symantec Corp.

                  Since 2009, Thompson has been an active investor in early-stage technology companies in Silicon Valley.

                  During his 10-year tenure as CEO of Symantec, he helped transform the company into a leader in security, storage and systems management solutions. Thompson stepped down as CEO in 2009, and stepped down from Symantec’s board of directors in 2011.

                  Previously, Thompson held a number of leadership positions at IBM, including sales, marketing, software development and general manager of IBM Americas, and was also a member of IBM’s Worldwide Management Council.

                  He has served on a number of government boards and commissions, including the Financial Crisis Inquiry Commission, the National Infrastructure Advisory Committee, and the Silicon Valley Blue Ribbon Task Force on Aviation Security and Technology. He formerly served on the national board of Teach for America, an organization dedicated to eliminating educational inequities for all children. He has also served on the boards of NIPSCO (Northern Indiana Public Service Company), Fortune Brands, Seagate Technologies, and UPS.

                  Thompson received a bachelor’s degree in Business Administration from Florida A&M, and a master’s degree in Management from the Sloan Fellows program of the MIT Sloan School of Management.

                  Microsoft Board names Satya Nadella as CEO [press release, Feb. 04, 2014]

                  Bill Gates steps up to new role as Technology Advisor; John Thompson assumes role as Chairman of Board of Directors.

                  Microsoft Corp. today announced that its Board of Directors has appointed Satya Nadella as Chief Executive Officer and member of the Board of Directors effective immediately. Nadella previously held the position of Executive Vice President of Microsoft’s Cloud and Enterprise group.

                  “During this time of transformation, there is no better person to lead Microsoft than Satya Nadella,” said Bill Gates, Microsoft’s Founder and Member of the Board of Directors. “Satya is a proven leader with hard-core engineering skills, business vision and the ability to bring people together. His vision for how technology will be used and experienced around the world is exactly what Microsoft needs as the company enters its next chapter of expanded product innovation and growth.”

                  Since joining the company in 1992, Nadella has spearheaded major strategy and technical shifts across the company’s portfolio of products and services, most notably the company’s move to the cloud and the development of one of the largest cloud infrastructures in the world supporting Bing, Xbox, Office and other services. During his tenure overseeing Microsoft’s Server and Tools Business, the division outperformed the market and took share from competitors.

                  “Microsoft is one of those rare companies to have truly revolutionized the world through technology, and I couldn’t be more honored to have been chosen to lead the company,” Nadella said. “The opportunity ahead for Microsoft is vast, but to seize it, we must focus clearly, move faster and continue to transform. A big part of my job is to accelerate our ability to bring innovative products to our customers more quickly.”

                  “Having worked with him for more than 20 years, I know that Satya is the right leader at the right time for Microsoft,” said Steve Ballmer, who announced on Aug. 23, 2013 that he would retire once a successor was named. “I’ve had the distinct privilege of working with the most talented employees and senior leadership team in the industry, and I know their passion and hunger for greatness will only grow stronger under Satya’s leadership.”

                  Microsoft also announced that Bill Gates, previously Chairman of the Board of Directors, will assume a new role on the Board as Founder and Technology Advisor, and will devote more time to the company, supporting Nadella in shaping technology and product direction. John Thompson, lead independent director for the Board of Directors, will assume the role of Chairman of the Board of Directors and remain an independent director on the Board.

                  “Satya is clearly the best person to lead Microsoft, and he has the unanimous support of our Board,” Thompson said. “The Board took the thoughtful approach that our shareholders, customers, partners and employees expected and deserved.”

                  With the addition of Nadella, Microsoft’s Board of Directors consists of Ballmer; Dina Dublon, former Chief Financial Officer of JPMorgan Chase; Gates; Maria M. Klawe, President of Harvey Mudd College; Stephen J. Luczo, Chairman and Chief Executive Officer of Seagate Technology PLC; David F. Marquardt, General Partner at August Capital; Nadella; Charles H. Noski, former Vice Chairman of Bank of America Corp.; Dr. Helmut Panke, former Chairman of the Board of Management at BMW Bayerische Motoren Werke AG; and Thompson, Chief Executive Officer of Virtual Instruments. Seven of the 10 board members are independent of Microsoft, which is consistent with the requirement in the company’s governance guidelines that a substantial majority be independent.

                  Will Satya Nadella, with the old guard disappearing, break up the Microsoft behemoth soon enough, if at all?

                  IMHO that was the most intriguing question raised after Satya Nadella’s appointment as the new CEO of Microsoft, and the only question mark hanging over the decision from the point of view of Microsoft’s future. Beyond that there was just one serious warning (IMHO again): Kedrosky [Bloomberg] to Microsoft: Stop Killing Partners (see embedded below). Otherwise the unanimous opinion was that the new top-level setup of Satya Nadella as CEO and John Thompson as chairman, with Bill Gates now involved at the product level and Steve Ballmer just taking a seat on the board (both counterbalanced by a new director joining the board in March, G. Mason Morfit of ValueAct Capital, representing the activist shareholders advocating break-up/spin-off moves), is not only the best but the only possible arrangement for the huge and complacent software company (note that Nadella no longer refers to Ballmer’s “devices and services” mantra; it’s gone).

                  A companion post is inherent to this one: John W. Thompson, Chairman of the Board of Microsoft: the least recognized person in the radical two-man shakeup of the uppermost leadership [‘Experiencing the Cloud’, Feb 6, 2014]. Without reading it, misunderstandings like Nadella, Gates: Right Team For Microsoft? [InformationWeek, Feb 6, 2014] will remain rampant, despite the reality that everybody should be talking about the Nadella-Thompson combo as the uppermost leadership. Keep in mind also that of the current 10 directors on Microsoft’s board (11 with Morfit), 8 (9 with Morfit) are independent directors, i.e. they represent the vast majority of independent shareholders; only Gates and Ballmer remain as non-independent directors, representing their joint stake of somewhat more than 8% in Microsoft, plus that of existing and still loyal ex-softies (if any). So both of them could quite easily be outvoted, etc.

                  The upcoming new director, Morfit, will specifically (“actively”) advocate reducing the focus on Windows and accelerating efforts to unchain products and services from the operating system. In fact, he has had an agreement for regular meetings with “selected directors and management to discuss a range of significant business issues” since August 30, 2013. Note also that he will join the board in March, which is when the next fiscal year plan is presented to and approved by the board. This falls within the full fiscal-year planning process, which mainly consists of the so-called MYR (MidYear Review), PRISM (PRIority Setting Meeting) and WWSMM (WorldWide Sales & Marketing Memo) phases. Note as well that this date was the very last possible one for the new CEO appointment, as the PRISM, which precedes board approval, should be headed by the CEO of the next fiscal year.

                  Key words and phrases describing the essence of this post:

                  Microsoft complacency, Satya Nadella as CEO, John Thompson as chair, Bill Gates as technical advisor, Microsoft break-up, Microsoft spin-offs, Stop Killing Partners, consumer and business computing, Microsoft enterprise business, Microsoft consumer business, Microsoft as “mobile-first, cloud-first” company, the notion of the modern enterprise, removing obstacles to innovate, Gates as a product person, new activist director on the board of Microsoft, thriving in a mobile and cloud-first world, zero in on unique contributions, set of high-value activities, empowering users and organizations to “do more” as Microsoft’s core value, product innovation not compartmentalized by consumer and business

                  Satya Nadella: His first [pre-recorded by MS] interview as CEO of Microsoft [Microsoft Feb 4, 2014]

                  Satya Nadella’s first interview as CEO of Microsoft, at the company’s campus in Redmond, Wash. Learn more about Satya at http://msft.it/msceo

                  Just a fragment of that interview:

                  [2:26] STEVE CLAYTON: And as you step into this role, what do you see as your primary focus? What are you going to be really focused on?

                  SATYA NADELLA: I would say first thing I want to do and focus on is ruthlessly removing any obstacles that allow us to innovate, every individual in our organization to innovate, and then focus all of that innovation on things that Microsoft can uniquely do. We are the company that enables people to do more, to play, have more fun, to create more. So in some sense we refer to ourselves as the do more company, and I want us to be able to take that focus and innovation forward.

                  And lastly, I want every one of us to find more meaning at work. We spend far too much time at work for it not to have deep meaning.

                  STEVE CLAYTON: You talked about this focus on innovation. Where do you see the opportunities lie for Microsoft?

                  SATYA NADELLA: Going forward it’s a mobile first, cloud first world. In other words, everything is becoming digital and software-driven. And so I think of the opportunities being unbounded. And we need to be able to pick the unique contribution that we want to bring.

                  And that’s where our heritage of being the productivity company to now being the do-more company where we get every individual at every organization to get more out of every moment of their life is what we want to get focused on. [4:06]

                  Watch also:
                  Highly recommended: Susan Hauser [CVP, EPG Group of Microsoft] interviews Microsoft CEO Satya Nadella [Microsoft, Feb 4, 2014; published on the Microsoft YouTube channel, Feb 5, 2014]. [Microsoft, Feb 4, 2014: “Satya Nadella is a strong advocate for customers and partners, and a proven leader with strong technical and engineering expertise. Nadella addressed customers and partners for the first time as CEO during a Customer and Partner Webcast event.”]

                  Microsoft CEO Satya Nadella speaks with Susan Hauser in front of a group of customers and partners about his background and the opportunities he sees for the company’s continued growth and innovation.

                  [Contributor Profile: Susan Hauser, Corporate Vice President,
                  Enterprise and Partner Group, Microsoft]

                  As a teaser Q: [6:43] How do you think about consumer and business, and how do you see them benefiting each other?

                  A: You know, one of the things is that when we think about our product innovation, we don’t necessarily compartmentalize by consumer and business; we think about the user. In many of these cases, what needs to happen is experiences that, for sure, have to have a strong notion of identity and security, so I.T. control, where it’s needed, still matters a lot, and that’s something that, again, we will uniquely bring to market. But it starts with the user. The user obviously is going to have a life at home and a life at work. So how do we bridge that as more and more of what they do is digitally mediated? I want to be able to connect with my friends and family. I also want to be able to participate in the social network at work, and I don’t want the two things to be confused, but I don’t want to pick three different tools for doing the one thing I want to do seamlessly across my work and life. That’s what we are centered on. When we think about what we are doing in communications, what we are doing in productivity or social communications, those are all the places where we really want to bridge the consumer and business market, because that’s how we believe end-users actually work. [8:01]

                  Read also:
                  Microsoft Names Nadella CEO: New Era or New Error? [BARRON’S, Feb 4, 2014]
                  Facebook or Microsoft: Whose Dominance [i.e. Monopoly] Will Last Longer? [The New Yorker, Feb 4, 2014]
                  Microsoft breakup talk starts [IOL, Feb 4, 2014]: “Jettisoning units such as Xbox video-game consoles and the Bing search engine … should go further by also splitting off Windows and smartphones to focus on providing services to business customers … Eighty percent of the value of Microsoft is on the enterprise side and it’s not being valued that way today. The consumer side of the business gets a disproportionate amount of attention. … Shareholders may find an insider advocate in Mason Morfit, the president of activist investing firm ValueAct Holdings LP. Morfit, who’s set to join Microsoft’s board in March, wants the company to reduce its focus on Windows and accelerate efforts to unchain products and services from the operating system”
                  Top 5 items on new Microsoft chief’s to-do list [Associated Press, Feb 5, 2014]: “… some of the most pressing items on Nadella’s to-do list as he reshapes Microsoft into a “mobile-first, cloud-first” company … Integrate Nokia’s mobile device business … Fix Windows and unite the company’s various operating systems … Set a hardware strategy … Pick a management team … Work with the board, including Bill Gates”
                  Microsoft’s new CEO gets handsome pay package [USA TODAY, Feb 4, 2014]: “Overall, he could receive about $18 million as a first-year CEO, more than double what he received as head of Microsoft’s cloud computing operation in 2013.”

                  Highly recommended follow-up: Nadella Speaks as CEO: Bloomberg West (02/04 video) [Bloomberg TV, Feb 4, 2014] (only the first [18:10] of the total [41:55] is relevant)

                  Feb. 4 (Bloomberg) –- Full episode of “Bloomberg West.” Guests include Habit Design’s Michael Kim [former Microsoft], Kilbourne Group’s Doug Burgum [former Microsoft], Department of Health and Human Services’ Kurt Delbene [former Microsoft], FCC’s Jessica Rosenworcel, Kapor Center’s Nicole Sanchez, Code2040’s Laura Weidman Powers, Bloomberg Contributing Editor Paul Kedrosky and Bloomberg’s Megan Hughes. (Source: Bloomberg)

                  Next the details are coming in the following sections:

                  1. ICT industry reports
                  2. Reuters reports
                  3. Bloomberg reports
                  4. Microsoft video text messages for the world and its employees


                  1. ICT industry reports:

                  Microsoft has named Satya Nadella its next CEO. The company’s cloud and enterprise chief, Nadella will replace Steve Ballmer after a five-month search. [Nick Barber IDG News Service]
                  Keith Shaw chats with Network World’s Brandon Butler about Satya Nadella, who was named the new CEO of Microsoft on Tuesday. Butler discusses the challenges and opportunities for Microsoft in the cloud market with Nadella now in charge.
                  Microsoft didn’t have to look far in its six-month search for a new CEO. The company has tapped Satya Nadella to replace Steve Ballmer. Nadella, who has been at Microsoft for 22 years, will become the third CEO in Microsoft’s four-decade-long history. As CNET’s Kara Tsuboi reports, Microsoft’s new chief faces some serious challenges.
                  In this Inside Scoop, CNET’s Kara Tsuboi and Executive Editor Charlie Cooper discuss Microsoft’s new CEO, who replaces the outgoing Steve Ballmer. Hear why Satya Nadella is right for Microsoft, and why Microsoft’s job is right for him.
                  It’s a historic day at Microsoft: Satya Nadella is in as CEO, Steve Ballmer is officially retired and Bill Gates is set to spend more time on the Redmond campus. GeekWire’s Todd Bishop and John Cook analyze the big news and talk about Microsoft’s future. More here: http://www.geekwire.com/2014/analysis-new-ceo-bill-gates-means-microsofts-future/
                  With Satya Nadella officially Microsoft’s new CEO, it’s time to say goodbye to one of the most enthusiastic personalities in technology. Make no mistake, he will be missed. A loving tribute. Video by Billy Disney.
                  • Limit Gates’ Influence
                  • Stop Pretending Windows 8 Works
                  • Become More Hardware-Centric
                  • Fix Xbox One Flaws
                  • Spend More Time in the Clouds
                  • Strongly Question the Nokia Play
                  • Get Away From Bing
                  • Become Smaller, More Agile
                  • Announce Windows 9 Sooner Rather Than Later
                  • Prove It’s a New Microsoft


                  The late 2000s were characterized by Microsoft dropping the consumer ball. The company didn’t notice that the iPhone had made the smartphone a mass-market, consumer device, and the company did not appear to anticipate that the smartphone’s success in the consumer space would in turn lead to success in the enterprise space—even though this kind of cross-pollination is a large part of what made Microsoft the behemoth it is. The same story was repeated with the release of the iPad and the consumer-oriented tablet.

                  There is pressure from Wall Street for Microsoft to abandon the consumer market. Sell off Xbox [regarding that see also Phil Spencer: Microsoft’s New CEO Supports Xbox One [Cinema Blend, Feb 4, 2014]], Nokia, Bing, and retreat to the cozy confines of the enterprise market and become another IBM. I would argue that this is a mistake. Should Microsoft abandon the consumer market, the next generation of school-leavers will be raised on Google Apps, iPads, Chromebooks, and OS X.

                  This won’t merely disrupt Windows on the desktop. It will damage the long-term viability of Office, and beyond that, of Windows on the server as a development platform. This is not to say that these businesses will evaporate entirely, but they’ll be greatly diminished.

                  Importantly, Nadella appears to recognize this. At the company’s 2013 Financial Analyst Meeting, Nadella said, “This notion that this is an enterprise product and this is a consumer product I think is not the way we will approach things. We’ll think about these products as sort of meeting end user needs and enterprise IT needs, and how to balance that.”

                  But recognizing the duality is one thing. Responding appropriately to it is another. Making sure that Microsoft doesn’t make the same mistakes, and that it actually leads the consumer space rather than belatedly following others, will require strong, consumer-focused voices and leadership within the company. It’s not surprising that Microsoft’s new CEO does not cover all these bases. It’s unlikely anyone could, such is the diversity of what Microsoft does.

                  With Nadella’s promotion, it’s not entirely clear where this consumer focus and understanding will come from. Microsoft may be pinning its hopes on a new more active role for Bill Gates. Gates has stepped down as chairman (replaced by John Thompson, formerly of Symantec) but will now spend up to one-third of his time working with product groups and defining the “next round of products.”

                  But whether Gates can provide this guidance isn’t so clear. In broad strokes, Gates’ Microsoft was an early pioneer of both tablets and smartphones (and even smart watches). In each case, however, the company failed to adapt those early visions to accommodate new technology and new consumer preferences. Admittedly, Gates wasn’t involved in the day-to-day running of the company when these oversights were made, but even as chairman, one would think that he could have pointed out that Microsoft was missing a trick—assuming he recognized it.

                  Satya Nadella is a good choice for Microsoft CEO, and while it’s day one on the job, he’s so far saying the right things: recognizing the importance of the consumer space, promoting a “mobile first, cloud first” view where devices are important and where hardware, including Nokia, is part of Microsoft’s future. Wretched cliché as it may be, only time will tell if Nadella has what it takes to move the company forward.

                  Microsoft today announced Satya Nadella as the new CEO of Microsoft. We had reported on Friday last week that Sundar Pichai from Google was another top consideration thanks to his firsthand knowledge of the increasingly influential consumer web, but it was Nadella who won out in the end. Nadella is known for his experience in the enterprise, helping to rework much of Microsoft’s infrastructure for far-reaching products including Bing and Xbox. But is Nadella too safe a choice for Microsoft? Would an outsider like Pichai have been better suited to lead Microsoft into the new consumer web?
                  Microsoft, like its peers and rivals in the industry, is betting big on the modern data center, but with Nadella at the helm, is the Redmond company landing on the wrong side of the cloud? Is this too much of an analog play and not enough of a shake-up play?
                  In conjunction with the announcement, Microsoft founder Bill Gates will step down from his position as Chairman of the Board and be replaced by our own SiliconANGLE theCUBE alum John Thompson. Gates is allocating a third of his time to mentor Nadella. The duo has much to discuss, reshaping Microsoft for the data center of the future. Cloud services remains at the center of an industry-wide revolution, and Nadella’s already shared his thoughts on the subject.
                  Just this past summer we had Nadella on theCUBE at the Accel Partners Symposium, live from Stanford University. Nadella discussed with theCUBE host Jeff Kelly the notion of the modern enterprise: a re-imagination of what infrastructure means and what applications mean inside of the enterprise. According to the new chief, there is a tectonic shift happening in the enterprise, and Microsoft is a part of that shift. From a business perspective, a key to infrastructure is being in touch with the applications.
                  “We’re building a new operating system for the modern enterprise to be able to deploy these modern applications. That is how I conceptualize it,” Nadella stated.
                  The four mega trends constitute the future of a modern enterprise infrastructure. But that can make for an awfully complex public, private and/or service provider cloud. So how do Microsoft and Nadella approach that problem of complexity?
                  Nadella says to start with the design point — public, private and service provider cloud. He believes it’s the true fruition of distributed computing.
                  “So these four things — identity, management, virtualization and application platform — I think is the co-investment you’ve got to make to help enterprises truly adopt the cloud; it’s complex, but you have to tame the complexity,” Nadella explained.
                  So what does that mean for Microsoft? Is Nadella the man to lead them into the consumer web and the Internet of Things? It feels like a bit of a safe bet for Microsoft. How can that be for a Fortune 50 company that just reported a killer quarter? As our Editor-in-Chief John Furrier reported last Friday in his Breaking Analysis segment, it’s about getting the data right.
                  Buried in the news of Nadella being named CEO is the news we mentioned above that John Thompson will be the new Chairman of the Board. In an interesting tidbit, when Thompson appeared on theCUBE in 2011, Furrier asked him about the “middle fat part” developing within the market as it relates to real-time data. Given this consumer-driven market powered by the Internet of Things, Thompson hinted at his own vision for Microsoft, one that rings true nearly three years later as he works closely with Nadella on Microsoft’s board:
                  “Our focus is on the global 2000. They have one thing in common, performance and uptime sensitive. We think this market is about a $1.7-$1.8 billion market. We have literally barely scratched the surface on that. This is a phenomenon that we think will only catch more wind in its sails,” said Thompson.


                  2. Reuters reports:

                  Microsoft Corp named 22-year company veteran Satya Nadella as its next chief executive officer on Tuesday (February 4), and said co-founder Bill Gates would step down as chairman and advise the new CEO on technology, marking an epochal change of control at the company that drove the PC revolution.

                  Nadella, a 46-year-old born in India who led the creation of Microsoft’s Internet-based, or “cloud,” computing services, is only Microsoft’s third CEO in 39 years, taking over from Steve Ballmer, who inherited the job from Gates in 2000.

                  The move ends a five-month search process at the Redmond, Washington-based company, triggered by the August announcement of Ballmer’s decision to retire. The search took longer than many investors had expected.

                  Gates said Nadella’s experience in cloud computing made him the right man to lead Microsoft, as the company struggles to find its feet in the new arena of mobile computing. As he relinquishes the chairman’s title, Gates will stay on the board and assume a new role as technology adviser to Nadella.

                  Shares of the world’s largest software maker were up 0.2 percent at $36.54 on the Nasdaq on Tuesday morning.

                  Nadella, who describes himself as a cricket and poetry lover, called the appointment “humbling” in an email to the company’s employees. In a videotaped statement he said he would focus on “ruthlessly” removing any obstacles to innovation at the company.

                  Microsoft veteran Satya Nadella will have to find a way to re-invent the 40-year-old software giant as its core businesses continue to erode and founder Bill Gates stays on in the newly created role of technology advisor. Bobbi Rebell reports.

                  Transcript:

                  Satya Nadella’s new job is a big one. As only the third CEO in Microsoft’s almost four-decade history, he takes over at a critical time, and he himself makes it clear in this Microsoft video that he won’t put up with anything that gets in his way:

                  SOUNDBITE: SATYA NADELLA, CEO, MICROSOFT (ENGLISH) SAYING: “The first thing I want to do and focus on is ruthlessly remove any obstacles that allow us to innovate.”

                  One obstacle, some say, was Bill Gates in his role as Chairman. He’s stepping down, but says he will now spend one third of his time advising Nadella on technology.

                  A good move all around, says analyst Patrick Moorhead.
                  SOUNDBITE: PATRICK MOORHEAD, PRESIDENT AND PRINCIPAL ANALYST, MOOR INSIGHT AND ANALYSIS (ENGLISH) SAYING:
                  “He’ll have a less powerful role in terms of managing the business of the board but he is narrowing the scope in on something I don’t think anybody can argue with, in that he is going to give insights to products and at its core, Gates is a product person.”

                  That’s good because products are a big problem. For example, the company is still struggling to find its feet as mobile computing evolves.
                  SOUNDBITE: PATRICK MOORHEAD, PRESIDENT AND PRINCIPAL ANALYST, MOOR INSIGHT AND ANALYSIS (ENGLISH) SAYING:
                  “You have Apple, Samsung and Google who are gaining a ton of share and a ton of mindshare for the future and I think Nadella needs to take a hard look in the mirror and really evaluate whether Microsoft can win there.”

                  See more opinion from him in Gates will have a less powerful role – Patrick Moorhead (6:10) video

                  They also missed the boat on social media, says Bruno Del Ama, who runs the Global X Social Media Index Fund:
                  SOUNDBITE: BRUNO DEL AMA, CEO, GLOBAL X FUNDS (ENGLISH) SAYING: “Tremendous amount of growth. Very difficult to do organically and so if they want to do something there in a meaningful way they probably have to acquire.”

                  Microsoft has also been facing a slow erosion of its PC-centric Windows and Office franchises.

                  Investors have been clamoring for a big move, and a breakup makes sense, says Chris Baggini of Turner Investments.
                  SOUNDBITE: CHRIS BAGGINI, SENIOR PORTFOLIO MANAGER, TURNER INVESTMENTS (ENGLISH) SAYING:
                  “I think he has to split the company up. It’s really a mishmash of very mature businesses and some growthier businesses and unfortunately we’ve seen this before where the more mature businesses languish with very low growth rates and the growthier businesses really don’t get valued the way they should. So they have a data center business which is very strong, they can break out their consumer business and separate that out and leave an enterprise business which is a very strong cash flow business.”

                  Shares were slightly higher on the official announcement; the stock had rallied when rumors of Nadella’s promotion leaked last week.

                  Microsoft’s new CEO Satya Nadella stresses the Microsoft software ‘experience’ as he discusses plans for new hardware in the form of devices and services. Rough Cut (no reporter narration)
                  Feb. 4 – Bill Gates is no longer chairman and the company has named a new CEO. Breakingviews columnists Rob Cox (also Global Editor), Robert Cyran and Jeffrey Goldfarb (also Assistant Editor) discuss why these changes are an important signal for the future of Microsoft.

                  At the end of the discussion they ask:
                  Will Satya Nadella finally break up Microsoft?
                  Tweet us your thoughts @Breakingviews

                  The author is a Reuters Breakingviews columnist. The opinions expressed are his own.
                  Microsoft’s founding fathers are finally receding into the middle distance. Satya Nadella’s experience makes him a solid choice to succeed longtime Chief Executive Steven Ballmer. Better still, he will have greater room to maneuver as Bill Gates steps down as chairman. Nadella will need to grapple with his predecessors’ bad decisions, like the Nokia deal, and he’s unlikely to pursue a breakup. But he can focus on what the company does best.

                  While many candidates entered the frame, it was always going to be a difficult post to fill. Microsoft spans everything from its omnipresent operating system to enterprise software to consumer hardware. It’s also threatened by upstarts and a shift in technology away from PCs. Finding a manager that understands technology, all these markets and has skills in revitalizing a mature behemoth was close to impossible.

                  Worse, the decision to buy Nokia’s handsets arm for $7.2 billion in the midst of the search showed that Microsoft’s board was wedded to the sprawl built by Gates and Ballmer. Few credible outsiders wanted to step into a position where they had little say over the company’s direction.

                  In this light, Nadella’s choice is probably as good as the company could make. He has worked for Microsoft since 1992, so he knows the place. His most recent task was to run Microsoft’s cloud and enterprise group. This is one of the fastest growing divisions at Microsoft and represents the company’s future – selling software on demand to companies. He doesn’t have sales experience or much interaction with investors, which is important for a $303 billion market cap company. But Microsoft’s bench has enough depth to make up for these shortfalls.

                  The bigger question is where Nadella will take Microsoft. He didn’t give many hints in his opening memo to employees. The right course would be to focus on enterprise software, which is what Microsoft does best. A breakup or spinoff of the consumer and hardware operations would be welcome. But with Ballmer still on the board and holding some 4 percent of the company, and Gates remaining as the board’s technology adviser to “devote more time to the company,” such radical redrawing will be hard to accomplish any time soon.

                  But the message is unmistakable. The old guard is slipping into the background. That gives Nadella room to slowly turn Microsoft toward a more focused, and potentially valuable, future.


                  3. Bloomberg reports:

                  Feb. 4 (Bloomberg) — Bloomberg News’ Matt Miller reports that Microsoft has promoted Satya Nadella to CEO, while John Thompson assumes Bill Gates’ role as Chairman of the Board on Bloomberg Television’s “In The Loop.”

                  Reporting related to that:
                  Microsoft’s Nadella Named CEO to Transform PC Pioneer [Bloomberg, Feb 5, 2014]: “ “He’s really the complete package — he has incredible intellect but he also combines that with a deep curiosity and willingness to learn,” said Doug Burgum, who sold business-software developer Great Plains to Microsoft in 2001 and oversaw Nadella while at the Redmond, Washington-based company. … Nadella keeps an eye on the moves of nimbler startups and has pushed Microsoft executives to learn from what people outside of Redmond are doing, a person with knowledge of his management approach has said. At a technology conference in Paris in December, he spent time with local startups like video-on-demand company Video Futur Entertainment Group SA.”
                  Microsoft Signals New Era With Thompson as Chairman [Bloomberg, Feb 4, 2014]: “ “Thompson’s going to be a major voice for the company,” James Staten, an analyst at Forrester Research, said in an interview. “They wouldn’t have made him chairman, if he didn’t have strong opinions about how to drive the company forward. And Satya is looking for strong partners on the board.” … Thompson and Nadella will oversee a transition to a new organizational structure and integrate the $7.2 billion acquisition of Nokia Oyj (NOK1V)’s handset unit. The management transition at Microsoft follows the worst decline on record for personal computers in 2013, when shipments dropped 10 percent and are projected to languish through 2017. Thompson knows what it’s like to be at the head of a struggling incumbent. While at Symantec in 2005, he orchestrated the ill-fated $10.2 billion purchase of Veritas Software Corp., in an effort to push into data storage. When Thompson stepped down as CEO four years later, Symantec was contending with slowing growth amid an economic downturn and rising competition. … Thompson likes to tell people he spent “27 years, 9 months and 13 days at IBM” before joining technology security company Symantec as CEO in 1999. He took the company from $600 million to $6 billion in sales over his decade-long tenure, before stepping down in 2009. … A new director set to join the board next month is Mason Morfit, president of activist shareholder ValueAct Holdings LP. He’s eager to see Microsoft emphasize its business software and Internet-based cloud services rather than consumer technology, people familiar with the situation have said.”
                  Microsoft Gets Style Shift With Nadella Replacing Ballmer [Bloomberg, Feb 5, 2014]
                  Microsoft CEO Pick Leaves Losers Grappling With Fallout [Bloomberg, Feb 5, 2014]: “With Microsoft’s board disclosing today that it picked Satya Nadella as CEO, that leaves internal candidates such as Executive Vice President Tony Bates and Chief Operating Officer Kevin Turner among those who failed to get promoted, people with knowledge of the search have said. Stephen Elop, the former CEO of Nokia Oyj who was set to join the software maker after it closes a $7.2 billion deal for Nokia’s handset unit, was also on the shortlist, among others. … The CEO candidates were informed that they didn’t get the role last week … Turner plans to stay at the Redmond, Washington-based company, said a person close to the COO. And while Bates and Elop both have ambitions to be CEO, they are also set to continue at Microsoft for the time being since success in their current jobs may be the best way to attract other offers, said people close to the executives, who asked not to be identified because the information is private.”
                  Who Is Satya Nadella and Why Is He Microsoft’s CEO? (video) [Bloomberg TV, Feb 3, 2014]: “Feb. 4 (Bloomberg) — Kurt Delbene, former president at Microsoft Office, and Bloomberg Contributing Editor Paul Kedrosky discuss Microsoft’s choice of Satya Nadella as its new CEO on Bloomberg Television’s “Bloomberg West.” ”
                  Nadella as CEO Good for Microsoft’s Future: Subotky (video) [Bloomberg TV, Feb 4, 2014]: “Feb. 4 (Bloomberg) — Jason Subotky, a portfolio manager at Yacktman Asset Management Co., talks about Microsoft Corp.’s decision to name Satya Nadella chief executive officer. Nadella will replace Steve Ballmer effective immediately after a five-month search, Microsoft said in a statement today. Subotky speaks with Scarlet Fu, Jon Erlichman, Matt Miller, Paul Kedrosky and Anurag Rana on Bloomberg Television’s “In the Loop.” (Source: Bloomberg)”
                  How Can Nadella, Gates Shape Microsoft’s Future? (video) [Bloomberg TV, Feb 4, 2014]: “Feb. 4 (Bloomberg) — Kurt Delbene, former president at Microsoft Office, and Bloomberg Contributing Editor Paul Kedrosky discuss how the shakeup in Microsoft’s c-suite can shape the future of the company on Bloomberg Television’s “Bloomberg West.” ”

                  Jan. 31 (Bloomberg) — Bloomberg West Editor-At-Large Cory Johnson discusses reports that Microsoft will promote enterprise and cloud chief Satya Nadella to CEO on Bloomberg Television’s “Bloomberg Surveillance.”

                  Reporting related to that: Microsoft Said to Be Preparing to Make Satya Nadella CEO [Bloomberg, Jan 31, 2014]

                  Feb. 3 (Bloomberg) — Bloomberg Businessweek’s Peter Burrows and Crawford Del Prete, executive VP & chief research officer at IDC, discuss the possible future role of Bill Gates at Microsoft on Bloomberg Television’s “Bloomberg West.”

                  Reporting related to that:
                  Microsoft’s New Power Isn’t Gates, Nadella: Walia (video) [Bloomberg TV, Feb 3, 2014]: “Former Microsoft Executive Hardeep Walia discusses the search for a new Microsoft CEO on Bloomberg Television’s “Bloomberg Surveillance.” (Source: Bloomberg)”
                  Microsoft Begins New Era With Nadella, Gates (video) [Bloomberg TV, Feb 3, 2014]: “Feb. 4 (Bloomberg) — Bloomberg senior West Coast correspondent Jon Erlichman breaks down the management changes at Microsoft on Bloomberg Television’s “Market Makers.” ”
                  Bill Gates to Assume Role as Technology Adviser [Bloomberg TV, Feb 4, 2014]: “Feb. 4 (Bloomberg) — Microsoft named Satya Nadella CEO, tapping an insider steeped in business technology to speed turnaround at a software maker that helped usher in the personal-computing age only to be left behind as the world shifted toward the Web and mobile devices. Bloomberg Contributing Editor Paul Kedrosky, Bloomberg’s Matt Miller and Jon Erlichman speak on Bloomberg Television’s “In The Loop.” (Source: Bloomberg)”

                  Feb. 4 (Bloomberg) — Bloomberg Contributing Editor Paul Kedrosky and senior West Coast correspondent Jon Erlichman discuss Microsoft’s track record of working with partners and the relationship between the new CEO and Chairman of the Board on Bloomberg Television’s “In The Loop.”


                  4. Microsoft video and text messages for the world and its employees (in addition to the pre-recorded interviews embedded in the beginning of this post):

                  Microsoft Chairman John Thompson on CEO Satya Nadella [Microsoft Feb 14, 2014]

                  John Thompson, Chairman of Microsoft Corporation, talks about the appointment of CEO, Satya Nadella.

                  Bill Gates welcomes Satya Nadella as Microsoft CEO [Microsoft Feb 14, 2014]

                  Microsoft co-founder Bill Gates welcomes Satya Nadella as the company’s new CEO and discusses his own role at the company.

                  Steve Ballmer welcomes Satya Nadella as Microsoft CEO [Microsoft Feb 14, 2014]

                  Steve Ballmer welcomes Satya Nadella as the new CEO of Microsoft.

                  Microsoft Chairman John Thompson addresses Microsoft employees [Microsoft Feb 14, 2014]

                  At a town hall meeting introducing new Microsoft CEO Satya Nadella, Microsoft Chairman John Thompson tells employees that “the best talent we could find for this job was right here inside Microsoft.”

                  Bill Gates at Microsoft’s “Meet the CEO” event [Microsoft Feb 14, 2014]

                  Microsoft Founder and Technology Advisor Bill Gates joins new Microsoft CEO Satya Nadella, former CEO Steve Ballmer and Chairman John Thompson as Nadella addresses Microsoft customers and partners.

                  Satya Nadella addresses Microsoft employees [Microsoft Feb 14, 2014]

                  New Microsoft CEO Nadella talks to employees for the first time as the company’s CEO. He notes that “this business does not really respect tradition — what it respects is innovation.”

                  Microsoft Board names Satya Nadella as CEO [press release, Feb 4, 2014]

                  Bill Gates steps up to new role as Technology Advisor; John Thompson assumes role as Chairman of Board of Directors.

                  Microsoft Corp. today announced that its Board of Directors has appointed Satya Nadella as Chief Executive Officer and member of the Board of Directors effective immediately. Nadella previously held the position of Executive Vice President of Microsoft’s Cloud and Enterprise group.
                  “During this time of transformation, there is no better person to lead Microsoft than Satya Nadella,” said Bill Gates, Microsoft’s Founder and Member of the Board of Directors. “Satya is a proven leader with hard-core engineering skills, business vision and the ability to bring people together. His vision for how technology will be used and experienced around the world is exactly what Microsoft needs as the company enters its next chapter of expanded product innovation and growth.”
                  Since joining the company in 1992, Nadella has spearheaded major strategy and technical shifts across the company’s portfolio of products and services, most notably the company’s move to the cloud and the development of one of the largest cloud infrastructures in the world supporting Bing, Xbox, Office and other services. During his tenure overseeing Microsoft’s Server and Tools Business, the division outperformed the market and took share from competitors.
                  “Microsoft is one of those rare companies to have truly revolutionized the world through technology, and I couldn’t be more honored to have been chosen to lead the company,” Nadella said. “The opportunity ahead for Microsoft is vast, but to seize it, we must focus clearly, move faster and continue to transform. A big part of my job is to accelerate our ability to bring innovative products to our customers more quickly.”
                  “Having worked with him for more than 20 years, I know that Satya is the right leader at the right time for Microsoft,” said Steve Ballmer, who announced on Aug. 23, 2013 that he would retire once a successor was named. “I’ve had the distinct privilege of working with the most talented employees and senior leadership team in the industry, and I know their passion and hunger for greatness will only grow stronger under Satya’s leadership.”
                  Microsoft also announced that Bill Gates, previously Chairman of the Board of Directors, will assume a new role on the Board as Founder and Technology Advisor, and will devote more time to the company, supporting Nadella in shaping technology and product direction. John Thompson, lead independent director for the Board of Directors, will assume the role of Chairman of the Board of Directors and remain an independent director on the Board.
                  “Satya is clearly the best person to lead Microsoft, and he has the unanimous support of our Board,” Thompson said. “The Board took the thoughtful approach that our shareholders, customers, partners and employees expected and deserved.”
                  With the addition of Nadella, Microsoft’s Board of Directors consists of Ballmer; Dina Dublon, former Chief Financial Officer of JPMorgan Chase; Gates; Maria M. Klawe, President of Harvey Mudd College; Stephen J. Luczo, Chairman and Chief Executive Officer of Seagate Technology PLC; David F. Marquardt, General Partner at August Capital; Nadella; Charles H. Noski, former Vice Chairman of Bank of America Corp.; Dr. Helmut Panke, former Chairman of the Board of Management at BMW Bayerische Motoren Werke AG; and Thompson, Chief Executive Officer of Virtual Instruments. Seven of the 10 board members are independent of Microsoft, which is consistent with the requirement in the company’s governance guidelines that a substantial majority be independent.

                  Satya Nadella email to employees on first day as CEO

                  From: Satya Nadella
                  To: All Employees
                  Date: Feb. 4, 2014
                  Subject: RE: Satya Nadella – Microsoft’s New CEO
                  Today is a very humbling day for me. It reminds me of my very first day at Microsoft, 22 years ago. Like you, I had a choice about where to come to work. I came here because I believed Microsoft was the best company in the world. I saw then how clearly we empower people to do magical things with our creations and ultimately make the world a better place. I knew there was no better company to join if I wanted to make a difference. This is the very same inspiration that continues to drive me today.
                  It is an incredible honor for me to lead and serve this great company of ours. Steve and Bill have taken it from an idea to one of the greatest and most universally admired companies in the world. I’ve been fortunate to work closely with both Bill and Steve in my different roles at Microsoft, and as I step in as CEO, I’ve asked Bill to devote additional time to the company, focused on technology and products. I’m also looking forward to working with John Thompson as our new Chairman of the Board.
                  While we have seen great success, we are hungry to do more. Our industry does not respect tradition — it only respects innovation. This is a critical time for the industry and for Microsoft. Make no mistake, we are headed for greater places — as technology evolves and we evolve with and ahead of it. Our job is to ensure that Microsoft thrives in a mobile and cloud-first world.
                  As we start a new phase of our journey together, I wanted to share some background on myself and what inspires and motivates me.
                  Who am I?
                  I am 46. I’ve been married for 22 years and we have 3 kids. And like anyone else, a lot of what I do and how I think has been shaped by my family and my overall life experiences. Many who know me say I am also defined by my curiosity and thirst for learning. I buy more books than I can finish. I sign up for more online courses than I can complete. I fundamentally believe that if you are not learning new things, you stop doing great and useful things. So family, curiosity and hunger for knowledge all define me.
                  Why am I here?
                  I am here for the same reason I think most people join Microsoft — to change the world through technology that empowers people to do amazing things. I know it can sound hyperbolic — and yet it’s true. We have done it, we’re doing it today, and we are the team that will do it again.
                  I believe over the next decade computing will become even more ubiquitous and intelligence will become ambient. The coevolution of software and new hardware form factors will intermediate and digitize — many of the things we do and experience in business, life and our world. This will be made possible by an ever-growing network of connected devices, incredible computing capacity from the cloud, insights from big data, and intelligence from machine learning.
                  This is a software-powered world.
                  It will better connect us to our friends and families and help us see, express, and share our world in ways never before possible. It will enable businesses to engage customers in more meaningful ways.
                  I am here because we have unparalleled capability to make an impact.
                  Why are we here?
                  In our early history, our mission was about the PC on every desk and home, a goal we have mostly achieved in the developed world. Today we’re focused on a broader range of devices. While the deal is not yet complete, we will welcome to our family Nokia devices and services and the new mobile capabilities they bring us.
                  As we look forward, we must zero in on what Microsoft can uniquely contribute to the world. The opportunity ahead will require us to reimagine a lot of what we have done in the past for a mobile and cloud-first world, and do new things.
                  We are the only ones who can harness the power of software and deliver it through devices and services that truly empower every individual and every organization. We are the only company with history and continued focus in building platforms and ecosystems that create broad opportunity.
                  Qi Lu captured it well in a recent meeting when he said that Microsoft uniquely empowers people to “do more.” This doesn’t mean that we need to do more things, but that the work we do empowers the world to do more of what they care about — get stuff done, have fun, communicate and accomplish great things. This is the core of who we are, and driving this core value in all that we do — be it the cloud or device experiences — is why we are here.
                  What do we do next?
                  To paraphrase a quote from Oscar Wilde — we need to believe in the impossible and remove the improbable.
                  This starts with clarity of purpose and sense of mission that will lead us to imagine the impossible and deliver it. We need to prioritize innovation that is centered on our core value of empowering users and organizations to “do more.” We have picked a set of high-value activities as part of our One Microsoft strategy. And with every service and device launch going forward we need to bring more innovation to bear around these scenarios.
                  Next, every one of us needs to do our best work, lead and help drive cultural change. We sometimes underestimate what we each can do to make things happen and overestimate what others need to do to move us forward. We must change this.
                  Finally, I truly believe that each of us must find meaning in our work. The best work happens when you know that it’s not just work, but something that will improve other people’s lives. This is the opportunity that drives each of us at this company.
                  Many companies aspire to change the world. But very few have all the elements required: talent, resources, and perseverance. Microsoft has proven that it has all three in abundance. And as the new CEO, I can’t ask for a better foundation.
                  Let’s build on this foundation together.
                  Satya

                  Steve Ballmer email to employees on new CEO

                  From: Steve Ballmer
                  To: All Employees
                  Date: Feb. 4, 2014
                  Subject: Satya Nadella – Microsoft’s New CEO

                  Today is an incredibly exciting day as we announce Satya Nadella as the new CEO of Microsoft. Satya will be a great CEO, and I am pumped for the future of Microsoft. You can read the full announcement here.
                  Satya is a proven leader. He’s got strong technical skills and great business insights. He has a remarkable ability to see what’s going on in the market, to sense opportunity, and to really understand how we come together at Microsoft to execute against those opportunities in a collaborative way. I have worked closely with Satya for many years and I have seen these skills many times. He is not alone, though. Our Senior Leadership Team has never been stronger, and together this group will drive us forward.
                  Microsoft is one of the great companies in the world. I love this company. I love the bigness and boldness of what we do. I love the way we partner with other companies to come together to change the world. I love the breadth and the diversity of all of the customers we empower, from students in the classroom to consumers to small businesses to governments to the largest enterprises.
                  Above all, I love the spirit of this place, the passion, and the perseverance, which has been the cornerstone of our culture from the very beginning.
                  Stay focused and keep moving forward. I am excited about what we will do. Satya’s appointment confirms that.
                  Thanks for making Microsoft the most amazing place to work on the planet, and thanks for the chance to lead.
                  Steve

                  Microsoft’s integrated solution for streaming video and Live TV providers on all devices, plus the upcoming live-action and “shared experience” TV of its own on Xbox

                  This is my finding as an update to the one of a year ago in “Microsoft entertainment as an affordable premium offering to be built on the basis of the Xbox console and Xbox LIVE services [Feb 13, 2013] OR create interactive content as a premium offering together with partners using Kinect technology as a starter OR moving Microsoft Xbox 360 to ‘entertainment console’ OR leaving the good quality commodities to others and going for a premium brand with Xbox as well”.

                  One cannot understand the Microsoft solution without first looking at:

                  1. Cable and satellite video market (U.S. only)
                  2. Pay-TV market (cable and satellite, IPTV, terrestrial)
                  3. The overall TV market (home video, on demand video, linear TV)
                  4. IPTV—AT&T U-verse TV and Verizon FiOS video in particular
                  5. OTT (Over-the-top content)

                  Then the Microsoft solution could be presented as follows:

                  6.   Microsoft’s live TV solution on Xbox
                  7.   Preliminary information on the upcoming products from Xbox Entertainment Studios
                  8.   Xbox Music and Xbox Video services for other devices

                  Before all that, however, we should also understand a key trend: the Installed Base of Internet-Connected Video Devices to Exceed Global Population in 2017 [iSuppli press release, Oct 8, 2013], which also shows the immense difficulty facing the Microsoft effort:

                  More than 8 billion Internet-connected video devices will be installed worldwide in 2017, exceeding the population of the planet, according to research from the Broadband Technology Service at IHS Inc. (NYSE: IHS).

                  The installed base of video-enabled devices that are connected to the Internet—a category that includes diverse products such as tablets, smart TVs, games consoles, smartphones, connected set-top boxes, Blu-ray players, and PCs—will expand to 8.2 billion units in 2017. This will represent a nearly 90 percent increase from 4.3 billion in 2013, as presented in the attached figure.

                  With the world’s population amounting to 7.4 billion people in 2017, this means that there will be 1.1 Internet-connected video devices installed for each global citizen.
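A quick sanity check of the headline numbers in the release (the figures are IHS’s, the arithmetic below is mine):

```python
# Figures quoted in the IHS release, in billions
devices_2013 = 4.3      # installed Internet-connected video devices, 2013
devices_2017 = 8.2      # projected installed base, 2017
population_2017 = 7.4   # projected world population, 2017

growth = (devices_2017 - devices_2013) / devices_2013
per_capita = devices_2017 / population_2017

print(f"Growth 2013-2017: {growth:.0%}")              # ~91%, i.e. "nearly 90 percent"
print(f"Devices per person, 2017: {per_capita:.1f}")  # 1.1
```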


                  “On average every human being in the world will possess more than one Internet-connected video device by the year 2017—a major milestone for the electronics market,” said Merrick Kingston, senior analyst, Broadband Technology, at IHS. “In practice, ownership of Internet-connected hardware will be concentrated among users whose homes are equipped with broadband connections. We’re quickly approaching a world where the average broadband household contains 10 connected, video-enabled devices. This means that each TV set installed in a broadband-equipped home will be surrounded by three Internet-connected devices.”
                  Asia-Pacific gets connected
                  The number of connected devices in the mature North American and Western European regions will grow at a relatively modest compound annual growth rate (CAGR) of 10 percent from 2013 to 2017.
                  In contrast, Asia-Pacific will expand at 20 percent during the same period. Driven largely by Chinese demand, Asia-Pacific will add 1.9 billion connected devices to the global installed base between 2013 and 2017.
                  On the other end of the regional spectrum, sub-Saharan Africa will contribute 145 million net additions to the total installed base during the next four years.
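The gap between a 10 percent and a 20 percent CAGR is easy to underestimate: over the four-year 2013-2017 window the two rates compound into very different multipliers. A quick sketch of the compounding (my arithmetic, not an IHS calculation):

```python
def cagr_multiplier(cagr: float, years: int) -> float:
    """Total growth multiplier implied by a compound annual growth rate."""
    return (1 + cagr) ** years

# Four-year window 2013-2017, rates from the IHS commentary
print(f"Mature regions (10% CAGR): x{cagr_multiplier(0.10, 4):.2f}")  # x1.46
print(f"Asia-Pacific   (20% CAGR): x{cagr_multiplier(0.20, 4):.2f}")  # x2.07
```

In other words, Asia-Pacific’s installed base roughly doubles over the period while the mature regions grow by about half.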

                  Challenges and opportunities

                  In order to cash in on this massive growth in Internet-connected devices, media companies across the operator, broadcast, consumer electronics manufacturing and over-the-top (OTT) businesses have embraced Internet protocol (IP) video distribution. Big names making a foray into IP video include HBO, Microsoft, DirecTV and Netflix.

                  However, all of these companies face a major challenge: how to wrap consumers into their ecosystems, given the proliferation of platforms, high switching costs and strong incentives for consumers to stay with their existing services.

                  Back in 2005, PCs comprised 93 percent of all connected devices. By the end of 2017, the base of connected devices will diversify dramatically, with PCs comprising only 23 percent of the connected installed base. Other devices will account for the rest of the market, including smart TVs at 5 percent, consoles at 2 percent, and smartphones and tablets collectively representing 67 percent.
                  “Addressing the full breadth of the device landscape, and recuperating the development cost of doing so, will pose a major challenge for a number of media firms,” Kingston added.

                  1. Cable and satellite video market (U.S. only)

                  Let’s start with a list of cable and satellite video providers in the U.S.:

                  The chart is from Would a DirecTV-DISH Merger Still Make a New Pay-TV Media Monopoly? [24/7 Wall St., Oct 10, 2013]. Note that Newco is the DirecTV-DISH merged company just imagined by the article. The actual Top 5 companies represented 75.4% of the U.S. cable and satellite video subscribers: 35.6% satellite (Newco) and 39.8% cable. Relative to that, Verizon FiOS video IPTV had 4.7M subscribers and AT&T U-verse [IP]TV 4.5M by the end of Q4’12 (see below).

                  See also (in order to understand the challenges cable operators are facing everywhere):
                  TWC rebuffs Charter’s latest takeover bid [[IHS] Screen Digest commentary, Jan 14, 2014] with “The saga to create the nation’s second largest cable operation is moving into a new phase … With so many sharks circling TWC, IHS believes that it will be a matter of not if but when TWC accepts a bid.”
                  Time Warner Cable prepares for its business future [[IHS] Screen Digest commentary, Oct 8, 2013] with “TWC and other cable operators are in the unenviable position of seeing their primary product, pay TV video, declining. Coupled with encroachment from IPTV, and potential upstart OTT technologies, cable operators are pushing to grow other business lines. … Staying ahead of the technological curve is a problem for all pay TV operators, and cable more than IPTV, with Satellite experiencing the worst of it.”
                  Netflix added to Virgin’s TiVo platform [[IHS] Screen Digest commentary, Sept 10, 2013] with “UK cable company Virgin Media has signed what is effectively an OTT carriage deal with Netflix to bring the streaming service onto the Virgin TiVo platform. Groundbreaking move is the first deal of its type and indicates a change in the positioning of Netflix and the competitive positioning of OTT against ‘traditional’ pay TV. … that more firmly positions Netflix as a content aggregator (read: channel) rather than a platform and opens the door for similar deals internationally. Move vindicates our long-held view that this was the correct way to position Netflix and other OTT content aggregators.”

                  Cable takes the fight to OTT [[IHS] Screen Digest commentary, Oct 28, 2013]

                  After years of subscriber losses, Comcast announced on October 25, 2013, the first widespread test of a cable network lite bundle, combining high-speed data (HSD) with basic broadcast video and a premium channel. The trial is slated to run a minimum of one year; the operator plans stepped increases from the starter $49.99 per month price at 6 months ($60-$70) and again at one year ($70+).
                  This is not the first such offering, however. In August of 2013 Time Warner Cable (TWC) initiated a similar promotion targeted at transitioning college students back toward video products, including HBO and HSD. TWC partnered with nine colleges in this limited trial; again, the term is likely to run for a year or less. TWC is charging $79.99 per month for one year, but did not list a non-promotional price.
                  In the following analysis IHS makes two assumptions: 1) That cord-cutting and cord-never households will likely buy HSD from pay TV providers, and that it will skew toward higher speed tiers. 2) The price for bundled 25-30Mbit is ~$55 and unbundled ~$60.
                  Our take
                  The fact that the business of pay TV is changing is no longer in doubt, but the business has insulated itself well and is preparing to weather the storm. Comcast and TWC are not the first to experiment with new offerings: Cox recently concluded its flareWatch trial, the first pay TV OTT trial. The difference between the Cox effort and those of Comcast and TWC is that the two latter companies have price efficiencies working on their side.
                  That’s not to say that IHS believes the Cox trial was ended because of price; more likely Cox received valuable customer feedback and experience. The Comcast and TWC deals are predicated on completely different foundations. Both offerings provide significant perceived value, and combined monetary value, to subscribers.
                  Both deals compare to an HSD and Netflix and/or Hulu+ plan. The Netflix/Hulu+ plan will likely cost $68 to $76 depending upon HSD tier and number of OTT subscriptions, compared with Comcast’s year-one monthly average of $60 and TWC’s $70. Another significant point of difference is the depth of offering.
                  Both pay TV providers share four common features: 25-30Mbit HSD, local broadcast channels, HBO and HBO GO. Comcast also includes StreamPix (library-title subscription VOD). IHS believes that both Comcast and TWC are at a minimum matching Netflix on a like-for-like price/content offering when considering HSD and HBO versus HSD and Netflix/Hulu+. The addition of local broadcast channels, as well as SVOD in the case of Comcast, signals that cable is not going to give up the fight.
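The $68-to-$76 range follows from the IHS assumptions stated above (unbundled 25-30Mbit HSD at ~$60) plus roughly $8 per month per OTT subscription; the per-service price is my assumption, based on Netflix’s advertised $8/month plan:

```python
hsd_unbundled = 60   # IHS assumption: unbundled 25-30Mbit HSD, ~$60/month
ott_price = 8        # assumed price per OTT service (Netflix's $8/month plan)

one_ott = hsd_unbundled + ott_price       # HSD plus Netflix OR Hulu+
two_ott = hsd_unbundled + 2 * ott_price   # HSD plus Netflix AND Hulu+
print(f"${one_ott} to ${two_ott}")        # $68 to $76, matching the IHS range
```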

                  2. Pay-TV market (cable and satellite, IPTV, terrestrial)

                  Then, according to Worldwide pay-TV Subscribers to Exceed 1.1 Billion in 2019 with Increasing IPTV Market Share [ABI Research, Jan 22, 2014]

                  The worldwide pay-TV market reached 903.3 million subscribers in 2013, generating $249.8 billion in service revenue. IPTV operators enjoyed significant growth (18.5% YoY) in 2013 to 92 million subscribers with a total of $37.2 billion in service revenue.
                  “Increasing FTTH [Fiber To The Home] subscriber base and bundled subscriber base of telcos are boosting the IPTV market. ABI Research forecasts that the IPTV subscriber base will grow to 161 million subscribers in 2019 accounting for 15% of overall pay-TV market,” comments Jake Saunders, VP and practice director of core forecasting.
                  The global terrestrial TV market reached 9.5 million subscribers at the end of 2013. A declining pay DTT subscriber base in Italy and Spain had an impact on the overall Western European DTT market which dropped around 5% in 2013. Unlike Western Europe, the DTT market in Africa grew a remarkable 45% to 2.1 million subscribers in 2013. “As African countries start to switch over to digital, digital terrestrial TV has become an affordable alternative to satellite TV service in the region. ABI Research forecasts that Africa will have over 4.8 million DTT subscribers in 2019,” adds Khin Sandi Lynn, industry analyst.
                  DirecTV maintains its largest market share in terms of pay-TV service revenue. The company had around 20.2 million subscribers in the US with an ARPU above $102 by the end of 3Q-2013. Globally, the pay-TV market is expected to grow to 1.1 billion subscribers with $320.3 billion in service revenue in 2019.

                  3. The overall TV market (home video, on demand video, linear TV)

                  Or a broader view representing all other segments of the TV market as well:
                  Global TV market revenue to grow at a steady pace: up 23% by 2018 [DigiWorld by IDATE blog, Jan 30, 2014] by Florence Le Borgne, Head of the TV & Digital Content Practice, IDATE.

                  At a time when video has become pervasive across all of our screens, most national TV markets are losing steam: shrinking viewership and pressure on advertising markets, especially in Europe. Although pay-TV seems to be holding its own, the fast-growing popularity of OTT offerings is shaking up the traditional pay-TV model, while the demise of physical media is virtually a foregone conclusion.
                  If the decline of physical media now seems inevitable, television still has a chance to reinvent itself in a way that takes into account changes in viewer behaviour and competition from new online vendors.
                  Accessing TV
                  According to IDATE, the number of TV households worldwide will reach 1.675 billion in 2018 (+9.6% in 5 years), with the number of digital TV households worldwide reaching 1.542 billion in 2018, which translates into 92% of TV households.
                  • Cable will remain the chief access channel (592.3 million households in 2018) but will gradually lose ground to satellite and IPTV, which will account for 32.9% and 10.9% of TV households, respectively, at the end of 2018.
                  • Despite the development of hybrid TV solutions, terrestrial TV should continue its decline on the first TV set and drop down to number three spot by 2018, with roughly 21% share of the global market.
                  • The development of hybrid solutions that combine live programming on broadcast networks (terrestrial and DTH) and OTT video services over the open Web is a key variable in the future development of the various TV access modes, and may well shake up current trends.
                  TV: top money-maker
                  Breakdown of audiovisual market revenue in 2013


                  TV revenue
                  According to IDATE, the global TV industry’s revenue comes to €374.8 billion in 2013 and will reach €459.2 billion in 2018.
                  • Pay-TV revenue will grow by 21.3% between 2013 and 2018, or by an average 3.9% annually, to reach €220.2 billion in 2018.
                  • Ad revenue will enjoy even stronger growth of 27.3% between 2013 and 2018, to reach €201.1 billion in 2018.
                  • Public financing/licensing fees will continue to increase significantly (+7.7% in 5 years) to reach nearly €38 billion in 2018.
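The quoted 21.3% five-year growth and the 3.9% annual average are the same number seen two ways; compounding the five-year total back into an annual rate (my arithmetic, not IDATE’s):

```python
total_growth = 0.213   # pay-TV revenue growth, 2013-2018 (IDATE)
years = 5

# Annual rate that compounds to the five-year total
cagr = (1 + total_growth) ** (1 / years) - 1
print(f"Implied annual rate: {cagr:.1%}")  # 3.9%, matching the quoted average
```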
                  Video revenue
                  According to IDATE, physical media sales will total €16.3 billion in 2018, while video on demand (VoD) revenue will reach €35.4 billion in 2018, 90% more than in 2013.
                  • This means that the global physical-media market will have shrunk by more than a quarter from 2013 (-27.2%).
                  • Blu-ray will be the most common format and will help temper plummeting physical media sales.
                  • OTT video will continue to be the biggest earner, generating 51% of total revenue.
                  • VoD will still be the dominant model on managed networks. It will generate €6.9 billion in 2018 versus €2.3 billion for subscription video on demand (S-VoD).
                  American OTT video providers’ footprint in Europe as of 31 December 2013
                  Source: IDATE, December 2013
                  American OTT vendors already have a solid foothold in Europe
                  • Netflix is already present in seven European countries: Britain, Ireland, the Netherlands, Denmark, Norway, Finland and Sweden. The service had 1.6 million subscribers in the UK and Ireland at the end of 2013.
                  • LoveFilm was reporting 1.9 million subscribers in the UK and Germany at the end of 2013.
                  • At the end of 2013, iTunes’ VoD rental service was available in close to 110 countries, and permanent downloads in 14 countries, chiefly in North America and Europe.
                  More information on TV and new video services market report & database

                  UK Video Rental Market Plunges in 2013 as Half of Country’s Blockbuster Stores Close [IHS iSuppli press release, April 23, 2013]

                  The market for Blu-ray (BD) and DVD rental in the United Kingdom is expected to plunge by 22 percent in 2013, as half the country’s Blockbuster video stores shut down in a restructuring initiated by the company’s new management.
                  The U.K. market for physical-video rental will drop to £202 million in 2013, down £57 million, or 22 percent, from £259 million in 2012, according to a newly updated forecast from IHS (NYSE: IHS). While the market is generally on the decline, 2013 will bring the sharpest predicted annual decrease for the 11-year period from 2007 through 2017.


                  By the end of 2013, only 264 Blockbuster stores will be open in the country, down 50 percent from 530 in 2012. Blockbuster is the largest video rental chain in the country.
                  “The year 2013 is set to become a watershed for the U.K. video rental market as a result of the wholesale closure of Blockbuster UK stores,” said Tony Gunnarsson, senior video analyst at IHS. “The massive downturn in the store-based video rental market represents a significant loss to the video market and will result in a major decline and radical transformation of the U.K. video market overall. From 2013 on, the U.K. physical-video rental business increasingly will be dominated by online rent-by-mail subscription services.”
                  Both DVD and BD transactions are due to decrease across the store-based sector this year. DVD rentals will fall by a steep 53.2 percent to 15.4 million, while BD rentals are set to drop by an even larger 61.3 percent to 2.8 million.
                  Blockbuster gets busted up
                  After filing for administration in January 2013, Blockbuster’s administrators Deloitte announced two separate rounds of store closures, including some 224 sites. In February 2013, supermarket chain Morrisons purchased 49 of these former Blockbuster stores in its drive to increase its store presence in southeast England.
                  Out of the remaining Blockbuster stores, Gordon Brothers acquired a total of 264 locations, including a number of Blockbuster outlets earmarked for closure that will now remain open.
                  Pay-TV killed the video store
                  In 2012, rental stores were responsible for 41.3 percent of the video rental market based on consumer spending. In the latest forecast for 2013, however, the store-based sector is now projected to generate just 24.7 percent of the overall market. This tilts the market toward the online sector, which will see its share of market increase massively from 58.7 percent in 2012 to 75.3 percent this year.
                  At the same time, the lost rental business won’t result in customers who used to rent at Blockbuster automatically signing up to become rent-by-mail customers with online providers, IHS believes. Rather, those customers are more likely to turn to a host of other video platforms, primarily pay-TV services.
                  Video rental market winds down
                  In the longer view, the U.K. rental market will return to a normal trend of decline after 2013, with spending on renting physical video shrinking at an annual rate of under 5 percent until 2017, when the retreat in spending is expected to steepen slightly to 7 percent.

                  4. IPTV—AT&T U-verse TV and Verizon FiOS video in particular

                  As far as the U.S. is concerned, AT&T U-verse TV and Verizon FiOS video are the leading IPTV services by far*, having 5.5 million and 5.3 million subscribers respectively, which is 11.7% of the 92 million worldwide subscribers reported above by ABI Research:

                  * The next service provider, CenturyLink, “Ended the quarter with 149,000 CenturyLink® PrismTM TV subscribers, an increase of approximately 17,000 subscribers in third quarter 2013” according to its Q3 2013 report [Nov 6, 2013]. CenturyLink only entered five U.S. markets after acquiring Embarq (2009) and Qwest (2010). In fact no other U.S. providers are in the Top 20 globally according to SNL Kagan Reports World’s 20 Largest IPTV Operators Served 83% of Global IPTV Households at End-2012 [June 6, 2013]. More:
                  – China’s leading telcos – China Telecom and China Unicom – serve an estimated 30% of the global IPTV subscriber base.
                  – Asian telcos accounted for 49.2% of the top 20’s IPTV subscribers in 2012, reflecting the region’s large market size and limited telco competition.
                  – France — the second-largest IPTV market by subscribers after China — is home to four operators ranked among the global top 20. [Note that among the Top 5 are Iliad and France Telecom. Iliad’s Freebox TV offering proposes a broad selection of TV channels (over 450, of which more than 200 are included in the basic package), as well as numerous audiovisual services, such as catch-up TV (with 45 channels available on Freebox Replay), and a wide video-on-demand offering. It was actually the largest IPTV deployment in the world with 2.4 million IPTV-enabled customers as of end 2007 (see here).]
                  – Nine operators out of 20 are located in Western Europe and seven in Asia.
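The 11.7% figure quoted above is simply the two U.S. services taken against ABI Research’s worldwide base (figures as quoted in this post; the division is mine):

```python
uverse_tv = 5.5      # AT&T U-verse TV subscribers, millions (end of 2013)
fios_video = 5.3     # Verizon FiOS video subscribers, millions (end of 2013)
global_iptv = 92.0   # worldwide IPTV subscribers, millions (ABI Research, 2013)

share = (uverse_tv + fios_video) / global_iptv
print(f"U.S. leaders' share of global IPTV: {share:.1%}")  # 11.7%
```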

                  U-verse® Drives Wireline Consumer Growth and Broadband Gains

                  • Wireline consumer revenue growth of 2.9 percent versus the year-earlier period
                  • Total U-verse revenues, including business, up 27.9 percent year over year, now a $13 billion annualized revenue stream
                  • 10.7 million total U-verse subscribers (TV and high speed Internet) in service:
                    • 630,000 high speed Internet subscriber net adds; record annual net adds of 2.7 million
                    • 194,000 U-verse TV subscribers added, lowest churn in product history
                  • Continued U-verse broadband gains in the business customer segment, up 78,000, nearly doubling year-ago net adds
                  • Strategic business services growth accelerates with revenues up 17.4 percent year over year, now more than 25 percent of wireline business revenues

                  Record-Low U-verse TV Churn. Total U-verse subscribers (TV and high speed Internet) reached 10.7 million in the fourth quarter. U-verse TV had the lowest-ever churn in its history. U-verse TV added 194,000 subscribers in the fourth quarter with an increase of 924,000 for the full year to reach 5.5 million in service. AT&T has more pay TV subscribers than any other telecommunications company.

                  U-verse TV penetration of customer locations continues to grow and was at 21 percent at the end of the fourth quarter.


                  Note that after AT&T Extends TV Watching to More Devices with Launch of U-verse TV on Xbox 360 [press release, Oct 11, 2010], and even after New U-verse Internet Customers Can Take Their Pick: A Free Xbox 360, SONOS PLAY:3, Kindle Fire or Nexus 7 Tablet [press release, March 18, 2013], the Xbox tie-up ended with AT&T U-verse TV To Drop Support For Xbox 360 on December 31 [Multichannel, Nov 26, 2013]:

                  “We’ve made this decision due to low customer demand,” an AT&T spokeswoman said via email on Tuesday. AT&T declined to say how many customers currently use the Xbox 360 as a set-top. … AT&T, the spokeswoman added, currently has no plans to support U-verse TV on the Xbox One. Verizon Communications FiOS TV is the first, and so far only, U.S. pay-TV provider to offer an authenticated app for the Xbox One during its initial launch phase.

                  In FiOS video we added 92,000 new subscribers in the quarter. Total FiOS video customers reached 5.3 million, representing 35% penetration.

                  As far as the OnCue acquisition [from Intel, i.e. the Intel Media operation], look, the focus here is really to accelerate the availability of the next-generation IP video service which we will integrate into the FiOS video service. And really what we are trying to do is differentiate this even more so with fiber to the home versus others with the TV offerings and reducing the deployment costs. And this really accelerates us from if we were trying to build IP TV versus buying the IP TV technology.

                  From an FiOS customer perspective, we expect the benefits that they will have more elegant search and discovery activity and cost stream ease of use. But also keep in mind, with the acquisition of Verizon Wireless and becoming 100% ownership of that we also plan to take that platform and integrate it more deeply with our Verizon Wireless 4G LTE network. So that really was the strategy behind this.

                  As far as would we enable this platform to take us over the top, obviously we have our video digital media services that we have been working on for 2.5 years. We’ve just made two acquisitions related to that platform. So, look, we are positioning ourselves strategically to be in a position to competitively compete around the whole mobile first world and video, so I think that is where we are.


                  Pay-TV Operators Can Stave Off OTT Threat with Multiscreen and CDN Investments [iSuppli press release, April 17, 2013]

                  Despite the dire competitive threat posed by over-the-top (OTT) services, pay-TV operators can thrive by investing in additional service offerings that should include multiscreen services to more than make up for the erosion in their customer base, according to the IHS Screen Digest TV Intelligence Service from information and analytics provider IHS (NYSE: IHS).

                  Speaking here today at the IHS PEVE Entertainment 2013 Conference, Guy Bisson, research director for television at IHS, noted that although European cable operators have lost 1.4 million households, they have gained 17.8 million more revenue-generating units (RGUs) during the five-year period from 2007 through 2012.

                  While cable operators in Europe and other regions are expected to lose more households in the coming years, RGUs will continue to increase, driving revenue growth for the industry. The below figure presents the IHS history and forecast of cable households and RGUs for the 27 countries of the European Union.

                  5. OTT (Over-the-top content)

                  OTT and IPTV Integration Increasingly Popular [Pyramid Points, Nov 27, 2012]

                  How do you plan to spend your evening most times when you order a pizza? You’re very likely to watch a video.
                  In the UK, Domino’s Pizza Group saw the value of over-the-top (OTT) online video to boost customer loyalty, and back in October launched the Domino’s Pizza Box Office video streaming offer. Customers order a pizza and get a download code to stream a movie at home. This is just another example of how OTT is revolutionizing the way video content is delivered to consumers: Today almost anyone can become a content provider.
                  Exhibit: Evolving video delivery environment and video platforms
                  Source: Pyramid Research

                  Many operators see the proliferation of OTT as a threat to their established IPTV business models. They fear that OTT will subvert their role in the pay-TV value chain and cannibalize revenue. We’ve found, however, that the opposite is just as likely to be true. In our new report, “OTT Growth Sparks Innovation Multiscreen Video Business Models,” we argue that OTT is serving as an innovation stimulus for the pay-TV market, pushing telcos to enhance their IPTV services with more screens. We also find that an increasing number of operators, alongside their managed IPTV services, are directly entering into non-managed OTT environments. This means that more operators are using the open Internet to offer video services to potentially any consumer with a broadband connection, whether or not they are existing customers.

                  OTT in emerging markets: Challenges and opportunities
                  Operators are warming up to the idea of launching their own OTT services, especially in emerging markets. While IPTV remains a premium service, which requires subscribers to purchase more expensive bundles, OTT is more flexible and only requires a good broadband connection. This means that in the more price-sensitive markets, where there is still strong demand for online video, OTT is becoming an attractive option for users. Besides, OTT services are typically delivered over a wide range of screens and at different price points, including smartphones, tablets and gaming consoles, making them more accessible to different consumer profiles.
                  In Colombia, for example, ETB has announced that it will shortly launch an OTT service to complement its upcoming IPTV deployment. In Mexico, the OTT service provided by fiber-to-the-home (FTTH) operator Totalplay, dubbed Totalmovie, has rapidly become the main competitor to Netflix. It offers video content in Mexico alongside the operators’ IPTV platform and across Latin America by using third-party operator infrastructure. As of October, it had 1.9m registered users and 5m unique monthly visitors.
                  We expect to see more Latin American operators launching OTT services. The second largest regional group, Telefonica, is considering positioning OTT commercial offers in several countries. The decision between managed (IPTV) or unmanaged video delivery (OTT) ultimately depends on each country’s infrastructure, competitive environment and operator position. Telefonica has, however, confirmed that there are already ongoing OTT initiatives outside Spain.
                  In Turkey, TTNET, the ISP of fixed-line incumbent Turk Telekom, has already been quite successful in combining its IPTV and OTT offerings. TTNET wants to add value to the bundles, which in turn helps increase customer loyalty and reduce churn. This is crucial in preventing the decline of Turk Telekom’s fixed-line base. While IPTV is positioned as a premium service, OTT is priced very competitively. As of August this year, TTNET had over 1.2m OTT and 150,000 IPTV subscriptions.
                  OTT can provide significant benefits to operators. In the case of TTNET, positioning OTT alongside IPTV is encouraging consumers to exceed their broadband allowances, thus creating the need to migrate to higher-value packages. In the case of Totalplay in Mexico, OTT is contributing to the monetization of the operator’s superfast fiber-based network. For both operators, using third-party infrastructure breaks the link between content delivery and network management.

                  The outlook is positive

                  In the near future, we expect to see significant revenue-generating opportunities associated with VoD, catch-up TV, and targeted advertising, especially when telcos can integrate their OTT and IPTV offerings with interactive and social media functions.

                  Using the open Internet for content delivery, however, has its downsides. The main shortcoming with OTT is that the operator is not in control of quality of service (QoS). Especially in emerging markets, quality of service and network speeds vary wildly from country to country, making it challenging to ensure the same quality of experience (QoE) that can be guaranteed through a managed IPTV network. Another challenge for operators is securing in-demand content for OTT platforms. Without doubt content is king, but content is also costly. Unless they are backed by multimedia and broadcasting groups, operators tend to be the weak link in the content production and delivery value chain. But that is a challenge with IPTV too.

                  All in all, if telcos are serious about developing a pay-TV offering that can resonate with the demand for multiple viewing platforms at different price levels, they need to seriously consider the opportunity of complementing IPTV platforms with OTT.

                  — Daniele Tricarico, Analyst

                  More information from Pyramid Research:
                  Is the Arab World Ready for OTT Video? [Sept 13, 2013]
                  CDNs Offer New OTT Revenue Hope [Feb 20, 2013]
                  Chinese Regulator Opens Up to MVNOs [mobile virtual network operators] [March 15, 2013]

                  Finally here is a list of Top 10 Online Streaming Video Services [tom’sGUIDE, Jan 1, 2014] in the U.S. in order to understand the state-of-the-art of OTT video services:

                  Digital video options

                  Streaming video has just about displaced the DVD on the list of home entertainment options, and it may supersede cable and broadcast TV in the near future. Every modern computer has access to streaming video services, as do most game consoles and mobile devices, and even a growing proportion of televisions. Whether you’re looking to get your feet wet or expand your streaming horizons, check out 10 of the best services for watching movies, TV, music videos, Web shows and more.

                  image

                  Netflix

                  Netflix is the most popular video streaming service out there, and with good reason. The service is available on just about every platform, including computers, game consoles, set-top boxes and mobile devices, and it hosts movies and TV shows to accommodate every taste. From hit films like “The Avengers” to every “Star Trek” TV series to original programming like “Orange Is the New Black,” Netflix’s variety of content is unparalleled. You can even share an account among five different users to keep recommendations and viewing habits separate. Netflix costs $8 per month.

                  Inserts of mine:
                  Netflix added to Virgin’s TiVo platform [[IHS] Screen Digest commentary, Sept 10, 2013] with “UK cable company Virgin Media has signed what is effectively an OTT carriage deal with Netflix to bring the streaming service onto the Virgin TiVo platform. Groundbreaking move is the first deal of its type and indicates a change in the positioning of Netflix and the competitive positioning of OTT against ‘traditional’ pay TV. … that more firmly positions Netflix as a content aggregator (read: channel) rather than a platform and opens the door for similar deals internationally. Move vindicates our long-held view that this was the correct way to position Netflix and other OTT content aggregators.”
                  Netflix passes 38m paying ‘streaming’ subscribers [[IHS] Screen Digest commentary, Oct 22, 2013] with:

                  Netflix’s total number of paid streaming subscribers increased by 2.4m over the quarter, to reach 29.9m subscribers in the United States and 8.1m subscribers internationally. The international streaming service saw a larger than expected increase in free trialists, to 1.1m, driven by Latin America and the September launch of the service in the Netherlands.
                  The third quarter of 2013 is a significant milestone for Netflix, as the quarter in which the Netflix US streaming subscriber count pulled even with the US subscriber count of pay TV giant HBO. The company ended the quarter just shy of 30 million streaming subscribers with estimates for HBO at roughly the same level.
                  The comparison with HBO is the most appropriate for companies such as Netflix, Amazon and Hulu’s subscription service, rather than with the pay TV operators. Netflix, as well as Hulu Plus and Amazon, are acting as premium channels in investing in acquired and original content and following in HBO’s early-1990s footsteps. Despite the investment Netflix has made in its own original programming, the company has reported that a greater percentage of overall viewing on the platform is of previous-season TV episodes and catalogue movies. Netflix indicates that it plans to double its investment in original content in 2014, although this will still represent less than 10 per cent of global content expenditure.
                  Netflix’s international business remained a loss-making venture as the company struggles to gain profitability without scale and without a legacy high-margin physical business. Whereas in the US the company initially bundled its streaming proposition with disc rentals to add value to the physical subscription, Netflix has not had a preexisting business from which to launch a digital subscription internationally. At present, the international ventures are subsidized by domestic market returns, and with ongoing market expansions planned by Netflix, IHS does not expect this to change in the mid-term.

                  End of my inserts for Netflix

                  Hulu Plus

                  If you want to catch TV shows almost as soon as they air, Hulu Plus may be right for you. This streaming service hosts a plethora of TV shows and movies. Whether you want to watch “Leverage,” “Family Guy” or “Spongebob Squarepants,” Hulu generally posts new episodes within days of their airing on TV. Hulu Plus costs $8 per month (with some shows available only on computers for free), and provides past seasons of shows along with Hulu’s original programs. It is available on computers, game consoles, streaming boxes and mobiles.

                  Amazon Prime Instant Video

                  Amazon Prime Instant Video is a streaming service that comes with an Amazon Prime subscription. In addition to offering free shipping on Amazon orders and free Kindle books to borrow, Amazon Prime allows subscribers to access approximately 40,000 movies and TV shows. In addition to unlimited streaming offerings, users can rent and buy other TV shows and movies a la carte. This makes Amazon Instant Video a good choice for watching newer movies before they touch down on unlimited streaming services like Netflix. Amazon Prime costs $79 per year.

                  See also: Amazon may hike Prime cost as earnings disappoint and further challenges lay ahead of the company for which it needs to adjust its business model and expand its operations [‘Experiencing the Cloud’, Jan 31, 2014]

                  M-Go

                  If you’re not interested in paying a monthly fee for your streaming video content, M-Go might be up your alley. M-Go, which is the default streaming service on Roku boxes and also available on computers and mobile devices, allows you to rent and buy TV shows and movies. Prices range from $2 for individual TV episodes to $20 for HD movie purchases. M-Go excels in offering both HD and SD versions of content, making it an attractive choice if you want a one-off rental.

                  Blip

                  Watching big-budget movies and TV is all well and good, but for curated, original Web shows from charismatic creators, nothing fits the bill like Blip. Think of Blip as a more curated, creator-friendly version of YouTube. Individuals create and upload original series, ranging from comedy to reviews to funny pet videos, and Blip ensures that the content has professional production values and that new entries are added regularly. All content on Blip is free, and you can access it via your computer, mobile device or Xbox 360.

                  image

                  Vevo

                  MTV hasn’t played music videos since the ’90s, but the medium is not dead just yet. Vevo hosts the latest music videos from artists ranging from Katy Perry to Old Crow Medicine Show, but audiophiles would be wise to stick around for its scads of original content. Users can access biographies, retrospectives, behind-the-scenes footage and interviews about their favorite musicians, and curated playlists for both individual artists and entire genres. Vevo is free, and available on computers, mobile devices, Rokus, Apple TVs and Xbox 360s.

                  MLB.TV

                  If you’re a baseball fan, you’re in luck: Major League Baseball‘s streaming service is one of the best in professional sports. MLB.TV allows viewers to watch most games during the regular MLB season. (Postseason games are available through the Postseason.TV service at additional cost.) Fans can watch both home and away games from anywhere in the world. Stat junkies can examine each pitch as it happens and compare their fantasy teams in real time. MLB.TV costs $130 per year and is available on computers, mobile devices, set-top boxes, Xbox 360s and PS3s.

                  Crackle

                  If you crave pop cinema, Crackle could be the best thing to happen to your TV since afternoon basic cable. The Crackle service offers a rotating selection of a few hundred movies and TV shows, including “Ghostbusters,” “The Cable Guy” and “The Shield.” Crackle also creates and hosts original content, ranging from espionage thriller serials to “Comedians in Cars Getting Coffee” starring Jerry Seinfeld. Crackle is free (though you’ll have to watch some commercials) and available on computers, mobile devices, set-top boxes and game consoles.

                  Twitch

                  Watching other people play video games is, surprisingly, almost as much fun as playing yourself — sometimes more so, if you have a good host. Twitch is a platform for gamers to livestream their play sessions. You can find streams of everything from “League of Legends” to “Minecraft.” Whether you want to see tutorials, speed runs or popular Web personalities’ reviews, Twitch has you covered. The service is free, both to watch and to stream your own sessions. Twitch is available on computers, mobile devices, set-top boxes and PS4s.

                  YouTube

                  The biggest video streaming service online is just about unbeatable when it comes to variety of content. YouTube is the go-to site to upload short videos: cats, clips from your favorite TV programs, cats, original Web shows, cats, movie trailers, cats and more. The service will be one of the first to support content for the higher-resolution 4K TVs. If you’re looking to watch short-form videos, this is the place to start. YouTube is free and available on just about every device with a screen and an Internet connection.

                  Discovery to take majority control of Eurosport [[IHS] Screen Digest commentary, Jan 22, 2014]

                  Discovery Communications has agreed to take a controlling interest in Eurosport International, the pan-European sports channel, from its partner TF1 Group [of France]. … Discovery, which primarily operates a portfolio of factual channels in Europe, has branched out in recent years with the acquisitions of SBS Nordic in Scandinavia and Italy’s Switchover Media. It now has the option to acquire 100% of Eurosport International and could also increase its interest in Eurosport France, though TF1 expects to retain its 80% interest until at least 1 January 2015.
                  The US group’s move to take control of Eurosport is, as the company noted yesterday, taking place a year sooner than originally planned. While sports is clearly a new playing field for Discovery, the male-skewing profile of Eurosport is a good fit with its factual channel brands, offering possibilities for combining advertising and network sales. To date, co-operation has focused on markets where Eurosport is not present, notably the US and China. In the US, Discovery has been showcasing Eurosport rally and superbike programming on its Velocity channel.
                  A further move into the US appears unlikely given the presence of ESPN and powerful rivals like Fox Sports, NBC Sports and CBS. Fox Sports in particular has recently made strong moves into the international market place. Outside the US, Eurosport successfully occupies a niche where it is not competing with premium pay operators like BSkyB, Canal Plus and Sky Italia for high cost events like league football, but instead focuses on lower profile events where rights are often shared with local free-to-air broadcasters.
                  The main uncertainty over Eurosport’s change of ownership surrounds its content supply from the European Broadcasting Union (EBU), which provides hundreds of hours of events like cycling, grand slam tennis, winter sports and athletics. TF1 is an EBU member, but with Discovery holding the reins, this arrangement will almost certainly have to be renegotiated, with possible implications for Eurosport’s cost base. Even now, there appears to be room for improvement: Discovery’s operating margin for its international operations was 44% in 2012, compared to a slender 14% for Eurosport International.

                  Sky sees future in OTT as upsell becomes focus [[IHS] Screen Digest commentary, July 26, 2013]

                  Sky [more precisely BSkyB] added more Now TV customers in the quarter to end June 2013 than new satellite customers and is increasingly pushing OTT access and connected devices as the core of its future growth strategy. In calendar second quarter (Sky’s fiscal Q4), the pay TV provider added 34,000 new TV customers to reach 10.442m and said the ‘bulk’ of TV growth came from OTT service Now TV. Organic growth for broadband stood at 119,000 (25 per cent more than BT added in the same quarter) with a further 400,000 added through the acquisition of O2’s UK broadband operations to reach 4.9m. Telephony grew 140,000 organically with 153,000 coming from O2 to reach 4.5m. The number of HD subscribers grew 117,000 to reach 4.789m, or 46 per cent of the TV base, with 2.7m HD boxes connected to broadband. Annualised ARPU hit £577, up £29 in the year.
                  Sky said that its future strategy would focus on becoming the centre of the connected home across a range of content windows that would increasingly include the DVD window for paid on-demand and movie retail as well as the traditional subscription window. The move comes on the back of impressive figures for on-demand and OTT content subscription, with a threefold increase in Sky Store (on-demand) revenue and 166,000 customers paying £5 a month for the Sky Go Extra service that allows content download to mobile devices. Sky said that, on average, Sky homes have seven connected devices and that content access inside and outside the home was increasingly important to its offer. The operator said it had concluded four new studio deals with a wider range of rights to service this market and would prioritise getting its customer base connected. The new Sky HD box now comes with built-in Wi-Fi and a new low-cost wireless connector is being made available. Sky also released a new Now TV box priced at £9.99 when a Now TV subscription is taken; the device also enables Smart TV functionality and is targeted at the 13m Freeview [a free-to-air digital terrestrial television service in the UK, a joint venture between the BBC, ITV, Channel 4, BSkyB and transmitter operator Arqiva] homes who don’t currently subscribe to Sky services.
                  Our take
                  The latest move is interesting in that it represents a significant vote of confidence in both the incremental revenue that can be derived from OTT services and the potential to tap an entirely new customer base in the form of ‘dip-in’ Now TV users. While this goes hand in hand with an increased investment in original content and channels as well as sports rights to support the core service, it is clear that Sky sees the most upside in incremental revenue driven by OTT rather than strong additional growth in traditional satellite pay TV customers. With broadband and telephony being an increasingly important area of revenue growth, the connected device/OTT space becomes the next area for up-sell, meaning that the so-called ‘multi-product strategy’ becomes central. While none of the services require a tie-in to Sky’s own broadband, it is this very area that BT has chosen to attack with its bold move into sport. The free access to BT’s suite of new sports channels with a BT broadband 12-month contract means that not only could there be a subset of Sky TV customers who will migrate to BT broadband, but a further segment of existing BT customers who will not be available to Sky for triple-play up-sell. To date there has been no evidence that BT’s strategy is paying off (net TV additions for BT were roughly flat in the quarter to end June 2013). But BT says it believes this will change when the channels launch.
                  Sky’s strategy, then, is to fall back on its traditional strengths centred on content, but to do this in a way that embraces new forms of distribution and leverages the power of its existing customer relationships. Headroom for growth remains strong: despite triple-play penetration reaching 35 per cent among the Sky customer base, two-thirds of Sky customers have yet to take a broadband offer. With Sky out-performing the market in broadband net adds, this area is likely to ensure continued strong revenue and ARPU growth. But two areas of risk remain. If the majority of TV growth comes from Now TV, Sky will have to deal with an increasingly large segment of TV customers who are not tied into a contract and who are relatively low value in terms of ARPU. As this segment scales, clearly this could lead to large and seasonal fluctuations in churn and ARPU. The second area of risk is related: cannibalisation. While this is a risk that Sky is well aware of and keeping close tabs on, the renewed connected home push risks accelerating the transition to a more transient customer base.

                  UK TV viewing is about connection, says Ofcom report [[IHS] Screen Digest commentary, Aug 1, 2013]

                  The home entertainment experience is becoming increasingly connected, with multi-tasking central to the enjoyment of TV content, according to the latest Communications Market Report from UK regulator Ofcom. According to the findings, there has been a huge increase in the devices that people take to the living room. On average, each UK household owns three different types of Internet-enabled device. The biggest growth over the last year in take-up of services and devices has been in the number of tablets and smartphones. Thanks to the device mix, 22 per cent of people in the living room watch screens other than the main TV most of the time.
                  The main TV set remains important. Ninety-one per cent of the UK population tunes into the main TV set weekly, up from 88 per cent in 2002, with viewers on average watching four hours a day in 2012 compared to 3.6 hours in 2006. Although the report finds that people gather around the TV in the living room, there has been a decline in the number of children with TVs in their bedroom; 52 per cent of UK children now have a TV set in their bedrooms, which represents a 17 per cent decline over the last six years, mainly related to the increased number of tablets and Internet-connected devices.
                  Average media household spend has increased in the last year to £113.51 a month after many years of decline. The biggest increase has been on mobile services (£46.73 a month) and fixed internet (£11.91 a month) and the biggest decrease has been on fixed voice (down £22.48 versus one year ago to £21.61 in 2012). TV spend has been stable over the last five years at between £28-£29 a month.
                  Our take

                  Ofcom figures reflect what IHS Screen Digest has long noted: live linear TV is not dead.

                  According to Ofcom, time-shifted viewing represents just 10 per cent of the total and hasn’t changed much over the last years despite the huge increase in DVR ownership. According to BARB figures, DVR ownership has grown from 18 per cent in 2007 to 55 per cent in 2010 and 67 per cent in 2012. Despite this, growth in time-shifted viewing has been only moderate, up from six per cent four years ago.
                  More than half of UK adults are regular media multi-taskers: they ‘stack’ or ‘mesh’ while watching TV weekly, with tablet owners more likely to multi-task than average. Almost one in four UK adults made direct communication with friends and family about the programs as they watched (media meshing), and half of UK adults conduct other activities while they are watching TV on a weekly basis (media stacking).
                  The increase in tablet owners has also changed consumer viewing behaviour with VOD requests coming from tablets increasing from three per cent in 2011 to 12 per cent in 2012. More than 56 per cent of tablet owners used them to watch TV and 57 per cent of those watched linear TV on the tablet.

                  Ofcom also found that, when it comes to the much-hyped area of social TV, it is news, reality shows and sport events that are engaging viewers through social media, but the knock-on effect is that consumers want to watch these shows live in order to engage socially, providing another boost for linear TV.

                  TV Everywhere Spreads Among US Television and Cable Networks; NBCUniversal Leads [IHS iSuppli press release, Oct 18, 2013]

                  NBCUniversal leads the US TV Everywhere (TVE) effort in providing access to TV content on second screens like smartphones and tablets, while EPIX and HBO share the distinction of supporting TVE on more second-screen devices than any other premium or basic cable network, according to a new report from the TV Intelligence Service at IHS Inc. (NYSE: IHS).

                  From Wikipedia: TV Everywhere (also sometimes known as authenticated streaming)[1] refers to a model wherein television providers and broadcasters, particularly cable channels, allow their subscribers to access their respective content on digital platforms, including video on demand and live streaming of the channels themselves. TV Everywhere systems utilize user accounts provided by the television provider—which are used to verify whether the user is a subscriber to a particular channel, thus allowing or denying access to the content. The U.S. provider Time Warner Cable first introduced the concept in 2009; in 2010, many television providers and broadcasters began to roll out TV Everywhere services for their subscribers, including major networks such as ESPN, HBO, NBC (particularly for its Olympics coverage).
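                  The authentication model described above can be sketched in a few lines of code. This is only an illustration of the concept, not a real operator API; the account names, channel names, and the `authenticate` function are all hypothetical:

```python
# Minimal sketch of the TV Everywhere "authenticated streaming" model:
# the provider's user accounts record which channels each pay-TV package
# includes, and the service allows or denies streaming accordingly.
# All names here are hypothetical illustrations.
subscriptions = {
    "alice": {"HBO", "ESPN"},  # channels included in each account's package
    "bob": {"ESPN"},
}

def authenticate(username: str, channel: str) -> bool:
    """Allow streaming only if the pay-TV account includes the channel."""
    return channel in subscriptions.get(username, set())

print(authenticate("alice", "HBO"))  # → True  (HBO is in Alice's package)
print(authenticate("bob", "HBO"))    # → False (Bob's package lacks HBO)
```

In practice the check happens over the network against the operator’s subscriber database, but the allow/deny decision is the same in principle.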
                  NBCUniversal provides TVE in 15 of its 18 channels, or 83 percent of the studio’s total stable of cable and broadcast networks, to pay-TV subscribers willing to authenticate on second-screen devices. Meanwhile, EPIX and HBO have been at the forefront of the TVE experience, with both very willing to embrace new technologies and offering significant amounts of content on their apps and portals.
                  EPIX first kicked off the TVE phenomenon in October 2009; it was formed by partners Paramount, MGM and Lionsgate after their failed renewal with Showtime. HBO followed suit in February 2010 with the launch of its web portal and now has a vast TVE library online, even though it does not yet offer live linear streaming.

                  image

                  HBO, Cinemax and BTN2Go are the only three networks to have TV Everywhere authentication agreements with all major US pay TV operators.
                  For its part, Showtime is the only premium network offering live linear streaming through TV Everywhere. The company is also allowing for authentication outside of the home, a feature likely to expand to other basic and premium cable networks as TVE continues to evolve.
                  The last premium channel group to join the party is Liberty’s STARZ. STARZ and Encore launched TV Everywhere services in October of 2012, but are still missing authentication deals with both Comcast and DISH Network.
                  One entity so far remains the lone hold-out among the major channel groups not providing any TV Everywhere content—Discovery Communications. But that will change, as Discovery is expected to finally jump into the fray in the near- to mid-term time frame. It will likely become critical to offer similar services, IHS believes, as TVE access becomes more central to the future of US pay TV video.
                  Solving the cord-cutting problem before it gathers steam
                  All major pay-TV operators to date have implemented some form of the TVE service, although sometimes in very limited form, via either live linear streaming or video on demand (VoD). But while the streaming of live linear network feeds is largely relegated to in-home use, VoD is a significant out-of-home TVE product.
                  VoD streaming channels, at 73 including cable, premium and broadcast, far outnumber the channels offering live streaming, at 31, as shown in the attached figure. NBCUniversal, the TVE leader, has 15 VoD channels and five live streaming channels, followed by Time Warner with nine VoD channels and three live streaming channels.
                  “TV Everywhere has been developed as a collective strategy by both pay-TV operators and TV content owners to enhance the traditional linear TV proposition, so that secondary screens like tablets and smartphones can be used to view TV content in addition to the primary screen,” said Erik Brannon, analyst for U.S. cable networks at IHS. “And in spite of the differences in strategy, all TVE products have one thing in common: They allow for current pay-TV video subscribers to authenticate and consume on secondary screens a significant amount of content that they purchase as part of their normal pay-TV video subscriptions.”
                  TVE is one approach that pay-TV operators and network owners are using to stem the tide of cord-cutting among cable subscribers before the number of defections becomes significant. In many cases, cable subscribers are finding themselves increasingly tempted to end their subscriptions—either because of high costs or because of other alternatives now available, such as over-the-top (OTT) services like Netflix. In the second quarter of 2013 alone, IHS estimates that the pay-TV business shed 352,000 subscribers, mostly due to seasonality, but some elements of cord-cutting are likely to have been present as well. To be sure, the combined price of $28 (Netflix, Hulu Plus and Aereo) may be more appealing to consumers than the $80+ average revenue per user that IHS estimates pay-TV video customers will pony up for service in 2013.
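                  The cost comparison above can be sketched as a quick calculation. The Netflix and Hulu Plus prices ($8 each) are quoted earlier in this post; the $12 Aereo figure is an assumption inferred from the $28 bundle total, and the $80 figure stands in for the "$80+" IHS ARPU estimate:

```python
# Hedged sketch: monthly cost of an OTT bundle vs. estimated pay-TV ARPU.
# Netflix and Hulu Plus prices come from this post; the $12 Aereo figure
# is an assumption inferred from the quoted $28 bundle total.
ott_bundle = {"Netflix": 8, "Hulu Plus": 8, "Aereo": 12}  # USD per month
pay_tv_arpu = 80  # stands in for the "$80+" IHS estimate for 2013

bundle_total = sum(ott_bundle.values())
monthly_saving = pay_tv_arpu - bundle_total

print(f"OTT bundle: ${bundle_total}/month")            # → OTT bundle: $28/month
print(f"Saving vs pay TV: at least ${monthly_saving}/month")
```

At roughly $52 a month (or more, given the "+" in the ARPU estimate), the gap illustrates why operators see cord-cutting as a real pricing threat.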
                  Through TVE, both pay-TV operators and network owners hope to add new functionality and interactivity to the television viewing experience. And by partnering with pay-TV operators, content owners like the broadcast networks hope they can continue to solidify their hold on the distribution of premium television content.
                  Device compatibility extending beyond iOS and Android
                  TV Everywhere is also evolving beyond Apple iOS and Android, the two platforms on which TVE apps first appeared. Now, TVE apps from networks are becoming available and are being deployed across a wide range of connected devices, including smart TVs; video game consoles like Microsoft’s Xbox; Amazon’s Kindle Fire; Blu-ray players; and digital media products such as Roku and Apple TV.
                  Adoption of TVE initiatives by the major channels is a reaction to the changing landscape of TV viewers in the country, Brannon noted. And as it continues to grow in awareness and popularity, TV Everywhere will remain a central focus for pay-TV operators.

                  6. Microsoft’s live TV solution on Xbox

                  From Worldwide launch of Xbox One sparks global celebration for a new generation in games and entertainment [Microsoft press release, Nov 21, 2013]

                  Xbox One’s innovative architecture means you no longer have to choose between your games and entertainment. Get multiplayer alerts while you watch TV, and keep watching TV while you play. Snap your NFL fantasy football stats next to the game. Jump instantly from a game to TV, movies, fitness, music, sports, the Internet and Skype video chat with the sound of your voice. With Xbox One, you never have to stop playing to talk to a friend, surf the Web or watch live TV. You also have access to a new generation of TV experiences, and starting in the U.S. and coming to many markets soon, OneGuide will allow you to access your favorite shows, channels, apps or games with the Bing natural language voice search.

                  Xbox One Live TV, Xbox Fitness with Yusuf Mehdi [scarlettgarden YouTube channel, Oct 27, 2013]

                  Here’s an Xbox Wire interview with Yusuf Mehdi regarding live TV on the Xbox, with instant switching and Xbox experiences tailored to the gamer profile.

                  From Xbox One: Your Top Questions Answered [May, 2013]

                  Our goal is to enable live TV through Xbox One in every way that it is delivered throughout the world, whether that’s television service providers, over the air or over the Internet, or HDMI-in via a set top box (as is the case with many providers in the US). The delivery of TV is complex and we are working through the many technologies and policies around the world to make live TV available where Xbox One is available.

                  The TWC Case:

                  This deal, which will bring more live channels than any other experience on Xbox 360, will offer Time Warner Cable [TWC] subscribers with an Xbox Live Gold membership the ability to watch their favorite shows right from their Xbox 360 — including favorites like AMC, BBC World News, Bravo, Cartoon Network, CNN, Comedy Central, Food Network, HGTV and more. And unlike any other platform, you’ll be able to control your entertainment using your voice via Kinect for Xbox 360.
                  I’m excited to announce, alongside our good friends at Time Warner Cable, that the TWC TV app has launched on Xbox 360 today, delivering up to 300 of the most popular TV channels to Xbox Live Gold members in the U.S. who are also TWC subscribers.
                  At Xbox, our vision has always been to provide all the entertainment people want in one place, putting the best in TV, movies, music and sports right next to your favorite games. Like Xbox 360, Xbox One will be the best place in your house for gaming, apps and TV and we can’t wait to show you more on that soon. Today, we’re thrilled to expand our growing entertainment app portfolio of more than 130 voice-controlled apps on Xbox 360 with the addition of TWC TV. TWC customers, thank you for your support and welcome to the Xbox family.
                  When we launched the TWC TV app on Xbox 360 in August, we promised we were hard at work with our partners at Time Warner Cable to bring you Video On Demand and just in time for the holidays, we’re delivering. Starting today, Xbox Live Gold members in the U.S. who are also Time Warner Cable subscribers can now get On Demand content right on their Xbox 360 in addition to the nearly 300 channels of live TV that the TWC TV on Xbox 360 app already offers. And don’t forget that with Kinect for Xbox 360, the app lets you control your favorite shows using voice and gestures so you can kick back, remote-free.
                  The update also includes an exclusive “share” feature that allows you to send messages to other Xbox Live members that are TWC TV customers while channel surfing. With the TWC TV app, you’ll have access to more than 5,000 On Demand choices and in-app messaging. Look for the update today or download the app now!

                  With an inexperienced person: Hands-on video: The Xbox One as a media device [gigaom YouTube channel, Nov 19, 2013]

                  The Xbox One promises to combine state-of-the-art video gaming with live TV and streaming apps. Check out our hands-on video for a closer look at the device’s entertainment offerings.

                  First Wave of TV & Entertainment Apps Coming to Xbox One Unveiled [Xbox Wire from Microsoft, Nov 8, 2013]

                  Offering more entertainment options has always been important to Xbox fans. For years, we’ve been working with leading entertainment brands and TV providers to offer our customers a wide variety of live and on-demand entertainment. Today we unveiled the complete Xbox One experience – showcasing how Xbox One delivers the best games and multiplayer features, along with your favorite movies, music, sports and live TV experiences – all in one place.

                  “We set out to make Xbox One the all-in-one games and entertainment hub for your home. The one system that offers the best games next to the best entertainment experiences and apps,” said Marc Whitten, Xbox Chief Product Officer. “Along with offering a stellar app portfolio from around the world, Xbox One takes the next step by offering them in a way that is seamless and easy to use.”

                  In addition to delivering live TV in every market where Xbox One will be sold, we are also bringing premium, voice and gesture controlled TV and entertainment apps specifically designed for your living room. These apps have been built from the ground up uniquely for Xbox One and are designed to harness the power of the all-in-one platform.  For example, Xbox One is empowering partners to bring media achievements and exclusive Snap experiences, as well as many other things to entertainment apps, offering everybody the opportunity to achieve badges or rewards for the media they consume in addition to gaming.

                  image

                  Something truly unique we’re doing with Xbox One is bringing together your favorite TV channels and entertainment app channels into one screen. Xbox One is also the only games and entertainment system that enables HDMI pass through. You can create your own personal Favorites in OneGuide – the Xbox One electronic program guide – so you can quickly and easily choose what you want to watch, whether it’s a TV channel like CBS, NBC or ESPN, or something inside an app like Xbox Video, Hulu Plus or the NFL on Xbox One. You can even add your personal photos and videos from the SkyDrive app to your OneGuide Favorites.

                  Additionally, a Bing search for TV, movies, games, or music scans across all apps to find exactly what you’re looking for, instead of having to hunt through each app individually. For the first time, you don’t have to juggle multiple screens across cable TV, video streaming services and other entertainment apps to quickly find the entertainment you’re looking for.

                  Today, we’re announcing the very first wave of some of the world’s biggest names in entertainment rolling out on Xbox One in the 13 launch markets between Nov. 22 at launch and spring 2014:

                  Australia

                  • Crackle
                  • Machinima
                  • MUZU TV
                  • Network Ten’s tenplay
                  • Quickflix
                  • SBS On Demand
                  • TED
                  • Twitch

                  Austria

                  • Eurosport
                  • Machinima
                  • MUZU TV
                  • TED
                  • Twitch

                  Brazil

                  • Crackle
                  • Machinima
                  • Muu
                  • Netflix
                  • Saraiva Player
                  • Sky Online
                  • SporTV
                  • TED
                  • Telecine
                  • Twitch
                  • Vivo Play

                  Canada

                  • CinemaNow
                  • Crackle
                  • Machinima
                  • MUZU TV
                  • Netflix
                  • Rogers Anyplace TV
                  • Sportsnet
                  • TED
                  • The NFL on Xbox One
                  • Twitch

                  France

                  • 6Play
                  • Canal+/CanalSat
                  • France 2,3,4,5
                  • La TV d’Orange
                  • Machinima
                  • MUZU TV
                  • MyTF1
                  • MYTF1VOD
                  • SFR TV
                  • TED
                  • Twitch

                  Germany

                  • Amazon\LOVEFiLM
                  • Eurosport
                  • Machinima
                  • MUZU TV
                  • TED
                  • Twitch
                  • Watchever
                  • Zattoo

                  Ireland

                  • Eurosport
                  • Machinima
                  • MUZU TV
                  • Netflix
                  • TED
                  • Twitch

                  Italy

                  • Eurosport
                  • Machinima
                  • MUZU TV
                  • Premium Play
                  • TED
                  • Twitch

                  Mexico

                  • Clarovideo
                  • Crackle
                  • Machinima
                  • Netflix
                  • TED
                  • Televisa
                  • The NFL on Xbox One
                  • TV Azteca
                  • Twitch
                  • Veo

                  New Zealand

                  • Machinima
                  • MUZU TV
                  • Quickflix
                  • TED
                  • Twitch

                  Spain

                  • Eurosport
                  • Gol Televisión
                  • Machinima
                  • MUZU TV
                  • RTVE
                  • TED
                  • Twitch
                  • Wuaki.tv
                  • Zattoo

                  United Kingdom

                  • 4oD
                  • Amazon\LOVEFiLM
                  • blinkbox
                  • Crackle
                  • Demand 5
                  • Eurosport
                  • Machinima
                  • MUZU TV
                  • Netflix
                  • NOW TV
                  • TED
                  • Twitch
                  • Wuaki.tv

                  United States

                  • Amazon Instant Video
                  • Crackle
                  • The CW
                  • ESPN
                  • FOX NOW
                  • FXNOW
                  • HBO GO (coming soon)
                  • Hulu Plus
                  • Machinima
                  • MUZU TV
                  • Netflix
                  • Redbox Instant by Verizon
                  • Target Ticket
                  • TED
                  • The NFL on Xbox One
                  • Twitch
                  • Univision Deportes
                  • Verizon FiOS TV
                  • VUDU

                  The list above* is just the first wave of third-party apps that are coming to Xbox One over the course of the next few months.  We will continue to announce more apps coming to the platform and both the Xbox One and Xbox 360 entertainment app portfolios will continue to grow weekly.
                  *Xbox LIVE Gold membership required

                  In addition to the entertainment apps coming from partners, in every market Xbox One will also feature:

                  • Internet Explorer
                  • Skype
                  • SkyDrive
                  • Upload

                  With games, multiplayer gaming, live TV and the best entertainment apps, Xbox One is the most complete entertainment system.

                  Note that after AT&T Extends TV Watching to More Devices with Launch of U-verse TV on Xbox 360 [press release, Oct 11, 2010], and even after New U-verse Internet Customers Can Take Their Pick: A Free Xbox 360, SONOS PLAY:3, Kindle Fire or Nexus 7 Tablet [press release, March 18, 2013], that Xbox tie-up ended with AT&T U-verse TV To Drop Support For Xbox 360 on December 31 [Multichannel, Nov 26, 2013]:

                  “We’ve made this decision due to low customer demand,” an AT&T spokeswoman said via email on Tuesday. AT&T declined to say how many customers currently use the Xbox 360 as a set-top. … AT&T, the spokeswoman added, currently has no plans to support U-verse TV on the Xbox One. Verizon Communications FiOS TV is the first, and so far only, U.S. pay-TV provider to offer an authenticated app for the Xbox One during its initial launch phase.

                  With highly experienced users: Xbox One All-in-One Demo with Yusuf Mehdi and Marc Whitten [xbox YouTube channel, Nov 8, 2013]

                  Marc Whitten and Yusuf Mehdi walk through a comprehensive demo of Xbox One, including instant switching, biometric sign-in, Live TV, Skype, game DVR, OneGuide and more.

                  From Xbox One: The Complete All-in-One Games and Entertainment System [Xbox Wire from Microsoft, Nov 8, 2013]

                  As we head toward Nov. 22, we’re showcasing the all-in-one capabilities of Xbox One. This is the real Xbox One in action. Corporate Vice President of Marketing and Strategy, Yusuf Mehdi, and Chief Product Officer Marc Whitten show the best of Xbox One in this new video. And, you can see 10 of our favorite new features below.
                  #1 – Unleashing the Power of Your Voice
                  A simple voice command turns on your Xbox One, your TV, your set-top box and your AV system because Kinect for Xbox One is an infrared (IR) blaster. And when you say “Xbox On,” your game is always ready to resume from wherever you left off. You can start playing your favorite game, find your favorite show, change channels, turn up the TV volume and more – with the sound of your voice, powered by Bing voice recognition technology.  Just say “Xbox, go to ESPN” and your TV will change directly to the ESPN channel. Or, “Xbox, go to Hulu Plus,” “Xbox, Volume Up,” “Xbox, Mute,” “Xbox, go to Music” – it’s simple. Kinect “talks to” your TV, set-top box and AV receiver, making it easier than ever to navigate entertainment in your living room.
                  #2 – Biometric Sign In
                  #3 – Instant Resume and Instant Switching
                  We’ve talked about instant switching before, but now you can see it for yourself in action. The video showcases how quickly you can jump from one experience to another and right back where you left off. You can literally jump from a game to live TV, music, movies, sports, Web sites and back again in seconds, just by using your voice.
                  #4 – Watch Live TV via Xbox One
                  Xbox One lets you watch live TV from your HDMI-compatible cable or satellite box, making it easy to switch from games to live TV – all with the sound of your voice, and without having to switch TV inputs. No more multiple remotes, missed multiplayer matches while you’re watching TV, or frustrating delays. Just connect your set-top box to your Xbox One and you can watch live TV through your Xbox One.
                  #5 – Get a Multiplayer Invite, while you are Watching a movie or live TV
                  #6 – Game DVR and Upload Studio Let You Record and Share Your Greatest Moments
                  #7 – Do Two Things at Once
                  You can also choose to snap two experiences together – so you can play a game while you watch TV or listen to Xbox Music. Or, watch the big NFL game while you manage your fantasy football team. For gamers, snapping Machinima opens up a whole new world of opportunity. Just by saying “Xbox, Snap Machinima,” the Machinima app will be snapped next to “Dead Rising 3” or your favorite game, and walk-throughs, game reviews, help videos and more will appear.
                  #8 – Skype on the Big Screen, With Groups and Free Long Distance
                  Skype is amazing on Xbox One, offering the only big screen experience with Group Video Chat with up to four people. Kinect is the only camera in the world that will follow the caller and pan and zoom automatically as if you had your own cameraman. You can talk with your friends while surfing the Web or checking the latest stats of a sports team. And you can have full 1080p video calls for one-to-one chats on your TV.¹
                  ¹ For 1080p video calls, both users must have compatible HD displays, web cams, messaging clients, and broadband internet.

                  #9 – OneGuide Delivers Personalized Guide to TV, Apps and More

                  Xbox One has its own TV listings guide that can be navigated with your voice. Say “Xbox, what’s on Discovery Channel?” and boom, there you have the list of shows.  Call out your favorite TV show by name and start watching it instantly.  And, Xbox One is the only system that brings together your favorite TV channels and entertainment app channels into one screen.  Create your own personal Favorites in OneGuide, so you can easily choose what you want to watch – whether it’s on Fox, CBS, NBC, ESPN, Hulu Plus or the NFL on Xbox One app. For the first time you don’t have to juggle multiple screens across cable TV, video streaming services and other entertainment apps to quickly find the entertainment you’re looking for.

                  #10 – Xbox SmartGlass Enhances Gaming in New Way

                  Microsoft Is Changing the Game for Sports Fans [Xbox Wire from Microsoft, Sept 3, 2013]

                  Whether it’s in the living room or on the playing field, Microsoft and products like Xbox, Surface, and Windows 8 are impacting the way we experience our favorite sports.
                  In May, Microsoft announced a multi-year, landmark partnership with the NFL. For the Xbox community, this means exclusive interactive NFL experiences for fans at home, found only on Xbox One starting this November. Today, we’re excited to share more details about this partnership and the game changing NFL experiences for Xbox One and Surface. We are also pleased to introduce NFL.com Fantasy Football on Xbox 360, Windows 8 and Windows Phone which are available for download today.
                  Also making headlines today is the confirmation and details around the all-new ESPN application for Xbox One. Leveraging the unique platform capabilities of Xbox One, sports fans will now have control of the live programming, highlights, stats and more across ESPN like never before. 
                  The NFL on Xbox One and Surface 
                  Tailored for you, the NFL on Xbox One will deliver the best of the NFL, in a way that will completely reimagine the way you experience football from the comfort of your home.  Only Xbox One can bring interactivity to live games, stats, scores, highlights and your NFL.com Fantasy Football team all together on the best screen in the house – your TV. Xbox One will personalize your NFL experience, for your team, with the best content the NFL has to offer including NFL.com, NFL Network, and NFL RedZone.  Whether you’re watching the game or not, Xbox One makes it easy to keep tabs on the league with Snap mode. You can watch live TV, play games, or watch movies, while simultaneously tracking your NFL.com Fantasy Football team, or checking in for the latest scores and stats. 
                  While you’re watching at home, Surface technology is in the stadium, on the sidelines, to help protect your favorite players.  Teams and trainers will use the X2 concussion testing application this season, with the help of Surface tablets, to quickly assess potential concussions immediately after a player leaves the playing field, helping determine whether they can get back in the game or call it a day.
                  ESPN on Xbox One 
                  We are also excited to announce ESPN on Xbox One, which builds on our innovations with ESPN on Xbox 360, and provides you with the best of ESPN networks and web content personalized just for you.  Featuring deeper sports content personalization, ESPN on Xbox One gives you immediate access to the teams and sports you care about most.  With WatchESPN, ESPN.com, and ESPN3 video content, you get the best highlights, live events and on-demand sports in full screen mode.  Additionally, you will receive personalized scores and stats in Snap mode from the most popular sports.
                  NFL.com Fantasy Football on Xbox 360, Windows 8, and Windows Phone 
                  Beginning today, NFL.com Fantasy Football is now available on Xbox 360, offering a whole new way to track your team and leagues on the best screen in your house – your TV. This destination is tailored just for you and your NFL.com Fantasy Football team, and is easy to jump into and simple to use. The Xbox 360 app brings you an endless playlist of Fantasy Football highlights, Fantasy analysis, stats, scores and standings about your NFL.com Fantasy Football team and leagues, making sure you don’t miss a thing. Consumers can head to NFL.com today to sign up for a league and get in the game before kick off on September 5th.
                  And, with the new NFL.com Fantasy Football apps for Windows 8 and Windows Phone, you can keep tabs on your team and leagues on your tablet, PC, and mobile device as well.
                  Download NFL.com Fantasy Football for Xbox 360, Windows 8 and Windows Phone today, and stay tuned for game changing experiences on Xbox One this November.

                  7. Preliminary information on the upcoming products from Xbox Entertainment Studios

                  Best Advice: Nancy Tellem [Fortune Magazine YouTube channel, Oct 31, 2013]

                  Nancy Tellem is the entertainment and digital media president of Microsoft.

                  Faces to Watch in 2014: Digital media | Nancy Tellem, Mike Hopkins, Issa Rae [Los Angeles Times, Dec 27, 2013]

                  A big year is coming up for game designers Ryan and Amy Green and Ruben Farrus, plus Microsoft’s Nancy Tellem, Hulu’s Mike Hopkins and Web writer-actress Issa Rae.

                  The Times asked its reporters and critics to highlight figures in entertainment and the arts who will be making news in 2014. Here’s who they picked:

                  Nancy Tellem | Microsoft’s president of entertainment and digital media

                  The veteran CBS television executive had her work cut out when she joined Microsoft Corp. in 2012 to launch a Santa Monica studio to create original content.

                  Long fascinated with changes in consumer behavior, Tellem is now playing an important role in determining what appeals to younger consumers accustomed to getting their entertainment on multiple screens. She is trying to build on the momentum that Microsoft has achieved by encouraging millions of consumers to consider the Xbox more than just a video game console. Xbox users spend more than half of their time online listening to music, streaming movies and TV shows, and exploring other entertainment options. Microsoft wants to build a trove of exclusive content to differentiate its game system from the rival Sony PlayStation.

                  Microsoft’s slate of new shows designed to appeal to the digitally connected generation is expected to launch in the first half of 2014. Microsoft also brought Tellem on board to make inroads with Hollywood’s creative community. One of the first projects she announced was a live-action TV series, produced by Steven Spielberg, based on the “Halo” game franchise for Xbox Live, a feature that enables gamers to play against online opponents.

                  Tellem was trained as a lawyer and worked her way up the ranks in business affairs at Lorimar, Warner Bros. and then CBS. At the broadcast network, Tellem was a key executive in the development of new shows, including the hit reality show “Survivor.” She was one of the TV industry’s first female entertainment presidents.

                  — Meg James

                  Mike Hopkins | Hulu chief executive

                  Issa Rae | Actress-writer-director
                  … 

                  Nancy Tellem at Wrap Power Women Breakfast: Microsoft Is Aiming for a “Game of Thrones”  [The Wrap YouTube channel, Oct 30, 2013]

                  TheWrap’s keynote speaker at its fourth annual Power Women Breakfast (Oct 30, 2013) says Microsoft’s new studio has the ambition and the means to create landmark programming

                  Nancy Tellem at Wrap Power Women Breakfast: Microsoft Is Aiming for a ‘Game of Thrones’ (Video) [TheWrap, Oct 30, 2013]

                  Nancy Tellem, Microsoft’s new president of entertainment and digital media, said on Wednesday she has the means and the ambition to make a “Game of Thrones”-like series for the new studio backed by the technology giant.

                  “I have the ambition” to make a show as grand as “Game of Thrones,” said the former president of CBS entertainment at TheWrap‘s Power Women Breakfast at the Montage in Beverly Hills.

                  “That, to me, was the greatest testament to how wonderful television can be and how engrossed and committed people are – and it was a social experience,” she said.

                  And, she said smiling, Microsoft’s budget was “enough for me to do my job, let’s just say.”

                  Not being bound by the constraints of a 22-episode season or even show length, and with the technology to engage the viewer through the Xbox platform, allows Tellem and her team to “focus on the content itself” and the way viewers can “share that experience.”

                  Tellem said she expects that Microsoft will begin rolling out its new shows — which will range from sports programs to scripted series — as soon as the spring. She says they have not decided whether to release the episodes over time or put them all out at once, like Netflix.

                  Asked about binge viewing, Tellem said she was not sure if Microsoft would release all its content at once, observing that interactivity was more the distinctive purview of Microsoft.

                  Xbox One Reveal: Halo TV and NFL [xbox YouTube channel, May 23, 2013]

                  Nancy Tellem’s Xbox Entertainment Studios announcements of Halo TV with 343 Industries and Steven Spielberg and NFL from Xbox One Reveal Press Briefing.

                  From Microsoft unveils Xbox One: the ultimate all-in-one home entertainment system [press release, May 21, 2013]

                  Blockbuster titles, Steven Spielberg-produced Halo TV series, and exclusive agreements with the NFL transform games, TV and entertainment for the 21st century living room.

                  “Halo” television series. Award-winning filmmaker, director and producer, Steven Spielberg will executive-produce an original “Halo” live-action television series with exclusive interactive Xbox One content, created in partnership with 343 Industries and Xbox Entertainment Studios.

                  RTS Cambridge Convention 2013: Xbox One – From Gaming to Content [Royal Television Society, Sept 12, 2013]

                  Created for gaming, the Xbox One is the latest challenger to old-fashioned telly, but, said Microsoft entertainment and digital media president Nancy Tellem, it is not the final nail in the TV industry’s coffin.

                  “It’s an augmentation,” she argued. “Right now, [TV] is in a renaissance period — what Xbox offers is a different TV experience.” 

                  Since joining Microsoft from CBS a year ago, Tellem has been spearheading the software giant’s move into TV, delivered via its Xbox One gaming console.

                  Interactivity among its current 76 million connected console users would be the key to Xbox One’s success. “It isn’t just delivering content. It really offers an immersive experience,” she said.

                  The session was chaired by Matt Frei from Channel 4 News, who said that he enjoyed being a “passive” consumer of TV. To laughter from the audience, Tellem replied: “Xbox addresses the next generation.”  

                  Tellem identified sport, live events and scripted entertainment as genres particularly suited to Xbox One. Mentioning Game of Thrones as the type of complex drama suited to the console, she claimed: “You can give a much richer understanding of the characters and their history.”

                  Tellem is in discussions with studios and talent about commissions, which she hopes to announce in a few weeks. Earlier this year, Microsoft revealed that a TV show based on the Halo game, with the involvement of Steven Spielberg, was in the pipeline.

                  She also countered a suggestion from the audience that Xbox One’s programming would be geared at 18-year-old boys. Tellem said that her ambition was to reach out beyond traditional gamers, adding that 40% of its platform users were female, with most of the audience between 18 and 40. 

                  “My mission is to transform it into an entertainment service,” she said, which would include music, film and sport as well as games. “It’s a simple offering that can access all your entertainment needs.”

                  Before joining Microsoft, Tellem had worked at the US network CBS for a decade and a half, latterly as senior adviser to chief executive Leslie Moonves.

                  Xbox Entertainment Studios to Debut Documentary Series Exclusively on Xbox in 2014 [Xbox Wire from Microsoft, Dec 19, 2013]

                  First Documentary Explores the Fabled Atari Mystery

                  Today, Xbox Entertainment Studios announced an original documentary series that will debut exclusively on Xbox in 2014. Xbox will produce the series with two-time Academy Award-winning producer Simon Chinn (Searching for Sugar Man and Man on Wire) and Emmy-winning producer Jonathan Chinn (FX’s 30 Days and PBS’s American High) through their new multi-platform media company, Lightbox.

                  “Our collaboration with Xbox offers an unparalleled opportunity to make a unique series of films around the extraordinary events and characters that have given rise to the digital age,” said Simon Chinn. “Our goal is to produce a series of compelling and entertaining docs which will deploy all the narrative techniques of Simon’s and my previous work. It’s particularly exciting to be partnering with filmmakers like Zak Penn who come to this process from other filmmaking disciplines and who will bring their own distinctive creative vision to this,” added Jonathan Chinn.

                  “Jonathan and Simon Chinn are the perfect team to spearhead this series for Xbox. They are consummate story tellers and they plan to match their creative sensibility with the best talent in the industry,” commented Xbox Entertainment Studios President Nancy Tellem. “These stories will expose how the digital revolution created a global democracy of information, entertainment and commerce, and how it impacts our lives every day.”

                  The first film in the groundbreaking series investigates the events surrounding the great video game burial of 1983. The Atari Corporation, faced with overwhelmingly negative response to the video game “E.T. the Extra-Terrestrial,” buried millions of unsold game cartridges in the middle of the night in the small town of Alamogordo, New Mexico.

                  Fuel Entertainment, an innovator in cross-platform content development, secured the exclusive rights to excavate the Atari landfill and approached Xbox. Lightbox will document the dig, which is planned for early next year.

                  Filmmaker and avid gamer Zak Penn (X-Men 2, Avengers, Incident at Loch Ness) will direct. This episode will not only document the excavation, it will also place the urban legend of the burial in the context of the precipitous rise and fall of Atari itself.

                  “When Simon and Jonathan Chinn approached me about this story, I knew it would be something important and fascinating,” said Penn. “I wasn’t expecting to be handed the opportunity to uncover one of the most controversial mysteries of gaming lore.”

                  Shooting begins in January. The series will air exclusively on Xbox One and Xbox 360 in 2014 and will be available globally in all markets where Xbox Live is supported.

                  8. Xbox Music and Xbox Video services for other devices

                  Xbox Music + Video apps for Windows Phone 8 [Windows Phone Central YouTube channel, Dec 18, 2013]

                  On December 18th, Microsoft released two new apps for Windows Phone 8: Xbox Music and Xbox Video. We give a tour of both apps and show off some of their features on a Lumia 1520. More info: http://www.wpcentral.com/xbox-music-and-video-app-tour

                  Xbox Video for Windows Phone 8 Walkthrough [Pocketnow YouTube channel, Dec 19, 2013]

                  Microsoft finally released the Xbox Video application for Windows Phone 8. We go hands-on with the new application in this walkthrough video, and discover all its features and missing functionality. See more at Pocketnow: http://pocketnow.com/2013/12/19/xbox-video-for-windows-phone

                  New Xbox Video and Xbox Music apps Available for Windows Phone 8 Customers [Xbox Wire from Microsoft, Dec 18, 2013]

                  It’s a big day for Windows Phone 8 customers. New apps for Xbox Video and Xbox Music are becoming available today in the Windows Phone store.

                  Xbox Video Comes to Windows Phone

                  Today, Xbox Video launches on Windows Phone 8, so now you can truly take your movies and TV shows with you wherever you go. Stream from the cloud or download your favorite movie or TV episodes to your phone to watch them offline. You can rent or buy the newest hit movies or search for classics from the massive catalog with the only app that lets you download movies and TV episodes right to your Windows Phone 8. You’ll even get Rotten Tomatoes ratings and Metacritic scores right on your phone.

                  Xbox Video on Windows Phone 8 also delivers countless TV shows. With a Season Pass, brand new episodes are automatically added to your collection so you don’t miss a beat from your favorite new shows. Or catch up with every episode from past seasons and relive the glory days of your favorite shows from years past.

                  With Xbox Video, your collection follows you from screen to screen in the cloud. For example, you can buy and start a movie or TV show from XboxVideo.com or Xbox Video on a Windows 8.1 tablet, and continue watching on your Xbox One, Xbox 360 or Windows Phone 8. And with Xbox SmartGlass, you get a richer viewing experience that isn’t found anywhere else. Xbox SmartGlass integrated with Xbox Video for Xbox One and Xbox 360 offers second-screen experiences with bonus content and exclusive extras, serves as a remote control, and gives you new ways to interact with whatever you’re watching.

                  Xbox Video is a free download in the Windows Phone Store today, and don’t forget to check out our new Web store at XboxVideo.com

                  A Peek at the New Xbox Music for Windows Phone

                  Also releasing today is a new Xbox Music preview app. This early-access app gives Xbox Music Pass users a look into the new music experience on Windows Phone 8.¹ Stream millions of songs from your phone or download the ones you want for offline listening. Create playlists that sync across your devices. Play songs from your personal music collection alongside your Xbox Music Pass content. It’s the best way to experience all the music you love on your Windows Phone.

                  The Xbox Music Preview is available in all 22 markets where Xbox Music is available today and can be found in the Windows Phone Store. The full release will roll out in 2014. Xbox Music is available today on Windows Phone, Xbox One, Xbox 360, Windows 8/8.1, online at Music.Xbox.com and iOS and Android devices.

                  ¹ Xbox Music Pass required to use the app. Compatible devices and internet required. Data charges apply. See Xbox.com/music.

                  Xbox Music For Android Review [Mikey Capoccia YouTube channel, Sept 9, 2013]

In today’s video I will be reviewing the Xbox Music application for Android.

                  Microsoft launches Xbox Music across iOS and Android, adds free streaming on the Web [press release, Sept 8, 2013]

                  Enjoy your favorite music from a 30 million-song global catalog powered by the one service that integrates your music experiences across your tablet, PC, phone and TV. All the music you love, every way you want it.

                  Nearing its one-year anniversary, Microsoft Corp.’s all-in-one music service, Xbox Music, continues making strides to deliver all the music people want, wherever they want it played. Today, Microsoft announced its plans to bring Xbox Music to iOS and Android devices, as well as free streaming on Xbox Music via the Web.[1]

                  Accessing music across all the different devices people interact with has become complicated. People today use PCs, laptops, tablets, phones and TVs to access different music services that don’t connect with one another. Xbox Music is designed to solve this common problem by combining the best of all music offerings with free streaming on the Web and on Windows 8 PCs and tablets, Internet radio, subscription (called Xbox Music Pass), and download-to-own options.[2] With today’s news, access to Xbox Music grows to include iOS and Android devices, as well as a free Web-based interface on computers.

                  “Xbox Music now, more than ever, powers music experiences between Windows 8, Xbox, Windows Phone, and now iOS, Android and the Web,” said Jerry Johnson, general manager of Xbox Music. “We’re also excited to connect artists with their fans on the most anticipated consumer product of the year when Xbox One launches Nov. 22.”

                  Expanding the Xbox Music family of devices

                  Starting today, your Xbox Music Pass brings the catalog of music to iOS and Android devices. Get unlimited access to the songs and artists you want at any time with playback across your tablet, PC, phone and Xbox console for $9.99 per month or $99.99 per year. Add a song to your collection on your Xbox, and you’ll also have that song on your iOS, Android or Windows 8 device on the go or at the office. Xbox Music Pass also unlocks unlimited access to tens of thousands of music videos on your Xbox 360.

With the addition of free streaming on the Web, enjoy on-demand access to 30 million songs globally for free on the Xbox Music Web player at http://music.xbox.com or through the Xbox Music app on all Windows 8 tablets and PCs. Discovering and enjoying free music is as easy as typing an artist or song name and hitting “play.” Songs are instantly available to stream at no cost, and you can create an unlimited number of playlists.[1]

                  Continued innovation

                  Xbox Music will continue to grow and evolve over the coming months. Microsoft will add Radio to the free Web player, a quick and dynamic way to personalize your collection, discover new favorites, and create ultimate playlists by launching instant mixes based on your favorite artists. With unlimited skips and a view of the full recommended music stream, Radio puts you in control of your Internet radio experience.[1]

                  Xbox Music will grow on Windows 8 when it adds the anticipated new Web Playlist tool this fall. The tool scans all the artists and music available on a given Web page and creates a custom playlist of all that music. Think about the Web page of your favorite radio station, or an upcoming music festival, and all the bands and songs included on that Web page. Web Playlist identifies all that music and creates an instant, custom playlist inside Xbox Music with the simple touch of a button. Web Playlist along with Windows 8.1 will be released Oct. 17.

                  In the coming months, additional updates for iOS and Android platforms will become available, including an offline mode that lets you save your music to your device for playback without an Internet or data connection.

                  About Xbox

                  Xbox is Microsoft’s premier entertainment brand for the TV, phone, PC and tablet. In living rooms or on the go, Xbox is home to the best and broadest games, as well as one of the world’s largest libraries of movies, TV, music and sports. Your favorite games, TV and entertainment come to life in new ways through the power of Kinect, Xbox SmartGlass and Xbox Live, the world’s premier social entertainment network. More information about Xbox can be found online at http://www.xbox.com.

                  [1] Free streaming available only on the Web and devices running Windows 8 or later. Limited hours of free streaming after six months; unlimited with paid subscription. Coming later this fall: artist-based Radio on Android, iOS and the Web.
                  [2] Xbox Music Pass is streaming only on Xbox consoles, Android, iOS and the Web. Applicable taxes extra. On Xbox consoles, Xbox Music requires an Xbox Music Pass and an Xbox Live Gold membership (both sold separately). Download music on up to four devices. Some Xbox Music content may not be available via Xbox Music Pass, and may vary over time and by region. Coming later this fall: Xbox music download-to-own on Android and iOS, and playlists and song sync on Windows Phone 8. See http://www.xbox.com/music.
                  For details, please visit http://news.xbox.com.
                  For assets, please visit http://news.xbox.com/media.

Device businesses should have a China-based independent headquarters, at least for Asia/Pacific, if they want to succeed

Back in August I found that China is the epicenter of the mobile Internet world, and so of the next-gen HTML5 web [Aug 5, 2013]. That statement was reinforced even more recently by MediaTek MT6592-based True Octa-core superphones are on the market to beat Qualcomm Snapdragon 800-based ones UPDATE: from $147+ in Q1 and $132+ in Q2 [Dec 22, 2013; Jan 27, 2014].

                  Latest Nokia vs Apple vs Android:
                  image

A trend analysis of the importance of the Asia/Pacific market in general, and the Chinese market in particular, leads to an even more striking conclusion: with the exception of Samsung (which is just nearby), all dominant players in the mobile device market of today, and especially tomorrow, have to operate from China-based headquarters. Otherwise they are unable to make the relevant decisions (unlike in the PC era, when U.S.-based headquarters sufficed). This applies especially to the merged Nokia-Microsoft device business!

                  image

                  Sources:
                  Analysys International: China Mobile Phone Sales Hit 100 Million in Q3, 2013 [Nov 8, 2013]
                  – Gartner sources: like the latest Gartner Says Smartphone Sales Accounted for 55 Percent of Overall Mobile Phone Sales in Third Quarter of 2013 [Nov 14, 2013]
                  IDC Finds Worldwide Smartphone Shipments on Pace to Grow Nearly 40% in 2013 While Average Selling Prices Decline More Than 12% [Nov 26, 2013]
                  – For IDC look at Smartphones Expected to Grow 32.7% in 2013 Fueled By Declining Prices and Strong Emerging Market Demand, According to IDC [June 4, 2013] as well (for Worldwide Smartphone Shipments by Market Maturity i.e. emerging and developed). Here is the historical chart embedded there:
image
Developed Markets include: USA, Canada, Western Europe, Japan, Australia, and New Zealand.

With the latest news release of Jan 27 (Android ends the year on top but Apple scores in key markets) from Kantar Worldpanel ComTech, one can compile an almost two-year-long picture of the smartphone trends in the key markets via the following set of charts:

                  The latest comments on that (from the news release) by Kantar Worldpanel Comtech:

                  Android ended 2013 as the top OS across Europe with 68.6% share, while Apple held second place with 18.5%. Windows Phone continues to show high year-on-year growth, but its share of the European market has essentially remained flat at 10.3% for the past three months.

                  Android finished 2013 strongly, showing year-on-year share growth across 12 major global markets including Europe, USA, Latin America, China and Japan. Apple has lost share in most countries compared with this time last year, but importantly it has held strong shares in key markets including 43.9% in USA, 29.9% in Great Britain and 19.0% in China.

                  Windows Phone has now held double digit share across Europe for three consecutive months. Unfortunately for Nokia the European smartphone market is only growing at 3% year on year so success in this market has not been enough to turn around its fortunes – reflected in its recent disappointing results. Its performance also deteriorated toward the end of 2013 in the important growth markets of China, USA and Latin America.

                  It’s no surprise that everyone is concentrating on high growth China, but currently local brands are proving clear winners. In December, Xiaomi overtook both Apple and Samsung to become the top selling smartphone in China – a truly remarkable achievement for a brand which was only started in 2010 and sells its device almost exclusively online. The combination of high spec devices, low prices and an ability to create unprecedented buzz through online and social platforms has proved an irresistible proposition for the Chinese.

                  Additional information for the period was provided by earlier Kantar Worldpanel Comtech news releases:

                  Android leads OS U.S. sales, as LG and Nokia see resurgence [Jan 7, 2014]

In the 3 months ending November 2013, Android maintained its lead of smartphone sales in the U.S., capturing 50.3% of the smartphone market. iOS follows with 43.1% of smartphone sales, an increase month on month, however, down 9.9% versus the same period a year ago, according to data on the U.S. market released today by Kantar Worldpanel ComTech.
                  Windows Phone, the third largest OS in the U.S, sold nearly 5% of smartphones in the 3 months ending November 2013, up 2.1% points from the previous year.
                  As with the previous period, Verizon maintained its lead as the top smartphone carrier, with just under a third of sales (31.7%). AT&T, in second, had 28.3% of smartphone sales in the 3 months ending November 2013. T-Mobile, overtaking Sprint as the third largest carrier had 13.3% of sales, and was the only major carrier to see growth year on year (up 6.3%).
                  The data is derived from Kantar Worldpanel ComTech USA’s consumer panel, which is the largest continuous consumer research mobile phone panel of its kind in the world, conducting more than 240,000 interviews per year in the U.S. alone. ComTech tracks mobile phone behavior and the customer journey, including purchasing of phones, mobile phone bills/airtime, and source of purchase and phone usage. This data is exclusively focused on the sales within this 3 month period rather than market share figures. Sales shares exemplify more forward focused trends and should represent the market share for these brands in future.
                  Kantar Worldpanel ComTech Global Strategic Insight Director, Dominic Sunnebo states, “The iPhone 5S and 5C were the two bestselling smartphones in the U.S for the 3 months ending November 2013. However, increased rivalry from Android brands and a resurgence of LG and Nokia, has made year-on-year share gains for Apple difficult. This is especially true on T-Mobile.”
                  On T-Mobile, the ‘UNcarrier’ strategy, launched earlier in 2013, has been successful because it has attracted first-time smartphone buyers, looking to upgrade to their first smartphone. Among T-Mobile smartphone buyers in November 2013, 55% of those that purchased an LG and Nokia smartphone were first-time smartphone buyers, compared to just 39% of Apple customers.
                  Sunnebo continues, “First-time smartphone buyers remain a key demographic for carriers and brand alike. The lower end iPhone 5C represents an opportunity for Apple to attract these customers. Thus far the majority of 5C customers have come from other smartphone platforms, though if historical trends hold, the lower end model (historically the older iPhone model following the release of a new iPhone), should be able to attract this demographic with its lower price and comparable specs.”
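The panel description above draws a distinction worth keeping in mind: Kantar reports sales share (share of units sold in a trailing 3-month window, a forward-looking indicator), not installed-base market share (share of all devices in use). A minimal sketch with hypothetical numbers illustrates how the two measures can differ:

```python
# Illustrative sketch (all numbers hypothetical, not Kantar data):
# "sales share" over a trailing 3-month window vs. installed-base share.

monthly_sales = {  # hypothetical units sold per month, by OS
    "Android": [55, 50, 52],
    "iOS": [40, 45, 48],
    "Windows Phone": [5, 5, 6],
}
installed_base = {"Android": 500, "iOS": 430, "Windows Phone": 30}  # hypothetical

def sales_share(sales):
    """Share of units sold over the whole trailing window, per OS (percent)."""
    totals = {os: sum(months) for os, months in sales.items()}
    grand = sum(totals.values())
    return {os: round(100 * t / grand, 1) for os, t in totals.items()}

def base_share(base):
    """Share of all devices currently in use, per OS (percent)."""
    grand = sum(base.values())
    return {os: round(100 * b / grand, 1) for os, b in base.items()}

print(sales_share(monthly_sales))  # forward-looking: what is selling now
print(base_share(installed_base))  # lagging: what is already in people's hands
```

A fast-growing platform (Windows Phone in this sketch) shows a higher sales share than installed-base share, which is why Kantar says sales shares "should represent the market share for these brands in future."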

                  Apple launch momentum continues [Jan 7, 2014]

                  The latest smartphone sales data from Kantar Worldpanel ComTech, for the three months to November 2013, shows Apple’s share of smartphone sales continuing to grow month on month following the release of the iPhone 5S and 5C models. However, its share of most major markets remains lower than the same time last year as it increasingly faces challenges from its rivals.

                  While there’s no doubt that sales of the iPhone 5S and 5C have been strong, resurgent performances from LG, Sony and Nokia have made making year on year share gains increasingly challenging for Apple. Windows Phone, for example, is now the third largest OS across Europe with 10.0% – more than double its share compared with last year.

                  Apple now accounts for 69.1% of the Japanese market, 43.1% in the United States, 35.0% in Australia and 30.6% in Great Britain.

                  Strong sales of the iPhone 5S and 5C can be linked to high levels of customer satisfaction with both models, despite fears that the lower-end 5C could damage Apple’s appeal.

                  Some people worried that Apple was risking its historically high consumer satisfaction levels by releasing a lower cost, plastic iPhone. However, the latest data for the US shows that the iPhone 5C has an average owner recommendation score of 9.0/10 versus 9.1/10 for the iPhone 5S. Both devices attract different customers but crucially each group of owners remains very happy with their choice and are recommending it to others.

                  Android gains 3% market share each quarter in China [Nov 28, 2013]

                  Kantar Worldpanel ComTech is the first continuous panel to gather representative mobile phone data in China. The panel has been created to provide insight including mobile phone ownership, sales, usage, churn, loyalty and pricing in Chinese telecoms market.

The key points of the Q3 report:

• Android’s steady growth in China is mainly coming from cheaper local brands, as consumers seek the ultimate value-for-money device.
• There was much speculation about the iPhone 5s and 5c prior to their official launch. This actually had a negative impact on iPhone Q3 sales, as people held out for the new models, reducing Apple’s sales by almost 50% compared to the previous quarter.
• Almost a quarter of smartphone sales were made via online channels in Q3 2013. Even though online channels usually offer better prices, the ability to test the phone is also a key purchase-decision factor for Chinese customers, making it important for manufacturers and leading retail chains to develop effective O2O strategies.
• As most Android devices offer a similar user experience, consumers focus more on cost-effective devices.

                  image

                  ¥ 1000 – ¥ 2000: US$ 165 – US$ 331
                  ¥ 2000 – ¥ 3000: US$ 331 – US$ 496
                  ¥ 3000- ¥ 4000: US$ 496 – US$ 661
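The dollar figures above can be reproduced from the yuan bands at an exchange rate of roughly ¥6.05 per US$ (an assumption on my part, matching the late-2013 rate; the rate itself is not stated in the source):

```python
# Reproduce the CNY-to-USD price bands above, assuming an exchange
# rate of ~6.05 CNY per USD (assumed, roughly the late-2013 rate).
RATE = 6.05  # CNY per USD (assumption)

def cny_to_usd(cny):
    """Convert a yuan amount to whole US dollars."""
    return round(cny / RATE)

for low, high in [(1000, 2000), (2000, 3000), (3000, 4000)]:
    print(f"¥ {low} – ¥ {high}: US$ {cny_to_usd(low)} – US$ {cny_to_usd(high)}")
```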

Be very careful with Kantar Worldpanel ComTech vendor market shares, as the latest Analysys International: Apple’s Share Declining in China Smartphone Market in Q3, 2013 provides quite a different picture:

The statistics from EnfoDesk’s Survey of China Mobile Terminals Market in Q3, 2013, newly released by Analysys International, show that China mobile phone sales (excluding parallel imports and “cottage” [shanzhai] handsets) hit 102.66 million, up 54.5 percent year on year with sequential growth of 13.6 percent in Q3 2013. Samsung, Lenovo and Coolpad still ranked as the top three, with market shares of 18.1 percent, 11.4 percent and 9.0 percent.
                  image
China smartphone sales hit 93.08 million in Q3 2013, up as much as 89.3 percent year on year with sequential growth of 20.7 percent. Smartphones rose to 90.7 percent of the total market. Compared to Q2 2013, Apple’s share saw the largest decline, down by 1.1 percentage points.
EnfoDesk Analysys International holds that Apple’s decline in mobile phone sales is mainly due to the small impact of the iPhone 5S/5C, even though they were released in mid-September. Sales of the new iPhone models are expected to boost Apple’s overall share in Q4. However, this momentum will not last long, and Apple’s share will ultimately continue to decline.
                  image
Research definition:
Mobile phone sales refers to the number of mobile phones sold to users through various channels. Part of the mobile phone sales data in this report does not include smuggled and parallel goods; see specific data in the report.

                  Nokia and Windows global momentum continues [Nov 4, 2013]

                  The latest smartphone sales data from Kantar Worldpanel ComTech, for the three months to September 2013, shows Windows Phone now makes up one in 10 smartphone sales across the five major European markets*, has overtaken iOS in Italy, and is gaining momentum in emerging markets. Android remains the dominant operating system across Europe with 71.9%, an increase of 4.2 percentage points compared with the same period last year.

                  Windows Phone, driven almost entirely by Nokia sales, continues to make rapid progress in Europe and has also shown signs of growth in emerging markets such as Latin America.

                  With the smartphone market in developed countries so congested, it is emerging economies that now present manufacturers with the best opportunity for growth.

                  Nokia dominated in Latin America for many years, and while its popularity declined with the fortunes of Symbian it now has an opportunity to regain the top-spot. The majority of consumers in Latin America still own a Nokia featurephone and upgrading to an entry level Lumia is a logical next step. Price is the main barrier in developing markets and the budget Lumia 520 opens the door to smartphone ownership for many.

                  Local brands growing in China

                  China is increasingly dominated by Android which accounts for 81.1% of the market, up 14.6 percentage points from last year. Domestic manufacturers made up 44% of smartphone sales in the latest period, compared to just 30% the previous year. Huawei, Xiaomi, Lenovo and Coolpad handsets are particularly popular outside of China’s largest cities and represent a more value-for-money option than global brands.

                  Chinese consumers are prepared to make a huge investment in their smartphone, with some spending up to 70% of their monthly salary on a new device. With such a high investment, Chinese consumers want to get the best value for money and are increasingly opting for a high-spec local brand over a low-spec global equivalent. The message for global manufacturers is clear – Chinese consumers demand value, and overpriced entry-levels models no longer cut it against increasingly impressive local competition.

                  Kantar Worldpanel ComTech: Urban China Smartphone Sales Data to Q313

                  image

                  Windows Phone nears double digit share across Europe [Sept 30, 2013]

                  The latest smartphone sales data from Kantar Worldpanel ComTech, for the three months to August 2013, shows Windows Phone has posted its highest ever sales share of 9.2% across the five major European markets* and is now within one percentage point of iOS in Germany. Android remains the top operating system across Europe with a 70.1% market share, but its dominant position is increasingly threatened as growth trails behind both Windows and iOS.

                  Windows Phone has hit double digit sales share figures in France and Great Britain with 10.8% and 12% respectively – the first time it has recorded double digits in two major markets.

                  After years of increasing market share, Android has now reached a point where significant growth in developed markets is becoming harder to find. Android’s growth has been spearheaded by Samsung, but the manufacturer is now seeing its share of sales across the major European economies dip year on year as a sustained comeback from Sony, Nokia and LG begins to broaden the competitive landscape.

                  Windows Phone’s latest wave of growth is being driven by Nokia’s expansion into the low and mid range market with the Lumia 520 and 620 handsets. These models are hitting the sweet spot with 16 to 24 year-olds and 35 to 49 year-olds, two key groups that look for a balance of price and functionality in their smartphone.

                  image

                  A key milestone for Android in China  [May 31, 2013]

Kantar Worldpanel ComTech, the global market leader in longitudinal telecom research panels, reports that at the end of Q1 2013, urban China smartphone penetration reached 42%, an increase of 1.2% compared to Q4 2012. According to Kantar Worldpanel ComTech’s latest research in China, most of the smartphone growth comes from new smartphone adopters, with almost half of featurephone owners who changed their device in the last quarter upgrading to a smartphone. Craig Yu, Consumer Insight Director at Kantar Worldpanel ComTech, comments: “Featurephones are losing their price advantage as Android Smartphones are rapidly becoming more affordable and delivering better value. We expect to see accelerated Smartphone adoption in China in the coming months.”

                  image

During the first quarter of 2013, Android continued its steady growth in China, marking a key milestone in reaching a 50% share of the smartphone installed base. At the end of March 2013, Android widened its lead among smartphone operating systems with a 51.4% market share, an increase of 2.8% compared to the previous quarter. Second and third places were taken by Symbian and iOS, with market shares of 23% and 19.9% respectively. Symbian declined 2% in the last quarter, whilst iOS remained resilient. Following the same trend, Symbian looks likely to lose second place and fall to third within the next two quarters.

Kantar Worldpanel ComTech also tracks the performance of various mobile device brands. According to its latest report, many Chinese local brands have been working closely with carriers and demonstrated strong growth in the smartphone market in the first three months of 2013. ZTE, Lenovo and Xiaomi have all experienced share increases.

                  image

The combined market share of the above local brands is 20%, a 17.6% growth over the past 6 months. Huawei, ZTE, Lenovo, Coolpad & Xiaomi combined make up 1 in 5 of all smartphones in active use in China; this proportion will continue to grow as Nokia’s existing dominance is challenged.

Yu continues: “Local manufacturer brands have been able to drive strong growth by bundling their handsets with carriers’ tariff offers, seeking out new sales channels, and combining innovative product design with value to capture many first-time Smartphone buyers and those residing in city tiers 2/3/4.”

However, Samsung remains the fastest growing smartphone brand in China, ending Q1 2013 with a 15.2% share of the installed base (+1.5% pts). Craig Yu continues: “Samsung has recently launched the Galaxy S4, selling over 10 million units globally in less than one month; we predict the launch of the Galaxy S4 mini in the not too distant future will greatly increase its product reach in urban China.”

                  Apple achieves its highest ever Smartphone share in US [Dec 12, 2012]

                  The latest smartphone sales data from Kantar Worldpanel ComTech shows Apple has achieved its highest ever share in the US (53.3%) in the latest 12 weeks*, with the iPhone 5 helping to boost sales. In Europe, however, Android retains the highest share with 61% of the market, up from 51.8% a year ago.

                  * 12 w/e 25th November 2012

                  Apple has reached a major milestone in the US by passing the 50% share mark for the first time, with further gains expected to be made during December.

                  Meanwhile in Europe, Samsung continues to hold the number one smartphone manufacturer spot across the big five countries, with 44.3% share in the latest 12 weeks. Apple takes second place with 25.3% share while HTC, Sony and Nokia shares remain close in the chase for third position.

                  Although Windows sales in the US remain subdued, Nokia is managing to claw back some of its share in Great Britain through keenly priced Lumia 800 and 610 prepay deals. The next period will prove crucial in revealing initial consumer reactions to the Nokia 920 and HTC Windows 8X devices.

                  Nokia continues to find it tough to attract younger consumers in Great Britain. Over the past six months, just 28% of Nokia Lumia 800 sales have come from under 35’s, compared with 42% of all smartphone sales. With the Nokia Lumia 920 being one of the few handsets available on EE 4G, new tariffs may help to change this by attracting early adopters in the coming months.

                  Smartphone percentage penetration in Great Britain hit 60% in the latest period, with 83% of all mobile phone sales over the past 12 weeks being smartphones.

                  iPhone 5 release slows Android gains [Oct 30, 2012]

Recent smartphone sales data from Kantar Worldpanel ComTech shows Android continuing to gain share across Europe in the latest 12 weeks of sales*, increasing its share to 67.1%, up from 50.9% a year ago. However, its rate of growth has slowed as week one of iPhone 5 sales shows iOS gaining in the US and Great Britain.

                  My insert (see Q4’12):
                  image

                  * 12 w/e 30th September 2012

                  (Apple iPhone 5 released on 21st September in US, GB, Germany & France. Italy & Spain on 28th September. Not yet released in China & Brazil).

                  Apple has increased its share from 18.1% to 28.0% in the past year across Britain, while in the US its share increased by 14.2 percentage points.

                  While this latest data set only includes one week of iPhone 5 sales, we can see that in markets with a large number of existing Apple customers, sales have already seen a significant boost. We expect this momentum to be fully realised in the next set of results.

                  Tomorrow the UK joins the likes of the US, Germany and much of Scandinavia with the rollout of EE’s superfast 4G network.

                  Chinese consumers are rarely loyal to their brands [June 29, 2013]

Bain & Company, a global business consulting firm, and Kantar Worldpanel, a global leader in consumer panel insights, released the 2012 China FMCG Shopper report in Beijing. In most of the 26 top consumer goods categories sold in China (spanning packaged foods, beverages, personal care and homecare, and covering more than 80 percent of the country’s fast-moving consumer goods (FMCG) market), shoppers who purchase more frequently in a category tend to buy more brands rather than more of the same brands.

Kantar Worldpanel equips shoppers from 40,000 households throughout urban China with barcode scanners to record their purchases from all channels. The findings dispel several misunderstood notions about how Chinese consumers respond to product brands. Although over 60 percent of Chinese shoppers have said that brands were their top consideration when purchasing (in previous Bain research), in reality they rarely act on that consideration at the moment of purchase. Instead, they are in a near-constant state of trial that does not lead to eventual preference and loyalty.

                  The extraordinary attempt by Nokia/Microsoft to crack the U.S. market in terms of volumes with Nokia Lumia 521 (with 4G/LTE) and Nokia Lumia 520

Update: the price elasticity trial produced no effective result in the U.S. market, as shown by the chart below, which I compiled from Kantar Worldpanel ComTech news releases over the period in question:

                  image
                  (Credit: Kantar Worldpanel ComTech)

While the minimum price of the Lumia 520 is Rs. 8,850, i.e. US$144, in India, and US$121.50 in China, current U.S. prices are much below that (although in China the improved version upgraded to 1GB RAM, the Lumia 525, has also been available from $103 up since December, while in all other places its price starts from $115, like here):

* Meanwhile the Nokia/Microsoft effort was instrumental in a new trend of ending smartphone subsidies in the U.S. market (see towards the end of the post related to T-Mobile’s so-called Un-carrier strategy)
                  Nokia Lumia 521
                  [May 22, 2013 – T-Mobile U.S. $149]
                  July 25 –
                  $99 MetroPCS
                  Nokia Lumia 521 [T-Mobile, excerpted on Jan 18, 2014]
$102 full retail price (was $126)
$0 up front + $4.25 x 24/mo.
                  Nokia Lumia 521 No Contract for T-Mobile [Microsoft Store, excerpted on Jan 18, 2014]
                  Now $69.00
                  – was $99.00
                  – Free shipping. Free returns.
                  Nokia Lumia 521 (T-Mobile) by Nokia [Amazon, excerpted on Jan 18, 2014]
                  – List Price: $149.99
                  – Price: $89.95 & FREE Shipping. Details
                  – You Save: $60.04 (40%)
                  – In Stock. Sold by Cellular Specialty and Fulfilled by Amazon. Gift-wrap available.
                  Nokia Lumia 521 (Metro PCS) by Nokia [Amazon, excerpted on Jan 18, 2014]
                  – List Price: $99.00
                  – Price: $89.94 & FREE Shipping. Details
                  – You Save: $9.06 (9%)
                  – In Stock. Ships from and sold by Amazon.com. Gift-wrap available.
                  Nokia Lumia 520 
                  March’13 EUR 139 [US$180]
                  July 27 –
                  $99.99 as prepaid AT&T GoPhone
                  Nokia Lumia 520- GoPhone® [prepaid] [AT&T, excerpted on Jan 18, 2014]
                  – Due Today $99.99
                  Requires new activation and qualifying monthly voice plan.
                  Nokia Lumia 520 No Contract for AT&T [Microsoft Store, excerpted on Jan 18, 2014]
                  Now $59.00
                  was $99.00
                  – Free shipping. Free returns.
                  Nokia Lumia 520 GoPhone (AT&T) by Nokia [Amazon, excerpted on Jan 18, 2014]
                  – List Price: $99.99
                  – Price: $59.99 & FREE Shipping. Details
                  – You Save: $40.00 (40%)
                  – In Stock. Ships from and sold by Amazon.com. Gift-wrap available.
                  Nokia Lumia 520 8GB Black – International Version, Factory Unlocked WP8 by Nokia [Amazon, excerpted on Jan 18, 2014]
                  – Price: $136.08
                  – In Stock. Ships from and sold by HassleFreeCell.
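
                  The “You Save” figures in the Amazon listings above are simply list price minus sale price, with the percentage rounded to a whole number. A minimal illustrative sketch of that arithmetic (function name is my own, not from any listing API):

```python
def savings(list_price, price):
    """Return (dollars saved, whole-number percent saved) for a listing."""
    saved = round(list_price - price, 2)          # e.g. 149.99 - 89.95 = 60.04
    pct = round(saved / list_price * 100)         # e.g. 60.04 / 149.99 -> 40
    return saved, pct

# Figures match the excerpted listings:
assert savings(149.99, 89.95) == (60.04, 40)   # Lumia 521 (T-Mobile)
assert savings(99.00, 89.94) == (9.06, 9)      # Lumia 521 (MetroPCS)
assert savings(99.99, 59.99) == (40.00, 40)    # Lumia 520 GoPhone (AT&T)
```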

                  Everything started to work that way in Q3 2013:

                  image

                  Note that:

                  image (Credit: Kantar Worldpanel ComTech)

                  Android leads OS U.S. sales, as LG and Nokia see resurgence [Kantar Worldpanel press release, Jan 7, 2014]

                  In the 3 months ending November 2013, Android maintained its lead of smartphone sales in the U.S., capturing 50.3% of the smartphone market. iOS follows with 43.1% of smartphone sales, an increase month on month; however, it is down 9.9% versus the same period a year ago, according to data on the U.S. market released today by Kantar Worldpanel ComTech.

                  Windows Phone, the third largest OS in the U.S, sold nearly 5% of smartphones in the 3 months ending November 2013, up 2.1% points from the previous year.

                  As with the previous period, Verizon maintained its lead as the top smartphone carrier, with just under a third of sales (31.7%). AT&T, in second, had 28.3% of smartphone sales in the 3 months ending November 2013. T-Mobile, overtaking Sprint as the third largest carrier had 13.3% of sales, and was the only major carrier to see growth year on year (up 6.3%).

                  The data is derived from Kantar Worldpanel ComTech USA’s consumer panel, which is the largest continuous consumer research mobile phone panel of its kind in the world, conducting more than 240,000 interviews per year in the U.S. alone. ComTech tracks mobile phone behavior and the customer journey, including purchasing of phones, mobile phone bills/airtime, and source of purchase and phone usage. This data is exclusively focused on the sales within this 3 month period rather than market share figures. Sales shares exemplify more forward focused trends and should represent the market share for these brands in future.

                  Kantar Worldpanel ComTech Global Strategic Insight Director, Dominic Sunnebo states, “The iPhone 5S and 5C were the two bestselling smartphones in the U.S for the 3 months ending November 2013. However, increased rivalry from Android brands and a resurgence of LG and Nokia, has made year-on-year share gains for Apple difficult. This is especially true on T-Mobile.”

                  On T-Mobile, the ‘UNcarrier’ strategy, launched earlier in 2013, has been successful because it has attracted first-time smartphone buyers, looking to upgrade to their first smartphone. Among T-Mobile smartphone buyers in November 2013, 55% of those that purchased an LG and Nokia smartphone were first-time smartphone buyers, compared to just 39% of Apple customers.

                  Sunnebo continues, “First-time smartphone buyers remain a key demographic for carriers and brand alike. The lower end iPhone 5C represents an opportunity for Apple to attract these customers. Thus far the majority of 5C customers have come from other smartphone platforms, though if historical trends hold, the lower end model (historically the older iPhone model following the release of a new iPhone), should be able to attract this demographic with its lower price and comparable specs.”

                  image(data from Kantar Worldpanel ComTech, chart compiled by theguardian.com)

                  Regarding the US installed base see also Apple and Samsung Grow to Represent 68 Percent of Smartphones Owned in the US, According to The NPD Group [press release, Jan 16, 2014]

                  From Nokia Lumia 521 Price Deal: Lumia 521 Windows Phone 8 Smartphone Price Cut by MetroPCS to $29 [Video] [Tech: Latinos Post, Jan 15, 2014]

                  The price of the Nokia Lumia 521 received a significant reduction but only for MetroPCS customers.

                  The Lumia 521 Windows Phone 8 smartphone received a $29 price tag, although tax was not included.

                  The original price for the Lumia 521 was $99, but MetroPCS issued a couple of discounts for the $29 cost to come to fruition. To be precise, MetroPCS added an “Instant Discount” worth $50 followed by a $20 mail-in rebate.

                  The Lumia 521, also available with T-Mobile, can be purchased at the full price of $102. A $0 up-front offer is also available with T-Mobile as long as the consumer pays $4.25 per month for the next 24 months.

                  Meanwhile, the online Microsoft Store has the Lumia 521 for $69, which is a discount from the original $99 price tag. The online Microsoft Store is offering free shipping and returns.

                  Amazon.com also has the Lumia 521 in stock. For $79.99, the Lumia 521 can be purchased following a 47 percent discount from its original $149.99 price tag.

                  From Nokia Lumia 520 vs. Nokia Lumia 521 Specs: Price Cuts for Windows Phone 8 Smartphones on Online Microsoft Store End Jan. 12 [Video] [Tech: Latinos Post, Jan 9, 2014]

                  The Nokia Lumia 520 and Lumia 521 have received price discounts but the opportunity to purchase either Windows Phone 8 smartphones is about to end this weekend.

                  The Lumia 520 is available with no contract courtesy of mobile carrier AT&T for $99. The online Microsoft Store, however, is offering the Lumia 520 for $40 less than AT&T. Until Jan. 12, potential customers can purchase the Lumia 520 for $59 from the online Microsoft Store, which includes free shipping and free returns.

                  The Lumia 521 is also on sale on the online Microsoft Store. The Lumia 521 also had an original price tag of $99 but can be bought for $69. The online Microsoft Store will also ship the Lumia 521 for free.

                  From AdDuplex Windows Phone Statistics Report for December 2013 [AdDuplex blog, Dec 30, 2013]

                  … The raw data analyzed was collected over the day of December 27th, 2013 (UTC time) unless otherwise stated. …

                  United States

                  clip_image005

                  We have a new number 2 in the US and that’s the global leader, Lumia 520. Therefore, the share of Lumia 52x devices in the US is now pretty much aligned with the global situation. It’s notable that the L520 has jumped 3 places in just one month and that is, most likely, the result of the aggressive pricing we’ve seen recently.

                  image

                  Another notable fact about the US is that T-Mobile is the new number 2 Windows Phone 8 operator in the country with 23.3% of the market. We will see if Verizon is able to come back next month when, hopefully, their new Windows Phone flagship is out. But, if history is any indication, it is the low end that drives the market share. So it is more likely that we won’t see any major changes in the lineup next month.

                  Note that according to MetroPCS Aggressively Expands, Adding 15 New Markets; Triples Reach to 45 Markets across the United States just Six Months after T-Mobile-MetroPCS Combination [press release Nov 5, 2013]

                  About MetroPCS
                  MetroPCS provides the freedom and convenience of unlimited, no-annual-contract wireless services on an advanced nationwide 4G LTE network for a flat rate. With MetroPCS, customers get great value and a wide variety of device choices from leading brands. A flagship brand operated by T-Mobile US, Inc. (NYSE: “TMUS”), MetroPCS products and services are available online and across the United States through a network of company-owned stores, authorized dealer locations, and leading national retailers.

                  In addition to that MetroPCS Broadens 4G Smartphone Lineup with Addition of New Nokia, LG Phones and Access to Nationwide 4G Network [press release July 25, 2013]

                  Nokia Lumia 521 is MetroPCS’ First Windows Phone 8 Smartphone; LG Optimus F3™ Adds LTE to Popular Optimus Line

                  In conjunction with the expansion of its MetroPCS brand into 15 new markets, T-Mobile US, Inc. (NYSE: TMUS) is tomorrow introducing two new smartphones to the brand’s 4G portfolio: the LG Optimus F3™ and the Nokia Lumia 521. These devices can be paired with MetroPCS’ affordable 4G service plans, featuring unlimited data, talk and text – taxes and regulatory fees included – starting at just ‘$40, period’ and all on a nationwide 4G network.

                  The Nokia Lumia 521 and LG Optimus F3 join MetroPCS’ lineup of recently launched HSPA+ and LTE smartphones — delivering all the speed wireless consumers need for a great mobile experience.

                  As the first Windows Phone 8 product in the MetroPCS lineup, the Lumia 521 is another way the company is delivering what consumers demand – more choice – and at a great price for just $99 (plus taxes and fees).

                  From Nokia Lumia 521 Specs, Price Deal Offer: Microsoft Store Cuts Lumia 521 Windows Phone 8 Smartphone Cost to $69 [Video] [Tech: Latinos Post, Dec 26, 2013]

                  The online Microsoft Store has reduced the price of the Nokia Lumia 521 Windows Phone 8 smartphone just as the holiday shopping season ends.

                  Previously priced at $99, the Lumia 521 can be purchased on the online Microsoft Store for $69. Back on Dec. 1, the online Microsoft Store had priced the Lumia 521 at $79.

                  The online Microsoft Store will also ship the Lumia 521 for free.

                  On Amazon, as of Christmas Eve, the Lumia 521 could be purchased for $97.93, which is a savings of $52.06 or 35 percent from the original price. The Lumia 521 was previously listed at $149.99 for its original price.

                  From Nokia Lumia 521 Specs and Price in USA: Amazon, T-Mobile New Price Offers for Lumia 521 Windows Phone 8 Smartphone [Video] [Tech: Latinos Post, Dec 7, 2013]

                  As Latinos Post reported on Dec. 4, Amazon had priced the Lumia 521 Windows Phone 8 smartphone for T-Mobile at $135.99. The price change comes two days after it was priced at $79.95 instead, which was a 47 percent discount from the original listed price of $149.99.

                  As of Dec. 7, the Lumia 521’s price changed again and is slightly better than the previous $135.99 cost. The Lumia 521’s new price tag is $94.97, which is a savings of $55.02 from the original $149.99 list price.

                  From the same article [Tech: Latinos Post, Dec 7, 2013]:

                  The Lumia 521 can also be purchased from the official T-Mobile website with $0 up front. On T-Mobile’s website, the full retail price of the Lumia 521 is $102, which is down from $126 previously listed. With the $0 up front offer, the T-Mobile customer will have to pay $4.25 for the next 24 months. If the consumer cancels the wireless service, the remaining balance on the Windows Phone 8 device will be due. With the purchase of the Lumia 521 from T-Mobile’s official website, the consumer will receive a $20 app credit.
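
                  The installment arithmetic behind these no-subsidy offers is straightforward: the full retail price, minus any down payment, is split into 24 equal payments at 0% APR, and cancelling service makes the remaining balance due at once. A minimal sketch (function names are my own, purely illustrative):

```python
def monthly_installment(full_price, down_payment=0.0, months=24):
    """Equal monthly payment at 0% APR: remaining balance split evenly."""
    return round((full_price - down_payment) / months, 2)

def remaining_balance(full_price, down_payment, monthly, months_paid):
    """Balance due immediately if service is cancelled after `months_paid` payments."""
    return round(full_price - down_payment - monthly * months_paid, 2)

# Lumia 521 at T-Mobile's $102 full retail price, $0 down:
assert monthly_installment(102) == 4.25
# Launch pricing: $29.99 down with $5/month ($29.99 + 24 x $5 = $149.99):
assert monthly_installment(149.99, down_payment=29.99) == 5.0
# Cancelling the $0-down plan after 10 months leaves the rest due:
assert remaining_balance(102, 0, 4.25, 10) == 59.5
```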

                  From Nokia Lumia 520 Specs and Price in Amazon at $30 Off From Original $99.99 Price Tag for AT&T GoPhone Prepaid Plan [Video] [Tech: Latinos Post, Nov 26, 2013]

                  As seen on Amazon, the Lumia 520 Windows Phone 8 was originally offered for $99.99 as part of AT&T’s GoPhone prepaid plans. Amazon, however, reduced the price by $30, or the equivalent of 30 percent.

                  The new price for the Lumia 520 is $69.99, and it comes with free shipping since the item costs more than $35.

                  According to Amazon, the Lumia 520 is ranked as No. 35 in the Cell Phones and Accessories Best Sellers Rank. The Windows Phone 8 device is No. 2 among the No-Contract Cell Phones rank.

                  Meanwhile, AT&T’s official website is offering the Lumia 520 at the original cost with the GoPhone plan. AT&T has the Lumia 520 for $99.99, but the item comes with free shipping.

                  From AdDuplex Windows Phone Statistics Report for October 2013 [AdDuplex blog, Oct 14, 2013]

                  … The raw data analyzed was collected over the day of October 11th, 2013 (UTC time) unless otherwise stated. …

                  United States

                  image

                  The US now has a new leader, and if you combine it with its 520 sibling you can see that almost a quarter of the Windows Phones in the US are now represented by the lower-end 52x line. There’s also a new leading operator for Windows Phone 8.

                  image

                  AT&T has reclaimed the crown from Verizon, and MetroPCS and T-Mobile continue to steal the market share from the 2 leaders.

                  image

                  Windows Phone 8 is now on almost 83% of devices in the US, so if you are targeting the US market, I guess, you can start developing for WP8 only.

                  Canada

                  image

                  I don’t think we’ve covered Canada ever before, so it’s time to fix that. Lumia 520 and 920 almost share the top spot and it’s pretty clear that the smaller sibling will claim the crown next month. What is interesting to see is that Samsung ATIV S is at number 3 with a solid 14%. Not a picture we are used to seeing.

                  T-Mobile US Inc TMUS, Q3 2013 Earnings Call Transcript [Morningstar, Nov 5, 2013]

                  Our Un-carrier strategy continues to separate us from the competition, so let me give you a quick refresher of what we’ve done at a torrid pace so far this year. Un-carrier 1.0, Simple Choice, announced in March, got rid of the annual service contracts and introduced a radically simplified consumer rate plan. Un-carrier 2.0, JUMP!, launched in July, gave consumers the freedom to affordably upgrade their device when they want, not when they are told.

                  We already have more than 2.2 million net enrollments. It’s really working. Simple Choice for Family, launched at the same time as JUMP!, lets families get rid of annual service contracts and offers an affordable service plan with no credit check.

                  Simple Choice for business launched in August extended the benefits that consumers had been enjoying, no annual service contracts and simple rate plan. Businesses of all size can decouple services from the cost of devices to get rid of unpredictability.

                  Un-carrier 3.0, Simple Choice Global, launched in October, made the world our customers’ network as we now offer unlimited data and texting worldwide at no extra charge in over 100 countries.

                  Most recently, Un-carrier 3.0 part 2, where we unleashed tablets with up to 200 megabytes of free 4G LTE data every month. We think it’s insane that most tablet owners don’t even sign up for mobile Internet data services, because they are worried about high fees and overages.

                  Smartphone sales showed continued growth coming in at 5.6 million devices or 88% of total phone units. This is a great trend that will continue as we now have all the leading devices in our lineup.

                  So, the strategy is managing to peel people away from AT&T, Sprint, Verizon, and the other carriers and you’ve seen churns start to elevate at some of the other carriers as the Un-carrier strategy at T-Mobile has taken hold.

                  In Bonn, we talked about the Un-carrier pain point strategy and how it would be successful, and we pointed out that the most vulnerable was AT&T, but not that we would solely take share from AT&T. But any customer that was feeling these pain points and that’s really been the process, but in the beginning obviously a lot of the ease of switching technology-wise were AT&T customers. We’re actually in a new phase now where there has been a lot of discussion recently in the industry about potential network speeds in 2015 or trials of speed in 25% of five Metro markets. What Neville and his team have announced now, are smoking fast industry-leading speeds everywhere now. That clearly along with our full device portfolio and our Un-carrier strategy does make us a threat across the entire market. So we will continue to attack the same foes. …

                  … We’re taking share from the low end and the high end. At the high end, it’s particularly AT&T and Sprint, and at the low end from a variety of prepaid players. I think we’ve described before that the Un-carrier strategy is about establishing a really effective midmarket space with a no trade-offs positioning. Meaning, traditionally people have had to, at the low end, trade off what’s great about wireless to get great value, or at the high end go after one of the top two or three networks but suffer restrictions, lack of value and lack of transparency in pricing, et cetera. What we do, as John just said, is provide a no-apologies, fantastic, market-leading network position backed by the Un-carrier value proposition that’s simple and transparent, fair and flexible. So it’s a midmarket position, and therefore it’s taking share from above and from below, but more from above. So more of the share is coming from AT&T and Sprint so far.

                  T-Mobile drops upfront price of Nokia Lumia 925 and 521 to zero [Neowin, July 26, 2013]

                  T-Mobile just started selling the Nokia 925 in the US less than 10 days ago for $49.99 upfront, but starting tomorrow that new Windows Phone 8 device, and indeed nearly all of the smartphones that T-Mobile sells, will have the low, low upfront price of zero.

                  T-Mobile Promotes Unprecedented Deal This Summer – Zero Dollars Down for All Devices [press release, July 26, 2013]

                  T-Mobile Promotes Unprecedented Deal This Summer – Zero Dollars Down for All Devices

                  America’s Un-carrier offers the ultimate promotion with the lowest upfront price on devices

                  BELLEVUE, Wash. – July 26, 2013 In time for back-to-school, T-Mobile US, Inc. (NYSE: TMUS) will drive an unparalleled promotion this summer, dropping the upfront price on its entire lineup of devices in stores nationwide to zero dollars down. With this promotion, new and existing well-qualified consumers and small business customers will get affordable and hassle-free access to the latest 4G LTE smartphones, tablets, mobile hotspots and feature phones at the upfront price of $0 down with monthly device payments[1].
                  This limited-time promotion is available starting tomorrow, July 27, 2013. In addition to the promotion, customers also can take advantage of T-Mobile’s groundbreaking upgrade program, JUMP!(TM), which enables them to sign up to upgrade their phones when they want, up to twice a year as soon as six months from enrollment.
                  “The number of reasons not to switch to T-Mobile this summer is ZERO,” said John Legere, president and chief executive officer, T-Mobile US. “This is a fantastic offer and we’re making it easier than ever for customers to get the latest amazing devices. Adding Zero Down in addition to JUMP!, and Simple Choice with no contract is all about making wireless work for consumers and shaking up this industry.”
                  The hottest devices of the summer at the lowest upfront cost combined with T-Mobile’s Simple Choice Plan, unlimited data on a nationwide 4G network and no annual service contracts gives customers an opportunity that’s tough to beat. The promotion will be available nationwide at participating T-Mobile retail stores, via customer care, and online at http://www.T-Mobile.com. A selection of the devices included in the promotion is as follows:
                  Device (Down Payment / Monthly Payments, 24 equal monthly payments for 0% APR on approved credit1):
                  – Samsung Galaxy S® 4: $0 down / $25 per month
                  – Samsung Galaxy Note® II: $0 down / $26 per month
                  – Samsung Galaxy S® III: $0 down / $22 per month
                  – Xperia® Z from Sony: $0 down / $25 per month
                  – iPhone 5[2]: $0 down / $27 per month
                  – Nokia Lumia 925: $0 down / $22 per month
                  – Nokia Lumia 521: $0 down / $6 per month
                  – BlackBerry® Q10: $0 down / $25 per month
                  – HTC One®: $0 down / $25 per month
                  – Samsung Galaxy Tab(TM) 2 10.1: $0 down / $20 per month
                  JUMP!
                  JUMP!, only from T-Mobile, offers customers the freedom to upgrade to a new device more frequently and affordably, and it includes handset protection that helps to protect against malfunction, damage, loss or theft – all for just $10 per month, per phone (plus taxes and fees). Customers can upgrade to a new phone, financed through T-Mobile’s Equipment Installment Program (EIP), twice every 12 months after they’ve been in the JUMP! program for six months.[3] Simply trade in an eligible T-Mobile phone in good working condition at a participating store location. Any remaining EIP payments will be eliminated, and current customers can purchase new phones for the same upfront pricing as new customers, with device financing3 and a no-annual-service contract Simple Choice Plan.
                  Simple Choice Plan
                  T-Mobile’s Simple Choice and Simple Choice for Business Plan start with a base rate of $50 per month for unlimited talk, text and Web with 500MB of high-speed data. Customers can get 2.5GB of high-speed data for $10 more per month per line or unlimited data for an additional $20 per month per line. Customers can add a second phone line for $30 per month, and each additional line is just $10 per month. There are no caps and no overages on T-Mobile’s network, and no restrictive annual service contracts.
                  When small business customers activate a new line or renew an existing line of service with a Simple Choice for Business plan (min. 500 MB data required), they can access T-Mobile’s Business Extras bundle for added value. Business Extras customers with capable devices may opt into a free year of 24/7 remote IT support from a well-recognized, third-party service provider and a year of T-Mobile North American Flat-rate Data feature, allowing free monthly access to up to 150MB of overage-free, high-speed data across North America, including Canada and Mexico on our partner networks.[4] Other benefits include two free paper-to-mobile form conversions, waived activation fees, Business Care support, Wi-Fi Calling on enabled devices, and free Smartphone Mobile Hotspot on select rate plans.
                  T-Mobile’s 4G LTE Network
                  T-Mobile’s 4G LTE network now reaches more than 157 million people across the United States and is live in 116 metropolitan areas. T-Mobile remains on target to deliver nationwide 4G LTE coverage by the end of the year, reaching 200 million people in more than 200 metropolitan areas. In addition, T-Mobile’s 4G HSPA+ service is available to 228 million people nationwide. By combining 4G HSPA+ and LTE network technologies, T-Mobile can provide customers with a strong, seamless nationwide 4G network experience[5].
                  About T-Mobile US, Inc.:
                  As America’s Un-carrier, T-Mobile US, Inc. (NYSE: “TMUS”) is redefining the way consumers and businesses buy wireless services through leading product and service innovation. The company’s advanced nationwide 4G and 4G LTE network delivers outstanding wireless experiences for customers who are unwilling to compromise on quality and value. Based in Bellevue, Wash., T-Mobile US operates its flagship brands, T-Mobile and MetroPCS. It currently serves approximately 43 million wireless subscribers and provides products and services through 70,000 points of distribution. For more information, please visit http://www.t-mobile.com.

                  Statement: Nokia Lumia 521 [T-Mobile, May 22, 2013]

                  Beginning May 22, the Nokia Lumia 521 – an exclusive to T-Mobile – will be available online at www.T-Mobile.com, at T-Mobile retail stores, select dealers and national retailers for $29.99 down with 24 equal monthly payments1 of $5 at 0% APR O.A.C. for well-qualified customers with the new Simple Choice Plan.

                  Powered by Windows Phone 8, the Lumia 521, which will run on T-Mobile’s fast nationwide 4G network, is a perfect, everyday smartphone that will embody a range of high-end features at an affordable price.  The smartphone features a super sensitive 4″ touch screen, HD Voice, 5MP camera with auto focus and 720p HD video recording.  It will also include exclusive Nokia applications such as Nokia Music, Cinemagraph, Creative Studio, Panorama, Smart Shoot and HERE Drive, Maps and Transit. 

                  Lumia 521 smartphones sold through T-Mobile will also feature Wi-Fi Calling.  Customers who have already purchased or ordered a Lumia 521 through HSN, Microsoft Retail Stores or Walmart will receive an over-the-air maintenance release beginning May 20, which will enable the Wi-Fi Calling feature.

                  The Nokia Lumia 521 will go on sale at Microsoft Retail Stores for $149 and at Walmart for $129.88 on May 11, as previously announced.  For more information on T-Mobile’s Nokia Lumia 521, please visit the media kit.

                  1 If you cancel wireless service, remaining balance on phone becomes due.

                  T-Mobile’s Growth Is Bad News for Apple [The Motley Fool, Jan 8, 2014]

                  No wonder rival AT&T (NYSE: T) has gotten so aggressive — T-Mobile’s (NYSE: TMUS) “un-carrier” initiatives have proven to be wildly successful. According to new data from Kantar Worldpanel, T-Mobile almost doubled its share of U.S. smartphone sales in the third quarter of 2013, while market leaders AT&T and Verizon lost ground.

                  T-Mobile’s growth flies in the face of industry observers, who have long argued that consumers favor smartphone subsidies. If current trends persist, subsidies could soon become a thing of the past — and that’s not good for Apple (NASDAQ: AAPL) shareholders.

                  T-Mobile ends contracts

                  T-Mobile spent 2013 rolling out a number of initiatives that cumulatively comprise its un-carrier strategy, the most significant of which has been the end of two-year contracts. Starting in 2013, T-Mobile did away with them — new subscribers pay for their service strictly on a month-to-month basis. Because T-Mobile’s subscribers now have the freedom to easily ditch their service, T-Mobile no longer pays for its subscribers’ handsets. T-Mobile customers can buy their phones in full, or pay them off in monthly installments, but either way, they’re paying the full retail price.

                  This stands in stark contrast to the business model long championed by T-Mobile’s rivals, including AT&T. Under the standard, two-year contract model, carriers foot the bill for much of their subscribers’ handsets, but lock them up with a contract.

                  AT&T warns smartphone subsidies are coming to an end
                  AT&T, however, could soon join T-Mobile in ditching subsidies. Last month, AT&T’s CEO warned that the current smartphone subsidy model cannot continue to persist.

                  AT&T still offers subsidies and contracts for now, but in December the carrier rolled out a new initiative structured much like T-Mobile’s. AT&T’s new “Mobile Share Value” plan lets subscribers pay for their service on a month-to-month basis but doesn’t cover the cost of their phone. Last week AT&T went further, offering to give T-Mobile subscribers up to $450 in credit if they switched to one of AT&T’s new plans.

                  Based on Kantar’s recent numbers, AT&T has reason to shake up its business: last quarter, AT&T sold just 28.3% of new US smartphones, down from nearly 35% in the same quarter last year.

                  Apple’s iPhone business remains subsidy-dependent
                  A war between the two companies is good for consumers, but potentially bad for Apple. The King of Cupertino still derives the vast majority of its profit from the iPhone, and sales could take a hit if smartphone subsidies go away.

                  Although Apple has just slightly more than 13% of the global smartphone market, it sells more than 43% of the smartphones in the U.S. The reason for the discrepancy comes down to subsidies — U.S. carriers’ willingness to heavily subsidize Apple’s iPhones has made them affordable to U.S. consumers. In most other countries around the world, smartphone subsidies are uncommon.

                  The pricing pressures that drive global consumers to pick alternative handsets could apply in the U.S. if smartphone subsidies go away. Without subsidies, the $350 Nexus 5 looks much more attractive when compared to Apple’s $649 iPhone 5s. And even if consumers continue to buy Apple-made handsets, they could choose to hold on to their old models for longer, resulting in much longer upgrade cycles.

                  Watch for other carriers to copy T-Mobile
                  T-Mobile’s incredible success has shown rival carriers that it’s possible to attract customers even without offering smartphone subsidies. With AT&T’s sales on the decline (according to Kantar), the carrier seems to be following T-Mobile’s lead in abandoning subsidies.

                  If this trend spreads to other major U.S. carriers, Apple shareholders should be concerned.

                  2014 will be the last year for Microsoft to make sufficient changes to its smartphone and tablet strategies, and those changes should be radical if the company wants to succeed with its devices and services strategy

                  For the company’s most recent “ONE Microsoft” strategy see:
                  Microsoft reorg for delivering/supporting high-value experiences/activities [‘Experiencing the Cloud’, July 11, 2013]
                  How the device play will unfold in the new Microsoft organization? [‘Experiencing the Cloud’, July 14, 2013]
                  Update: There are extremely worrying signs on the horizon as per Jan 27, 2014:
                  MediaTek MT6592-based True Octa-core superphones are on the market to beat Qualcomm Snapdragon 800-based ones UPDATE: from $147+ in Q1 and $132+ in Q2
                  End of the Nokia “magic” hurting European and Asian consumers while mobile carriers are uncertain about the future under the Microsoft brand
                  End of Update
                  As 2014 will be the last year of a “free ride” in the smartphone and tablet spaces for ARM-based competitors of Intel – at least that is what Intel is insisting again [‘Experiencing the Cloud’, Jan 17, 2014] – it is time to summarize the ARM-based opportunities for 2014 (note that Intel’s goal in the tablet space is only 40 million units, both Android and Windows):

                  image

                  Compare everything to the 2014 global notebook demand forecast [DIGITIMES Research, Dec 5, 2013], which estimates that global notebook shipments in 2014 will reach around 160 million units, down from a peak of over 200 million in 2011; the drop in 2014 will be smaller than the on-year drop in 2013, with new market developments, new product opportunities, and changes in the major players’ strategies all playing critical roles in the IT industry’s future trends.

                  Digitimes Research: Global smartphone shipments to top 1.24 billion units in 2014 [Jan 14, 2014]
                  Global smartphone shipments are expected to top 1.24 billion units in 2014, with Samsung Electronics, Apple, LG Electronics, Sony Mobile Communications, Lenovo, Huawei [according to the company: 52 million units in 2013 vs a 60 million target], Microsoft, ZTE, Coolpad and TCL serving as the top-10 vendors, according to Digitimes Research.
                  Apple may see its shipments double in 2014 largely due to increased shipments to China and Japan as it will benefit from its cooperation with the largest telecom operators in the two countries, said Digitimes Research.
                  The growth rate for Samsung will be limited in 2014 as its sales in the US, China and Japan will be depressed by growing popularity of iPhones.
                  China-based Lenovo, Huawei and Coolpad are expected to step up their efforts to boost sales in overseas markets after being enlisted among the top-10 vendors due to higher shipment volumes in the home market in China.
                  However, TCL and ZTE will continue to ship smartphones to overseas markets mainly, but will also strengthen sales in China, with domestic sales to account for less than 50% of their total shipments in 2014, commented Digitimes Research.
                  This article is an excerpt from a Digitimes Research Special Report (2014 global smartphone market forecast).
                  Digitimes Research: China smartphone-use application processor shipments edge up 2.4% in 4Q13 [Jan 15, 2014]
                  Shipments of application processors for smartphone applications to China grew 2.4% sequentially and 20.8% on year in the fourth quarter of 2013, according to data compiled by Digitimes Research.
                  MediaTek saw its AP shipments decline 3.9% sequentially in the fourth quarter due to inventory checks at clients and a high growth recorded in the previous quarter.
                  However, it was the 20% sequential shipment decline suffered by Qualcomm in the fourth quarter that weakened the growth momentum of the application processor sector, said Digitimes Research.
                  Meanwhile, MediaTek has been shifting its focus to the high-margin segment, instead of seeking high shipment growth. China-based Spreadtrum Communications was hit with high inventory of TD-SCDMA chips and slow sales of its dual- and quad-core solutions, Digitimes Research indicated.
                  Qualcomm also saw its performance weaken in the fourth quarter as its QRD (Qualcomm reference design) chips were less competitive than those offered by rivals in terms of product features.
                  This article is an excerpt from a Chinese-language Digitimes Research report. Click here if you are interested in receiving more information about the content and price of a translated version of the full report.
                  Digitimes Research estimates that in 2014 global tablet shipments will reach 289 million units [Dec 31, 2013]
                  China white-box makers add extra value to tablets as cost reduction is no longer possible [DIGITIMES Research, Jan 16, 2014]
                  China white-box players have not been able to lower their Wi-Fi-based tablets’ prices since the third quarter of 2013 because there is no room for further reductions in their BOM costs.
                  The average BOM cost for a white-box tablet – most of which adopt a dual-core processor – stood at about US$25 as of the fourth quarter of 2013. Dual-core processor pricing could not drop any further, as average prices came to about US$4, less than US$1 higher than that of a single-core processor.
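                  As a quick illustration of why there is so little room left, the rough figures quoted above can be put into a few lines of arithmetic (approximate values inferred from the article, not exact Digitimes data):

```python
# Rough illustration of the white-box tablet BOM figures quoted above
# (US dollars, Q4 2013; approximate values, not exact Digitimes data).
bom_total = 25.0        # average BOM for a dual-core white-box tablet
dual_core_ap = 4.0      # average dual-core application processor price
single_core_ap = 3.0    # "less than US$1 higher" implies roughly US$3

premium = dual_core_ap - single_core_ap
print(premium)                   # ~1.0: almost no processor cost left to cut
print(dual_core_ap / bom_total)  # 0.16: the processor is ~16% of the BOM
```

With the processor already down to about 16% of a US$25 BOM, the only lever left is adding value (phone functions, better panels) rather than cutting cost.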
                  Memory and 7-inch TN LCD panels are the two key components that account for major shares of white-box tablet BOM costs. However, most panel suppliers have been only willing to upgrade specifications instead of dropping their quotes, and therefore, white-box players have been left with upgrading their devices with better panels without an option of reducing the panel cost.
                  While cost reduction is no longer a feasible way to attract consumers, many white-box players have turned to pushing tablets with phone functions to increase their devices’ functionality and value. The devices also provide higher gross margins for vendors.
                  Digitimes Research estimates that currently, 80% of white-box tablets are available in countries other than China, because white-box tablets with phone functions have seen rising demand in Russia and other markets in Eastern Europe and Southeast Asia since the second half of 2013.
                  China white-box players’ partnerships with regional brand vendors in emerging markets have also helped raise local consumers’ demand for tablets with phone functions.
                  In the first half of 2013, most white-box tablets with phone functions adopted China-based Allwinner Technology’s solution which combined an entry-level single-core processor with a discrete baseband module. However, many white-box device makers have turned to MediaTek solutions for their tablets since the second half of 2013 after the Taiwan-based chipmaker also integrated a baseband chip into its tablet processor solution.
                  MediaTek’s solution is more expensive, but its support for product development and hardware design has given it an upper hand over competitors. Meanwhile, independent design houses (IDHs), which provide white-box players with product design services, also started to design tablets using MediaTek’s smartphone processors in the second half of 2013, which prompted white-box players to adopt MediaTek’s solutions.
                  Digitimes Research estimates that tablets with phone functions will account for 40% of 7-inch white-box tablet shipments in 2014, up from 20% in 2013.

                  In 2014, smartphones are expected to continue penetrating rapidly into emerging markets such as Russia, India, Indonesia and Latin America, while China’s smartphone shipments will see weakened on-year growth in the year, but still enormous volume. Within the top-10 smartphone vendors in 2013, four of them are from China and in 2014 more China-based vendors are expected to enter the top 10.
                  Three China-based handset vendors increase component deliveries [DIGITIMES, Dec 11, 2013]
                  China-based handset vendors Xiaomi Technology, Gionee and Hisense have been taking increasing deliveries of panels and touch panels from suppliers in preparation for launching new models during the peak period before the 2014 Lunar New Year at the end of January, according to Taiwan-based supply chain makers.
                  Other China-based vendors including Lenovo, Huawei Device and Oppo have begun to follow suit, the sources indicated.
                  Xiaomi has seen success in marketing its high-end Xiaomi 3, mid-range Xiaomi 2S and entry-level Hong-mi (Red Rice), the sources noted.
                  Gionee focuses on marketing high-end smartphones priced above CNY2,000 (US$328) through general retail chains without cooperation with China’s three mobile telecom carriers, the sources indicated. Gionee has shipped more than two million smartphones a quarter so far in 2013.
                  Hisense is among several licensed vendors of 4G smartphones and has launched the 5-inch X6T, its first 4G smartphone featuring TD-LTE, LTE-FDD, TD-SCDMA, WCDMA and GSM, on 12 frequency bands, the sources noted. Hisense has taken delivery of components for use in more than one million handsets to be launched before the 2014 Lunar New Year, the sources noted.
                  China market: Xiaomi lowers price for Hongmi smartphone [DIGITIMES, Jan 7, 2014]
                  China-based vendor Xiaomi Technology has reduced the retail price for its budget TD-SCDMA smartphone, the Hongmi, launched in August 2013, from CNY799 (US$132) to CNY699, heralding upcoming competition in the Android smartphone segment in China, according to industry watchers.
                  Rival vendor Huawei is likely to counteract by slashing the prices of its Honor-branded budget smartphones, while other local brands in China are also expected to follow suit soon, said the observers.
                  Leveraging its policy of offering smartphones with high hardware specifications at low prices, Xiaomi has managed to ramp up its shipments to over three million units a month and is expected to ship over 40 million smartphones in 2014, the sources estimated. [According to Xiaomi: “7.2 million devices … in 2012 and 18.7 million … bought in 2013. … for 2014 – the CEO expects forty million Xiaomi smartphones to be bought”]
                  Asustek expected to ship 2014 target of 5 million smartphones [DIGITIMES, Jan 7, 2014]
                  Asustek Computer unveiled three ZenFone-series smartphones at the opening of CES 2014. Since the ZenFone models have comparatively high price-performance ratios, Asustek will be able to hit its target shipments of five million smartphones for 2014, and is likely to ship 8-10 million units, according to market analysts.
                  The three ZenFone models will initially launch in the Taiwan, China and Southeast Asia markets in March at contract-free retail prices of US$99 for the 4-inch model, US$149 for the 5-inch, and US$199 for the 6-inch.
                  All three models are equipped with Intel Atom processors and Asustek will launch 3-4 models also with Atom processors in the second half of 2014, the sources indicated.
                  Since Intel has offered incentives to attract PC vendors to adopt its platforms for smartphones, Asustek is expected to procure Atom processors at discount prices and receive subsidies from Intel for marketing the devices, the sources said.
                  Asustek likely to release smartphone orders to China ODMs in 2H14, says paper [DIGITIMES, Jan 15, 2014]
                  Asustek Computer does not rule out the possibility of tying up with handset ODMs in China for the production of smartphones in the second half of 2014, the Chinese-language Economic Daily News (EDN) has quoted company CEO Jerry Shen as saying.
                  After unveiling five new models at the recently concluded CES 2014, Asustek plans to launch another five smartphones in the second half of the year, and therefore it needs more ODMs to support production, Shen was quoted as indicating.
                  The three ZenFone-series smartphones out of the five models unveiled by Asustek at CES 2014, with displays sized in 4-, 5-, and 6-inch, will be available for US$99, US$149 and US$199 unlocked, respectively, and are designed to take on China-based rivals in the entry-level smartphone segment.
                  The possible switch of orders to China-based ODMs may affect its current production partners in Taiwan, including Wistron and Pegatron, said the paper.
                  Digitimes Research: Asustek ZenFone smartphones have lower price-performance ratios than comparable models from China [Jan 17, 2014]
                  Asustek Computer unveiled three ZenFone-series smartphones at CES 2014 and will initially launch the models in the Taiwan, China and Southeast Asia markets in March with prices comparable to low-cost models offered by China-based Xiaomi Technology and Huawei. But the price-performance ratios of the ZenFones will still be lower than those of rival models from China-based vendors due to the use of different marketing channels, according to Digitimes Research.
                  China-based vendors such as Huawei and Coolpad have been duplicating the business model initiated by Xiaomi by introducing entry-level models with higher hardware specifications and marketing the gadgets mainly through the Internet.
                  Leveraging subsidies offered by telecom operators, Asustek has been able to lower prices for its ZenFone models to levels comparable to those offered by Xiaomi, Huawei and Coolpad, but the price-performance ratios are lower than those of the Hongmi smartphone from Xiaomi, the Honor 3C from Huawei and the Great God F1 from Coolpad, due to markup costs added by channel operators in China selling the ZenFones.
                  Due to the lower price-performance ratios, Asustek’s goal of shipping over five million smartphones in 2014 through a low-pricing model remains hard to achieve, commented Digitimes Research.
                  This article is an excerpt from a Chinese-language Digitimes Research report. Click here if you are interested in receiving more information about the content and price of a translated version of the full report.
                  Total: ~289+ million
                  Apple: 80-90 million
                  Non-Apple brand vendors: ~105+ million
                  – Samsung: 60-70 million
                  Whitebox vendors: ~104 million
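                  Taking midpoints of the ranges listed above, the breakdown roughly reconciles with the quoted total (a back-of-the-envelope check, not Digitimes’ own calculation):

```python
# Back-of-the-envelope check of the 2014 tablet shipment breakdown above,
# in millions of units, using midpoints of the quoted ranges.
apple = (80 + 90) / 2      # Apple: 80-90 million
non_apple_brands = 105     # non-Apple brand vendors: ~105+ million
whitebox = 104             # white-box vendors: ~104 million

total = apple + non_apple_brands + whitebox
print(total)  # 294.0, in line with the quoted total of ~289+ million
```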
                  Apple, Samsung expected to ship 80-90 million, 60-70 million tablets in 2014, say sources [DIGITIMES, Jan 17, 2014]
                  Apple and Samsung Electronics will remain as the global top-two tablet vendors in 2014 with expected shipments of 80-90 million and 60-70 million units, respectively, according to Taiwan-based supply chain makers.
                  Samsung’s recent launch of its 12.2-inch model is expected to propel Apple to accelerate development of large-size iPads. Market sources indicated that Apple is likely to release a 12.9-inch model by the end of the third quarter at the earliest.
                  The two vendors are also expected to continue rolling out new versions of their existing models.
                  Samsung is also likely to launch more Galaxy Lite models, with prices going down as low as US$129, the sources indicated, adding that Samsung’s tablet shipments in 2014 are expected to reach 60-70 million units compared to 40 million shipped in 2013.
                  Meanwhile, Apple reportedly has asked its production partners and component suppliers to develop new models of 7.9- and 9.7-inch tablets, added the sources.
                  Foxconn expected to ship 55-60 million tablets in 2014, say Taiwan makers [DIGITIMES, Jan 16, 2014]
                  Foxconn Electronics (Hon Hai Precision Industry) shipped 50 million tablets to become the largest Taiwan-based ODM in 2013 and is expected to ship 55-60 million units to maintain the leading status in 2014, according to supply chain makers.
                  Foxconn is the main OEM for iPads and has undertaken ODM production of Amazon tablets, the sources noted.
                  Pegatron, with orders for iPad, Surface and tablets launched by Asustek Computer, shipped 25 million units in 2013 and is expected to remain as the second-largest ODM with shipments of 25-28 million units in 2014, the sources indicated.
                  With Lenovo and Acer being major clients, Compal Electronics shipped seven million tablets in 2013. With potential OEM orders for iPad mini with Retina display and additional ODM orders from Amazon, Compal is likely to ship 14 million tablets in 2014, the sources estimated.
                  Quanta Computer shipped 15-16 million tablets in 2013, of which a large portion were Nexus models for Google, the sources noted. Although Quanta may obtain OEM orders for a 12.9-inch iPad, shipments in 2014 will be low volume, the sources indicated. Therefore, Quanta’s 2014 tablet shipments are expected to remain at 15-16 million units.
                  Digitimes Research: Non-Apple brand vendors to ship 105 million tablets in 2014 [Nov 19, 2013]
                  Global tablet shipments are expected to reach 289 million units in 2014, up 23.6% on year. The growth, however, will be weaker than that for smartphones due to the fact that the tablet market has already entered the maturity stage, according to Digitimes Research’s latest figures.
                  In 2014, non-Apple first-tier brand vendors’ products are expected to have more room for price cuts, making their products even more competitive in China than their white-box competitors. The lower pricing means retailers will be more eager to promote their products. The gap in terms of functionality between Google’s official Android operating system and the Android Open Source Project (used mostly by China white-box vendors) is also expected to widen. As a result, the non-Apple first-tier vendors’ combined shipments are expected to grow dramatically to 105 million units in 2014, slightly surpassing China white-box vendors’ combined shipments of 104 million units, according to estimates by Digitimes Research.
                  Although the fifth-generation iPad (Air) is expected to attract consumers and stimulate replacement demand, the device’s high pricing is expected to limit iPad series products’ shipment growth in 2014, with the volume reaching only 80 million units.
                  As for brand vendors’ rankings, Apple and Samsung Electronics will remain in the top two in 2014. Since Samsung will adopt more aggressive marketing and pricing strategies in 2014, its shipments will reach 52.5 million units, reducing its gap with the market leader Apple. Lenovo, as the largest PC vendor worldwide and with advantages in its home market of China, is expected to ship 9.5 million units in 2014 to take third place in the tablet market.
                  Having failed to obtain orders for the next-generation Google Nexus tablets, Asustek Computer is expected to step up promoting its own-brand tablets, and it will ship nine million units in 2014, becoming the fourth largest vendor.
                  Acer will have a strong presence in the entry-level segment, shipping 6.7 million tablets in 2014 to take sixth place, while Google will be the fifth largest vendor. Amazon‘s shipments [#5 in 2013 with 5.45 million units] and Microsoft‘s shipments [max ~2-3 million of Surface Pro and ??? of Surface] will stay flat or grow only slightly on year.
                  Digitimes Research expects 7-inch models to remain as the mainstream size for branded tablets in 2014 with shipments set to reach 89.1 million units. But the segment’s share of total tablet shipments will drop below 50%. Brand vendors are expected to place more emphasis on 8-inch models as they look to avoid fierce competition in the 7-inch segment, which is crowded with low-price and white-box products. Shipments to the 8-inch segment are expected to reach 30 million units in 2014, triple the volume in 2013 and surpassing 10-inch models’ 25.4 million units.
                  As for Taiwan ODMs, their shipments will hit 117 million units in 2014, accounting for only 63% of the global total, down 5.2pp on year. The share will decline because Samsung and Lenovo, the second and third largest vendors, are making most of their tablets internally.
                  To seek lower manufacturing quotes and to diversify risks, brand vendors are expected to further divide their tablet orders among ODMs. Foxconn Electronics (Hon Hai Precision Industry) and Pegatron Technology will remain as the top two ODMs for tablets in 2014. With more orders coming from Apple and Asustek, Pegatron will see significant tablet shipment growth in 2014, narrowing its gap with Foxconn. Compal Electronics is expected to surpass Quanta Computer to become the third-largest tablet ODM, thanks to orders from Apple and its acquisition of Compal Communication.
                  This article is an excerpt from a Digitimes Research Special Report (2014 global tablet demand forecast). Visit our latest Special reports.

                  More information (going back to the end of July 2013) directly related to the possible changes in the 2014 markets, in terms of 2014 will be the last year of “free ride” in the smartphone and tablet spaces for ARM-based competitors of Intel – at least that is what Intel is insisting again [‘Experiencing the Cloud’, Jan 17, 2014]:

                  Microsoft products for the Cloud OS

                  Part of: Microsoft Cloud OS vision, delivery and ecosystem rollout

                  1. The Microsoft way
                  2. Microsoft Cloud OS vision
                  3. Microsoft Cloud OS delivery and ecosystem rollout
                  4. Microsoft products for the Cloud OS

                  4.1 Windows Server 2012 R2 & System Center 2012 R2
                  4.2 Unlock Insights from any Data – SQL Server 2014
                  4.3 Unlock Insights from any Data / Big Data – Microsoft SQL Server Parallel Data Warehouse (PDW) and Windows Azure HDInsights
                  4.4 Empower people-centric IT – Microsoft Virtual Desktop Infrastructure (VDI)
                  4.5 Microsoft talking about Cloud OS and private clouds: starting with Ray Ozzie in November, 2009 (separate post)

                  4.5.1 Tiny excerpts from official executive and/or corporate communications
                  4.5.2 More official communications in details from executives and/or corporate

                  4.1 Windows Server 2012 R2 & System Center 2012 R2 [MPNUK YouTube channel, Nov 18, 2013]

                  Hosting technical training overview.

                  Windows Server 2012 R2: 0:00
                  Server Virtualization: 4:40
                  Storage: 11:07
                  Networking: 17:37
                  Server Management and Automation: 23:14
                  Web and Application Platform: 27:05

                  System Center 2012 R2: 31:14
                  Infrastructure Provisioning: 36:15
                  Infrastructure Monitoring: 42:48
                  Automation and Self-service: 45:30
                  Application Performance Monitoring: 48:50
                  IT Service Management: 51:05

                  More information is in the What’s New in 2012 R2 [Windows Server 2012 R2, System Center 2012 R2] series of “In the Cloud” articles by Brad Anderson:

                  Over the last three weeks, Microsoft has made an exciting series of announcements about its next wave of products, including Windows Server 2012 R2, System Center 2012 R2, SQL Server 2014, Visual Studio 2013, Windows Intune and several new Windows Azure services. The preview bits are now available, and the customer feedback has been incredible!

                  The most common reaction I have heard from our customers and partners is that they cannot believe how much innovation has been packed into these releases – especially in such a short period of time. There is a truly amazing amount of new value in these releases and, with this in mind, we want to help jump-start your understanding of the key scenarios that we are enabling.

                  As I’ve discussed this new wave of products with customers, partners, and press, I’ve heard the same question over and over: “How exactly did Microsoft build and deliver so much in such a short period of time?” My answer is that we have modified our own internal processes in a very specific way: We build for the cloud first.

                  A cloud-first design principle manifests itself in every aspect of development; it means that at every step we architect and design for the scale, security and simplicity of a high-scale cloud service. As a part of this cloud-first approach, we assembled a ‘Scenario Focus Team’ that identified the key user scenarios we needed to support – this meant that our engineers knew exactly what needed to be built at every stage of development, thus there was no time wasted debating what happened next. We knew our customers, we knew our scenarios, and that allowed all of the groups and stakeholders to work quickly and efficiently.

                  The cloud-first design approach also means that we build and deploy these products within our own cloud services first and then deliver them to our customers and partners. This enables us to first prove-out and battle-harden new capabilities at cloud scale, and then deliver them for enterprise use. The Windows Azure Pack is a great example of this: In Azure we built high-density web hosting where we could literally host 5,000 web servers on a single Windows Server instance. We exhaustively battle-hardened that feature, and now you can run it in your datacenters.

                  At Microsoft we operate more than 200 cloud services, many of which are servicing hundreds of millions of users every day. By architecting everything to deliver at that kind of scale, we are sure to meet the needs of enterprises anywhere and in any industry.

                  Our cloud-first approach was unique for another reason: It was the first time we had common/unified planning across Windows Client, Windows Server, System Center, Windows Azure, and Windows Intune. I know that may sound crazy, but it’s true – this is a first. We spent months planning and prioritizing the end-to-end scenarios together, with the goal of identifying and enabling all the dependencies and integration required for an effort this broad. Next we aligned on a common schedule with common engineering milestones.

                  The results have been fantastic. Last week, within 24 hours, we were able to release the preview bits of Windows Client 8.1, Windows Server 2012 R2, System Center 2012 R2, and SQL Server 2014.

                  By working together throughout the planning and build process, we established a common completion and Release to Manufacturing date, as well as a General Availability date. Because of these shared plans and development milestones, by the time we started the actual coding, the various teams were well aware of each dependency and the time to build the scenarios was much shorter.

                  The bottom-line impact of this Cloud-first approach is simple:  Better value, faster.

                  This wave of products demonstrates that the changes we’ve made internally allow us to deliver more end-to-end scenarios out of the box, and each of those scenarios are all delivered at a higher quality.  This cloud-first approach also helps us deliver the Cloud OS vision that drives the STB business strategy.

                  The story behind the technologies that support the Cloud OS vision is an important part of how we enable customers to embrace cloud computing concepts.  Over the next eight weeks, we’ll examine in great detail the three core pillars (see the table below) that support and inspire these R2 products:  Empower People-centric IT, Transform the Datacenter, and Enable Modern Business Apps.  The program managers who defined these scenarios and worked within each pillar throughout the product development process have authored in-depth overviews of these pillars and their specific scenarios, and we’ll release those on a weekly basis.

                  Pillar – Scenarios

                  Empower People-centric IT – People-centric IT (PCIT) empowers each person you support to work virtually anywhere on PCs and devices of their choice, while providing IT with an easy, consistent, and secure way to manage it all. Microsoft’s approach helps IT offer a consistent self-service experience for people, their PCs, and their devices while ensuring security. You can manage all your client devices in a single tool while reducing costs and simplifying management.

                  Transform the Datacenter – Transforming the datacenter means driving your business with the power of a hybrid cloud infrastructure. Our goal is to help you leverage your investments, skills and people by providing a consistent datacenter and public cloud services platform, as well as products and technologies that work across your datacenter and service provider clouds.

                  Enable Modern Business Apps – Modern business apps live and move wherever you want, and Microsoft offers the tools and resources that deliver industry-leading performance, high availability, and security. This means boosting the impact of both new and existing applications, and easily extending applications with new capabilities – including deploying across multiple devices.

                  The story behind these pillars and these products is an important part of our vision for the future of corporate computing and the modern datacenter, and in the following post, David B. Cross, the Partner Director of Test and Operations for Windows Server, shares some of the insights the Windows Server & System Center team have applied during every stage of our planning, build, and deployment of this awesome new wave of products.

                  People want access to information and applications on the devices of their choice. IT needs to keep data protected without breaking the budget. Learn how the Microsoft People-centric IT vision helps businesses address their consumerization of IT challenges. Learn More: http://www.microsoft.com/en-us/server-cloud/cloud-os/pcit.aspx
                  Hear from Dell and Accenture how Microsoft Windows Server 2012 R2 and System Center 2012 R2 enable a more flexible workstyle and people-centric IT through virtual desktop infrastructure (VDI). Find solutions and services from partners that span the entire stack of Microsoft Cloud OS products and technologies: http://www.microsoft.com/en-us/server-cloud/audience/partner.aspx#fbid=631zRfiT0WJ

                  The modern workforce isn’t just better connected and more mobile than ever before, it’s also more discerning (and demanding) about the hardware and software used on the job. While company leaders around the world are celebrating the increased productivity and accessibility of their workforce, the exponential increase in devices and platforms that the workforce wants to use can stretch a company’s infrastructure (and IT department!) to its limit.

                  If your IT team is grappling with the impact and sheer magnitude of this trend, let me reiterate a fact I’ve noted several times before on this blog: The “Bring Your Own Device” (BYOD) trend is here to stay.

                  Building products that address this need is a major facet of the first design pillar I noted last week: People-centric IT (PCIT).

                  In today’s post (and in each one that follows in this series), this overview of the architecture and critical components of the PCIT pillar will be followed by a “Next Steps” section at the bottom. The “Next Steps” will include a list of new posts (each one written specifically for that day’s topic) developed by our Windows Server & System Center engineers. Every week, these engineering blogs will provide deep technical detail on the various components discussed in this main post. Today, these blogs will systematically examine and discuss the technology used to power our PCIT solution.

                  The PCIT solution detailed below enables IT Professionals to set access policies to corporate applications and data based on three incredibly important criteria:

                  1. The identity of the user
                  2. The user’s specific device
                  3. The network the user is working from
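                  As a rough sketch of how such a policy might be expressed in code (all names and rules here are hypothetical illustrations, not an actual Microsoft API), an access check over these three criteria could look like this:

```python
# Hypothetical sketch of a conditional-access check over the three
# criteria listed above: user identity, device, and network location.
# Names and rules are illustrative only, not an actual PCIT API.
from dataclasses import dataclass

@dataclass
class AccessRequest:
    user: str
    device_registered: bool   # is the device known/enrolled with IT?
    network: str              # e.g. "corporate" or "internet"

def allow_access(req: AccessRequest, allowed_users: set) -> bool:
    # 1. the identity of the user
    if req.user not in allowed_users:
        return False
    # 2. the user's specific device: unregistered devices get no access
    if not req.device_registered:
        return False
    # 3. the network the user is working from: only trusted networks
    return req.network == "corporate"

print(allow_access(AccessRequest("alice", True, "corporate"), {"alice"}))  # True
print(allow_access(AccessRequest("alice", True, "internet"), {"alice"}))   # False
```

A real solution would of course layer richer rules (per-application policies, partial access from untrusted networks, and so on) on top of this basic three-factor decision.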

                  What’s required here is a single management solution that enables specific features where control is necessary and appropriate, and that also provides what I call “governance,” or light control when less administration is necessary. This means a single pane of glass for managing PCs and devices. Far too often I meet with companies that have two separate solutions running side-by-side – one for every PC, and a second to manage devices. Not only is this more expensive and more complex, it creates two disjointed experiences for end users and a big headache for the IT pros responsible for managing them.

                  In today’s post, Paul Mayfield, the Partner Program Manager for the System Center Configuration Manager/Windows Intune team, discusses how everything that Microsoft has built with this solution is focused on creating the capability for IT teams to use the same System Center Configuration Manager that they already have in place managing their PCs and now extend this management power to devices. This means double the management capabilities from within the same familiar console. This philosophy can be extended even further by using Windows Intune to manage devices where they live – i.e. cloud-based management for cloud-based devices. Cloud-based management is especially important for user-owned devices that need regular updates.

                  This is an incredible solution, and the benefit and ease of use for you, the consumer, is monumental.

                  People want access to corporate applications from anywhere, on whatever device they choose—laptop, smartphone, tablet, or PC. IT departments are challenged to provide consistent, rich experiences across all these device types, with access to native, web, and remote applications or desktops. In this video we take a look at how IT can enable people to choose their devices, reduce costs and complexity, as well as maintain security and compliance by protecting data and having comprehensive settings management across platforms.

In today’s post, we tackle a common question I get from customers: “Why move to the cloud right now?” Recently, however, this question has changed a bit to, “What should I move to the cloud first?”

An important thing to keep in mind with either of these questions is that every organization has its own unique journey to the cloud. There are a lot of different workloads that run on Windows Server, and the reality is that these various workloads are moving to the cloud at very different rates. Web servers, e-mail and collaboration are examples of workloads moving to the cloud very quickly. I believe that management, and the management of smart devices, will be one of the next workloads to make that move to the cloud – and, when the time comes, that move will happen fast.

                  Using a SaaS solution is a move to the cloud, and taking this approach is a game changer because of its ability to deliver an incredible amount of value and agility without an IT pro needing to manage any of the required infrastructure.

                  Cloud-based device management is a particularly interesting development because it allows IT pros to manage this rapidly growing population of smart, cloud-connected devices, and manage them “where they live.” Today’s smart phones and tablets were built to consume cloud services, and this is one of the reasons why I believe that a cloud-based management solution for them is so natural. As you contemplate your organization’s move to the cloud, I suggest that managing all of your smart devices from the cloud should be one of your top priorities.

I want to be clear, however, about the nature of this kind of management: We believe that there should be one consistent management experience across PCs and devices.

                  Achieving this single management experience was a major focus of these 2012 R2 releases, and I am incredibly proud to say we have successfully engineered products which do exactly that. The R2 releases deliver this consistent end-user experience through something we call the “Company Portal.” The Company Portal is already deployed here at Microsoft, and it is what we are currently using to upgrade our entire workforce to Windows 8.1. I’ve personally used it to upgrade my desktop, laptop, and Surface – and the process could not have been easier.

                  In this week’s post, Paul Mayfield, the Partner Program Manager for System Center Configuration Manager/Windows Intune, and his team return to discuss in deep technical detail some of the specific scenarios our PCIT [“People Centric IT”] team has enabled (cloud-based management, Company Portal, etc.).

                  Cloud computing is bringing new opportunities and new challenges to IT. Learn how Microsoft can help transform your datacenter to take advantage of the vast possibilities of the cloud while leveraging your existing resources. Learn more: http://www.microsoft.com/en-us/server-cloud/cloud-os/modern-data-center.aspx
                  • Part 4, July 24, 2013: Enabling Open Source Software


There are a lot of great surprises in these new R2 releases – things that are going to make a big impact in a majority of IT departments around the world. Over the next four weeks, the 2012 R2 series will cover the 2nd pillar of this release: Transform the Datacenter. In these four posts (starting today) we’ll cover many of the investments we have made that better enable IT pros to transform their datacenter via a move to a cloud-computing model.

                  This discussion will outline the ambitious scale of the functionality and capability within the 2012 R2 products. As with any conversation about the cloud, however, there are key elements to consider as you read. Particularly, I believe it’s important in all these discussions – whether online or in person – to remember that cloud computing is a computing model, not a location. All too often when someone hears the term “cloud computing” they automatically think of a public cloud environment. Another important point to consider is that cloud computing is much more than just virtualization – it is something that involves change: Change in the tools you use (automation and management), change in processes, and a change in how your entire organization uses and consumes its IT infrastructure.

Microsoft is unique in this perspective, and it is leading the industry with its investments to deliver consistency across private, hosted and public clouds. Over the course of these next four posts, we will cover our innovations in the infrastructure (storage, network, compute) in both on-premises and hybrid scenarios, support for open source, the cloud service provider & tenant experience, and much, much more.

                  As I noted above, it simply makes logical sense that running the Microsoft workloads in the Microsoft Clouds will deliver the best overall solution. But what about Linux? And how well does Microsoft virtualize and manage non-Windows platforms, in particular Linux?  Today we’ll address these exact questions.

                  Our vision regarding other operating platforms is simple: Microsoft is committed to being your cloud partner. This means end-to-end support that is versatile, flexible, and interoperable for any industry, in any environment, with any guest OS. This vision ensures we remain realistic – we know that users are going to build applications on open source operating systems, so we have built a powerful set of tools for hosting and managing them.

                  A great deal of the responsibility to deliver the capabilities that enable the Microsoft Clouds (private, hosted, Azure) to effectively host Linux and the associated open source applications falls heavily on the shoulders of the Windows Server and System Center team. In today’s post Erin Chapple, a Partner Group Program Manager in the Windows Server & System Center team, will detail how building the R2 wave with an open source environment in mind has led to a suite of products that are more adaptable and more powerful than ever.

                  As always in this series, check out the “Next Steps” at the bottom of this post for links to a variety of engineering content with hyper-technical overviews of the concepts examined in this post.

Back during the planning phase of 2012 R2, we carefully considered where to focus our investments for this release wave, and we chose to concentrate our efforts on enabling Service Providers to build out a highly-available, highly-scalable IaaS infrastructure on cost-effective hardware. With the innovations we have driven in storage, networking, and compute, we believe Service Providers can now build out an IaaS platform that enables them to deliver VMs at 50% of the cost of competitors. I repeat: 50%. The bulk of the savings comes from our storage innovations and the low costs of our licenses.

                  At the core of our investments in 2012 R2 is the belief that customers are going to be using multiple clouds, and they want those clouds to be consistent.

Consistency across clouds is key to enabling the flexibility and frictionless movement of applications across these clouds; if this consistency exists, applications can be developed once and then hosted in any cloud. This means consistency for the developer. If clouds are consistent, with the same management and operations tools easily used to operate these applications, that means consistency for the IT Pro.

It really all comes down to the friction-free movement of applications and VMs across clouds. Microsoft is unique in this regard; we are the only cloud vendor investing and innovating in public, private and hosted clouds – with a promise of consistency (and no lock-in!) across all of them.

                  We are taking what we learn from our innovations in Windows Azure and delivering them through Windows Server, System Center and the Windows Azure Pack for you to use in your data center. This enables us to do rapid innovation in the public cloud, battle harden the innovations, and then deliver them to you to deploy. This is one of the ways in which we have been able to quicken our cadence and deliver the kind of value you see in these R2 releases. You’ll be able to see a number of areas where we are driving consistency across clouds in today’s post.

                  And speaking of today’s post – this IaaS topic will be published in two parts, with the second half appearing tomorrow morning.

                  In this first half of our two-part overview of the 2012 R2’s IaaS capabilities, Erin Chapple, a Partner Group Program Manager in the Windows Server & System Center team, examines the amazing infrastructure innovations delivered by Windows Server 2012 R2, System Center 2012 R2, and the new features in the Windows Azure Pack.

As always in this series, check out the “Next Steps” at the bottom of this post for links to a wide range of engineering content with deep, technical overviews of the concepts examined in this post. Also, if you haven’t started your own evaluation of the 2012 R2 previews, visit the TechNet Evaluation Center and take a test drive today!

I recently had an opportunity to speak with a number of leaders from the former VMware User Group (VMUG), and it was an incredibly educational experience. I say “former” because many of the VMUG user group chapters are updating their focus/charter and are renaming themselves the Virtual Technology User Group (VTUG). This change is a direct result of how they see market share and industry momentum moving to solutions like the consistent clouds developed by Microsoft.

                  In a recent follow up conversation with these leaders, I asked them to describe some common topics they hear discussed in their meetings. One of the leaders commented that the community is saying something really specific: “If you want to have job security and a high paying job for the next 10 years, you better be on your way to becoming an expert in the Microsoft clouds. That is where this industry is going.” 

                  When I look at what is delivered in these R2 releases, the innovation is just staggering. This industry-leading innovation – the types of technical advances that VTUG groups are confidently betting on – is really exciting.

                  With this innovation in mind, in today’s post I want to discuss some of the work we are doing around the user experience for the teams creating the services that are offered, and I want to examine the experience that can be offered to the consumer of the cloud (i.e. the tenants). While we were developing R2, we spent a lot of time ensuring that we truly understood exactly who would be using our solutions. We exhaustively researched their needs, their motivations, and how various IT users and IT teams relate to each other. This process was incredibly important because these individuals and teams all have very different needs – and we were committed to supporting all of them.

The R2 wave of products has been built with this understanding. The IT teams actually building and operating clouds have very different needs than the individuals consuming those clouds (tenants). The experience for the infrastructure teams will focus on just that – the infrastructure; the experience for the tenants will focus on the applications/services and their seamless operation and maintenance.

                  In yesterday’s post we focused heavily on the innovations in these R2 releases in the infrastructure – storage, network, and compute – and, in this post, Erin Chapple, a Partner Group Program Manager in the Windows Server & System Center team, will provide an in-depth look at Service Provider and Tenant experience and innovations with Windows Server 2012 R2, System Center 2012 R2, and the new features in Windows Azure Pack.

                  As always in this series, check out the “Next Steps” at the bottom of this post for links to a variety of engineering content with hyper-technical overviews of the concepts examined in this post.  Also, if you haven’t started your own evaluation of the 2012 R2 previews, visit the TechNet Evaluation Center and take a test drive today!

                  Today, people want to work anywhere, on any device and have access to all the resources they need to do their job. How do you enable your users to be productive on the device of their choice, yet retain control of information and meet compliance requirements? In this video we take a look at how the Microsoft access and information protection solutions allow you to enable your users to be productive, provide them with a single identity to access all resources, and protect your data. Learn more: http://www.microsoft.com/aip

In the 13+ years since the original Active Directory product launched with Windows 2000, it has grown to become the default identity management and access-control solution for over 95% of organizations around the world. But, as organizations move to the cloud, their identity and access control also need to move to the cloud. As companies rely more and more on SaaS-based applications, as the range of cloud-connected devices used to access corporate assets continues to grow, and as more hosted and public cloud capacity is used, companies must expand their identity solutions to the cloud.

                  Simply put, hybrid identity management is foundational for enterprise computing going forward.

                  With this in mind, we set out to build a solution in advance of these requirements to put our customers and partners at a competitive advantage.

To build this solution, we started with our “Cloud first” design principle. To meet the needs of enterprises working in the cloud, we built a solution that took the power and proven capabilities of Active Directory and combined it with the flexibility and scalability of Windows Azure. The outcome is the predictably named Windows Azure Active Directory.

                  By cloud optimizing Active Directory, enterprises can stretch their identity and access management to the cloud and better manage, govern, and ensure compliance throughout every corner of their organization, as well as across all their utilized resources.

                  This can take the form of seemingly simple processes (albeit very complex behind the scenes) like single sign-on which is a massive time and energy saver for a workforce that uses multiple devices and multiple applications per person.  It can also enable the scenario where a user’s customized and personalized experience can follow them from device to device regardless of when and where they’re working. Activities like these are simply impossible without a scalable, cloud-based identity management system.
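Single sign-on of this kind generally rests on claims-based tokens: the user authenticates once with the identity service, then presents a signed token carrying claims to each application. The sketch below shows only the validation idea; the token layout and the validate_token helper are assumptions for illustration, not the actual Azure AD protocol or SDK.

```python
import time

def validate_token(token: dict, expected_issuer: str, expected_audience: str,
                   now=None) -> bool:
    """Accept a token only if issuer, audience and expiry all check out."""
    now = time.time() if now is None else now
    return (token.get("iss") == expected_issuer       # who minted the token
            and token.get("aud") == expected_audience  # which app it is for
            and token.get("exp", 0) > now)             # not yet expired

# A hypothetical token as the application would receive it after sign-on.
token = {"iss": "https://login.example.com", "aud": "crm-app",
         "sub": "alice", "exp": time.time() + 3600}

print(validate_token(token, "https://login.example.com", "crm-app"))   # True
print(validate_token(token, "https://login.example.com", "hr-app"))    # False
```

In a real deployment the token is also cryptographically signed by the identity service and the signature is verified before any claims are trusted; that step is omitted here for brevity.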

                  If anyone doubts how serious and enterprise-ready Windows Azure AD already is, consider these facts:

                  • Since we released Windows Azure AD, we’ve had over 265 billion authentications.
                  • Every two minutes Windows Azure AD services over 1,000,000 authentication requests for users and devices around the world (that’s about 9,000 requests per second).
                  • There are currently more than 420,000 unique domains uploaded and now represented inside of Azure Active Directory.

                  Windows Azure AD is battle tested, battle hardened, and many other verbs preceded by the word “battle.”

                  But, perhaps even more importantly, Windows Azure AD is something Microsoft has bet its own business on: Both Office 365 (the fastest growing product in Microsoft history) and Windows Intune authenticate every user and device with Windows Azure AD.

In this post, Vijay Tewari (Principal Program Manager for Windows Server & System Center), Alex Simons (Director of Program Management for Active Directory), Sam Devasahayam (Principal Program Management Lead for Windows Azure AD), and Mark Wahl (Principal Program Manager for Active Directory) take a look at one of R2’s most innovative features, Hybrid Identity Management.

As always in this series, check out the “Next Steps” at the bottom of this post for links to a wide range of engineering content with deep, technical overviews of the concepts examined in this post.

                  One of the key elements in delivering hybrid cloud is networking. Learn how software-defined networking helps make hybrid real. Learn more: http://www.microsoft.com/en-us/server-cloud/solutions/software-defined-networking.aspx
                  [so called Application Centric Infrastructure (ACI)] Microsoft and Cisco will deliver unique customer value through new integrated networking solutions that will combine software-enabled flexibility with hardware-enabled scale/performance. These solutions will keep apps and workloads front and center and have the network adapt to their needs. Learn more by visiting: http://www.cisco.com/web/learning/le21/onlineevts/acim/index.html

                  One of the foundational requirements we called out in the 2012 R2 vision document was our promise to help you transform the datacenter. A core part of delivering on that promise is enabling Hybrid IT.

By focusing on Hybrid IT we were specifically calling out the fact that almost every customer we interacted with during our planning process believed that in the future they would be using capacity from multiple clouds. That may take the form of multiple private clouds an organization has stood up, cloud capacity from a service provider [i.e. a managed cloud] or a public cloud like Azure, or SaaS solutions running in the public cloud.

We assumed Hybrid IT would be the norm going forward, so we challenged ourselves to really understand and simplify the work associated with configuring and operating in a multi-cloud environment. Certainly one of the biggest challenges of operating in a hybrid cloud environment is the network – everything from setting up the secure connection between clouds to ensuring you can use your own IP addresses (BYOIP) in the hosted and public clouds you choose to use.
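The BYOIP requirement is exactly what network virtualization addresses: the fabric keys traffic on a (virtual network, address) pair rather than the address alone, so tenants can keep identical, overlapping address spaces. The sketch below, using Python's standard ipaddress module, is a hypothetical illustration of the idea, not Microsoft's implementation.

```python
import ipaddress

# Two tenants bring the same 10.0.0.0/24 range with them.
tenant_a = ("VNET-A", ipaddress.ip_network("10.0.0.0/24"))
tenant_b = ("VNET-B", ipaddress.ip_network("10.0.0.0/24"))

# Viewed as raw subnets, the address spaces collide...
print(tenant_a[1].overlaps(tenant_b[1]))  # True

# ...but the virtualized lookup keys the fabric actually uses do not,
# because each address is qualified by its virtual network id.
vm_a = (tenant_a[0], ipaddress.ip_address("10.0.0.5"))
vm_b = (tenant_b[0], ipaddress.ip_address("10.0.0.5"))
print(vm_a == vm_b)  # False: same IP, different virtual networks
```

This is why a VM can move into a hosted or public cloud without renumbering: its address only has to be unique within its own virtual network.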

The setup, configuration and operation of a hybrid IT environment is, by its very nature, incredibly complex – and we have poured hundreds of thousands of hours into the development of R2 to solve this industry-wide problem.

                  With the R2 wave of products – specifically Windows Server 2012 R2 and System Center 2012 R2 – enterprises can now benefit from the highly-available and secure connection that enables the friction-free movement of VMs across those clouds. If you want or need to move a VM or application between clouds, the transition is seamless and the data is secure while it moves.

                  The functionality and scalability of our support for hybrid IT deployments has not been easy to build, and each feature has been methodically tested and refined in our own datacenters. For example, consider that within Azure there are over 50,000 network changes every day, and every single one of them is fully automated. If even 1/10 of 1% of those changes had to be done manually, it would require a small army of people working constantly to implement and then troubleshoot the human errors. With R2, the success of processes like these, and our learnings from Azure, come in the box.
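The scale claim above can be sanity-checked with quick arithmetic: 1/10 of 1% of 50,000 daily changes is 50 manual interventions per day, roughly one every half hour around the clock, every day of the year.

```python
daily_changes = 50_000
manual_fraction = 0.001                      # 1/10 of 1%
manual_per_day = daily_changes * manual_fraction
minutes_between = 24 * 60 / manual_per_day   # spacing of manual work

print(manual_per_day)   # 50.0 manual changes per day
print(minutes_between)  # 28.8 minutes between manual changes
```

Each of those interventions carries a risk of human error, which is why full automation of the change pipeline matters at this scale.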

                  Whether you’re a service provider or working in the IT department of an enterprise (which, in a sense, is like being a service provider to your company’s workforce), these hybrid networking features are going to remove a wide range of manual tasks, and allow you to focus on scaling, expanding and improving your infrastructure.

In this post, Vijay Tewari (Principal Program Manager for Windows Server & System Center) and Bala Rajagopalan (Principal Program Manager for Windows Server & System Center) provide a detailed overview of 2012 R2’s hybrid networking features, as well as solutions for common scenarios like enabling customers to create extended networks spanning clouds, and enabling access to virtualized networks.

                  Don’t forget to take a look at the “Next Steps” section at the bottom of this post, and check back tomorrow for the second half of this week’s hybrid IT content which will examine the topic of Disaster Recovery.

As business becomes more dependent on technology, business continuity becomes increasingly vital for IT. Learn how Microsoft is making it easier to build out business continuity plans. Learn more: http://www.microsoft.com/en-us/server-cloud/solutions/business-continuity.aspx

                  With Windows Server 2012 R2, with Hyper-V Replica, and with System Center 2012 R2 we have delivered a DR solution for the masses.

This DR solution is a perfect example of how the cloud changes everything.

Because Windows Azure offers a global, highly available cloud platform with an application architecture that takes full advantage of the HA capabilities, you can build an app on Azure that will be available anytime and anywhere. This kind of functionality is why we made the decision to build the control plane, or administrative console, for our DR solution on Azure. The control plane and all the metadata required to perform a test, planned, or unplanned recovery will always be available. This means you don’t have to make the huge investments that have been required in the past to build a highly-available platform to host your DR solution – Azure automatically provides this.

(Let me make a plug here that you should be looking to Azure for all the new applications you are going to build – and we’ll start covering this specific topic in next week’s R2 post.)

                  With this R2 wave of products, organizations of all sizes and maturity, anywhere in the world, can now benefit from a simple and cost-effective DR solution.

There’s one other thing that I am really proud of here: Like most organizations, we regularly benchmark ourselves against our competition. We use a variety of metrics, like: ‘Are we easier to deploy and operate?’ and ‘Are we delivering more value and doing it at a lower price?’ Measurements like these have provided a really clear answer: Our competitors are not even in the same ballpark when it comes to DR.

During the development of R2, I watched a side-by-side comparison of what was required to set up DR for 500 VMs with our solution compared to a competitive offering, and the contrast was staggering. The difference in simplicity and the total amount of time required to set everything up was dramatic. In a DR scenario, one interesting unit of measurement is total mouse clicks. It’s easy to get carried away with counting clicks (hey, we’re engineers after all!), but, in the side-by-side comparison, the difference was tens of mouse clicks compared to hundreds. It is literally a difference of minutes vs. days.

                  You can read some additional perspectives I’ve shared on DR here.

                  In yesterday’s post we looked at the new hybrid networking functionality in R2 (if you haven’t seen it yet, it is a must-read), and in this post Vijay Tewari (Principal Program Manager for Windows Server & System Center) goes deep into the architecture of this DR solution, as well this solution’s deployment and operating principles.

                  As always in this 2012 R2 series, check out the “Next Steps” at the bottom of this post for links to a variety of engineering content with hyper-technical overviews of the concepts examined in this post.

A revolution is taking place, impacting the speed at which Business Apps need to be built, and the jaw-dropping capabilities they need to deliver. Ignoring these trends isn’t an option, and yet you have no time to hit the reset button. Learn how to deliver revolutionary benefits in an evolutionary way. Learn More: http://www.microsoft.com/en-us/server-cloud/cloud-os/modern-business-apps.aspx
                  Hear from Accenture and Hostway how Microsoft Windows Azure enables the development and deployment of modern business applications faster and more cost effectively through cloud computing. Find solutions and services from partners that span the entire stack of Microsoft Cloud OS products and technologies: http://www.microsoft.com/en-us/server-cloud/audience/partner.aspx#fbid=631zRfiT0WJ


The future of the IT Pro role will require you to know how applications are built for the cloud, as well as the cloud infrastructures where these apps operate – knowledge every IT Pro needs in order to be a voice in the meetings that will define an organization’s cloud strategy. IT pros are also going to need to know how their team fits in this cloud-centric model, as well as how to proactively drive these discussions.

                  These R2 posts will get you what you need, and this “Enable Modern Business Apps” pillar will be particularly helpful.

                  Throughout the posts in this series we have spoken about the importance of consistency across private, hosted and public clouds, and we’ve examined how Microsoft is unique in its vision and execution of delivering consistent clouds. The Windows Azure Pack is a wonderful example of Microsoft innovating in the public cloud and then bringing the benefits of that innovation to your datacenter.

                  The Windows Azure Pack is – literally speaking – a set of capabilities that we have battle-hardened and proven in our public cloud. These capabilities are now made available for you to enhance your cloud and ensure that “consistency across clouds” that we believe is so important.

A major benefit of the Windows Azure Pack is the ability to build an application once and then deploy and operate it in any Microsoft Cloud – private, hosted or public.

                  This kind of flexibility means that you can build an application, initially deploy it in your private cloud, and then, if you want to move that app to a Service Provider or Azure in the future, you can do it without having to modify the application. Making tasks like this simple is a major part of our promise around cloud consistency, and it is something only Microsoft (not VMware, not AWS) can deliver.

                  This ability to migrate an app between these environments means that your apps and your data are never locked in to a single cloud. This allows you to easily adjust as your organization’s needs, regulatory requirements, or any operational conditions change.

                  A big part of this consistency and connection is the Windows Azure Service Bus which will be a major focus of today’s post.

                  The Windows Azure Service Bus has been a big part of Windows Azure since 2010. I don’t want to overstate this, but Service Bus has been battle-hardened in Azure for more than 3 years, and now we are delivering it to you to run in your datacenters. To give you a quick idea of how critical Service Bus is for Microsoft, consider this: Service Bus is used in all the billing for Windows Azure, and it is responsible for gathering and posting all the scoring and achievement data to the Halo 4 leaderboards (now that is really, really important – just ask my sons!). It goes without saying that the people in charge of Azure billing and the hardcore gamers are not going to tolerate any latency or downtime getting to their data.
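Service Bus is a brokered-messaging system: a producer posts messages to a durable queue held by the broker, and a decoupled consumer receives and settles them later, so neither side has to be online at the same time. The sketch below illustrates that pattern only, with Python's in-process queue.Queue standing in for the broker; the real Service Bus adds durability, topics, subscriptions, and its own client API, none of which is shown here.

```python
import queue
import threading

broker = queue.Queue()  # stands in for a durable broker queue

def producer(scores):
    # Post one message per score result (think: game telemetry to be billed
    # or posted to a leaderboard).
    for player, score in scores:
        broker.put({"player": player, "score": score})

def consumer(n, results):
    # Receive and settle exactly n messages, in arrival (FIFO) order.
    for _ in range(n):
        msg = broker.get()   # blocks until a message is available
        results.append(msg)
        broker.task_done()   # settle the message ("complete" it)

results = []
t = threading.Thread(target=consumer, args=(3, results))
t.start()
producer([("alice", 100), ("bob", 250), ("carol", 90)])
t.join()
print([m["player"] for m in results])  # ['alice', 'bob', 'carol']
```

The decoupling is the point: the producer returns as soon as the broker accepts the message, and a slow or temporarily offline consumer simply drains the queue later.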

                  With today’s topic, take the time to really appreciate the app development and app platform functionality in this R2 wave. I think you’ll be really excited about how you can plug into this process and lead your organization.

This post, written by Bradley Bartz (Principal Program Manager from Windows Azure) and Ziv Rafalovich (Senior Program Manager in Windows Azure), will get deep into these new features and the amazing scenarios that the Windows Azure Pack and Windows Azure Service Bus enable. As always in this 2012 R2 series, check out the “Next Steps” at the bottom of this post for links to additional information about the topics covered in this post.

                  A major promise underlying all of the 2012 R2 products is really simple: Consistency.

Consistency in the user experiences, consistency for IT professionals, consistency for developers and consistency across clouds. A major part of delivering this consistency is the Windows Azure Pack (WAP). Last week we discussed how Service Bus enables connections across clouds, and in this post we’ll examine more of the PaaS capabilities built and tested in Azure data centers and now offered for Windows Server. With the WAP, Windows Server 2012 R2, and System Center, IT pros can make their data centers even more scalable, flexible, and secure.

Throughout the development of this R2 wave, we looked closely at what organizations needed and wanted from the cloud. A major piece of feedback was the desire to build an app once and then have that app live in any data center or cloud. For the first time, this kind of functionality is now available. Whether your app is in a private, public, or hosted cloud, the developers and IT Professionals in your organization will have consistency across clouds.

One of the elements that I’m sure will be especially popular is the flexibility and portability of this PaaS. I’ve had countless customers comment that they love the idea of PaaS, but don’t want to be locked-in or restricted to only running it in specific data centers. Now, our customers and partners can build a PaaS app and run it anywhere. This is huge! Over the last two years the market has really begun to grasp what PaaS has to offer, and now the benefits (auto-scale, agility, flexibility, etc.) are easily accessible and consistent across the private, hosted and public clouds Microsoft delivers.

                  This post will spend a lot of time talking about Web Sites for Windows Azure and how this high density web site hosting delivers a level of power, functionality, and consistency that is genuinely next-gen.

                  Microsoft is literally the only company offering these kinds of capabilities across clouds – and I am proud to say that we are the only ones with a sustained track record of enterprise-grade execution.

                  With the features added by the WAP [Windows Azure Pack], organizations can now take advantage of PaaS without being locked into a cloud. This is, at its core, the embodiment of Microsoft’s commitment to make consistency across clouds a workable, viable reality.

                  This is genuinely PaaS for the modern web.

                  Today’s post was written by Bradley Bartz, a Principal Program Manager from Windows Azure. For more information about the technology discussed here, or to see demos of these features in action, check out the “Next Steps” at the bottom of this post.

                  More information: in the Success with Hybrid Cloud series blog posts [Brad Anderson, Nov 12, Nov 14, Nov 20, Dec 2, Dec 5, and 21 upcoming blogs posts] which “will examine the building/deployment/operation of Hybrid Clouds, how they are used in various industries, how they manage and deliver different workloads, and the technical details of their operation.”


                  4.2 Unlock Insights from any Data – SQL Server 2014:

                  With growing demand for data, you need database scale with minimal cost increases. Learn how SQL Server 2014 provides speed and scalability with in-memory technologies to support your key data workloads, including OLTP, data warehousing, and BI. Learn more: http://www.microsoft.com/sqlserver2014

                  Microsoft SQL Server 2014 CTP2 was announced by Quentin Clark during the Microsoft SQL PASS 2013 keynote.  This second public CTP is essentially feature complete and enables you to try and test all of the capabilities of the full SQL Server 2014 release. Below you will find an overview of SQL Server 2014 as well as key new capabilities added in CTP2:

                  SQL Server 2014 helps organizations by delivering:

                  • Mission Critical Performance across all database workloads with In-Memory for online transaction processing (OLTP), data warehousing and business intelligence built-in as well as greater scale and availability
                  • Platform for Hybrid Cloud enabling organizations to more easily build, deploy and manage database solutions that span on-premises and cloud
                  • Faster Insights from Any Data with a complete BI solution using familiar tools like Excel

                  Thank you to those who have already downloaded SQL Server 2014 CTP1 and started seeing firsthand the performance gains that in-memory capabilities deliver, along with better high availability from the AlwaysOn enhancements.  CTP2 introduces additional mission critical capabilities with further enhancements to the in-memory technologies along with new hybrid cloud capabilities.

                  What’s new in SQL Server 2014 CTP2?

                  New Mission Critical Capabilities and Enhancements

                  • Enhanced In-Memory OLTP, including new tools that will help you identify and migrate the tables and stored procedures that will benefit most from In-Memory OLTP, as well as greater T-SQL compatibility and new indexes that enable more customers to take advantage of our solution.
                  • High Availability for In-Memory OLTP Databases:  AlwaysOn Availability Groups are supported for In-Memory OLTP, giving you in-memory performance gains with high availability.
                  • IO Resource Governance, enabling customers to more effectively manage IO across multiple databases and/or classes of databases to provide more predictable IO for your most critical workloads.  Customers today can already manage CPU and memory.
                  • Improved resiliency with Windows Server 2012 R2 by taking advantage of Cluster Shared Volumes (CSVs).  CSVs provide improved fault detection and recovery in the case of downtime.
                  • Delayed Durability, providing the option for increased transaction throughput and lower latency for OLTP applications where performance and latency needs outweigh the need for 100% durability.
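                  As a rough illustration of the last item, delayed durability is allowed at the database level and then requested per transaction at commit time. A minimal sketch (the database, table and column names here are hypothetical):

```sql
-- Allow delayed durability for a (hypothetical) database; log flushes for
-- opted-in transactions become asynchronous, trading a small window of
-- durability for lower commit latency and higher throughput.
ALTER DATABASE SalesDB SET DELAYED_DURABILITY = ALLOWED;

-- An individual transaction can then opt in when it commits:
BEGIN TRANSACTION;
UPDATE dbo.Orders SET Status = 'Shipped' WHERE OrderId = 42;
COMMIT TRANSACTION WITH (DELAYED_DURABILITY = ON);
```

Setting the database option to FORCED instead of ALLOWED would apply delayed durability to all transactions, which fits workloads where a small amount of recent-transaction loss on failure is acceptable.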

                  New Hybrid Cloud Capabilities and Enhancements

                  By enabling the above in-memory performance capabilities for your SQL Server instances running in Windows Azure Virtual Machines, you will see significant transaction and query performance gains.  In addition there are new capabilities listed below that will allow you to unlock new hybrid scenarios for SQL Server.

                  • Managed Backup to Windows Azure, enabling you to backup on-premises SQL Server databases to Windows Azure storage directly in SSMS.  Managed Backup also optimizes backup policy based on usage, an advantage over the manual Backup to Windows Azure.
                  • Encrypted Backup, offering customers the ability to encrypt both on-premises backups and backups to Windows Azure for enhanced security.
                  • Enhanced disaster recovery to Windows Azure with simplified UI, enabling customers to more easily add Windows Azure Virtual Machines as AlwaysOn secondaries in SQL Server Management Studio for a more cost-effective data protection and disaster recovery solution.  Customers may also use the secondaries in Windows Azure to scale out and offload reporting and backups.
                  • SQL Server Data Files in Windows Azure – New capability to store large databases (>16TB) in Windows Azure and the ability to stream the database as a backend for SQL Server applications running on-premises or in the cloud.
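                  A minimal sketch of the backup-to-Windows-Azure flow described above, assuming a hypothetical storage account, container, database and server certificate:

```sql
-- Credential holding the (hypothetical) storage account name and access key.
CREATE CREDENTIAL AzureBackupCred
    WITH IDENTITY = 'mystorageacct',
         SECRET = '<storage account access key>';

-- Compressed, encrypted backup written directly to Windows Azure blob storage.
-- BackupCert is a hypothetical server certificate created beforehand.
BACKUP DATABASE SalesDB
    TO URL = 'https://mystorageacct.blob.core.windows.net/backups/SalesDB.bak'
    WITH CREDENTIAL = 'AzureBackupCred',
         COMPRESSION,
         ENCRYPTION (ALGORITHM = AES_256, SERVER CERTIFICATE = BackupCert);
```

Managed Backup automates the same mechanism, choosing backup frequency and retention from usage, whereas the manual BACKUP ... TO URL form above gives you explicit control.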

                  Learn more and download SQL Server 2014 CTP2

                  SQL Server 2014 helps address key business challenges of ever growing data volumes, the need to transact and process data faster, the scalability and efficiency of cloud computing and an ever growing hunger for business insights.   With SQL Server 2014 you can now unlock real-time insights with mission critical and cloud performance and take advantage of one of the most comprehensive BI solutions in the marketplace today.

                  Many customers are already realizing the significant benefits of the new in-memory technologies in SQL Server 2014, including Edgenet, bwin, SBI Liquidity, TPP and Ferranti.  Stay tuned for an upcoming blog highlighting the impact in-memory had on each of their businesses.

                  Learn more about SQL Server 2014 and download the datasheet and whitepapers here.  Also if you would like to learn more about SQL Server In-Memory best practices, check out this SQL Server 2014 in-memory blog series compilation. There is also a SQL Server 2014 hybrid cloud scenarios blog compilation for learning best practices.

                  Also, if you haven’t already, download SQL Server 2014 CTP2 and see how much faster your SQL Server applications run!  The CTP2 image is also available on Windows Azure, so you can easily develop and test the new features of SQL Server 2014.

                  To ensure that its customers received timely, accurate product data, Edgenet decided to enhance its online selling guide with In-Memory OLTP in Microsoft SQL Server 2014.

                  At the SQL PASS conference last November, we announced the In-memory OLTP (project code-named Hekaton) database technology built into the next release of SQL Server. Microsoft’s technical fellow Dave Campbell’s blog provides a broad overview of the motivation and design principles behind this project codenamed In-memory OLTP.

                  In a nutshell – In-memory OLTP is a new database engine optimized for memory resident data and OLTP workloads. In-memory OLTP is fully integrated into SQL Server – not a separate system. To take advantage of In-memory OLTP, a user defines a heavily accessed table as memory optimized. In-memory OLTP tables are fully transactional, durable and accessed using T-SQL in the same way as regular SQL Server tables. A query can reference both In-memory OLTP tables and regular tables, and a transaction can update data in both types of tables. Expensive T-SQL stored procedures that reference only In-memory OLTP tables can be natively compiled into machine code for further performance improvements. The engine is designed for extremely high session concurrency for OLTP type of transactions driven from a highly scaled-out mid-tier. To achieve this it uses latch-free data structures and a new optimistic, multi-version concurrency control technique. The end result is a selective and incremental migration into In-memory OLTP to provide predictable sub-millisecond low latency and high throughput with linear scaling for DB transactions. The actual performance gain depends on many factors but we have typically seen 5X-20X in customer workloads.
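                  To make the description above concrete, here is a minimal sketch of a memory-optimized table and a natively compiled stored procedure. All object names are hypothetical, and the database is assumed to already have a MEMORY_OPTIMIZED_DATA filegroup:

```sql
-- A durable, fully transactional memory-optimized table; the nonclustered
-- hash index is the latch-free structure used for point lookups.
CREATE TABLE dbo.ShoppingCart (
    CartId  INT NOT NULL
        PRIMARY KEY NONCLUSTERED HASH WITH (BUCKET_COUNT = 1000000),
    UserId  INT NOT NULL,
    Created DATETIME2 NOT NULL
) WITH (MEMORY_OPTIMIZED = ON, DURABILITY = SCHEMA_AND_DATA);

-- A natively compiled stored procedure: the T-SQL body is compiled to
-- machine code and runs under optimistic snapshot-based concurrency.
CREATE PROCEDURE dbo.usp_AddCart @CartId INT, @UserId INT
    WITH NATIVE_COMPILATION, SCHEMABINDING, EXECUTE AS OWNER
AS
BEGIN ATOMIC WITH
    (TRANSACTION ISOLATION LEVEL = SNAPSHOT, LANGUAGE = N'us_english')
    INSERT INTO dbo.ShoppingCart (CartId, UserId, Created)
    VALUES (@CartId, @UserId, SYSUTCDATETIME());
END;
```

Because the table is still a regular SQL Server object, ordinary interpreted T-SQL queries can join it against disk-based tables, which is what enables the selective, incremental migration the post describes.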

                  In the SQL Server product group, many years ago we started the investment of reinventing the architecture of the RDBMS engine to leverage modern hardware trends. This resulted in PowerPivot and In-memory ColumnStore Index in SQL2012, and In-memory OLTP is the new addition for OLTP workloads we are introducing for SQL2014 together with the updatable clustered ColumnStore index and (SSD) bufferpool extension. It has been a long and complex process to build this next generation relational engine, especially with our explicit decision of seamlessly integrating it into the existing SQL Server instead of releasing a separate product – in the belief that it provides the best customer value and onboarding experience.

                  Now we are releasing SQL2014 CTP1 as a public preview, it’s a great opportunity for you to get hands-on experience with this new technology and we are eager to get your feedback and improve the product. In addition to BOL (Books Online) content, we will roll out a series of technical blogs on In-memory OLTP to help you understand and leverage this preview release effectively.

                  In the upcoming series of blogs, you will see the following in-depth topics on In-memory OLTP:

                  • Getting started – to walk through a simple sample database application using In-memory OLTP so that you can start experimenting with the public CTP release.
                  • Architecture – to understand at a high level how In-memory OLTP is designed and built into SQL Server, and how the different concepts like memory optimized tables, native compilation of SPs and query inter-op fit together under the hood.
                  • Customer experiences so far – we have had many TAP customer engagements over roughly the past two years, and their feedback helped shape the product; we would like to share some of the learnings and customer experiences, such as typical application patterns and performance results.
                  • Hardware guidance – it is apparent that memory size is a factor, but since most applications require full durability, In-memory OLTP still requires log and checkpointing IO, and with the much higher transactional throughput it can actually put even higher demand on the IO subsystem as a result. We will also cover how Windows Azure VMs can be used with In-memory OLTP.
                  • Application migration – how to get started with migrating to or building a new application with In-memory OLTP. You will see multiple blog posts covering the AMR tool, table and SP migrations, and pointers on how to work around some unsupported data types and T-SQL surface area, as well as the transactional model used. We will highlight the unique approach to SQL Server integration, which supports a partial database migration.
                  • Managing In-memory OLTP – this will cover the DBA considerations, and you will see multiple posts ranging from the tooling support (SSMS) to more advanced topics such as how memory and storage are managed.
                  • Limitations and what’s coming – explain what limitations exist in CTP1 and new capabilities expected to be coming in CTP2 and RTM, so that you can plan your roadmap with clarity.

                  In addition – we will also have blog coverage on what’s new with In-memory ColumnStore and introduction to bufferpool extension. 

                  SQL2014 CTP1 is available for download here, or you can read the complete blog series here.

                  bwin is the largest regulated online gaming company in the world, and their success depends on positive customer experiences. They had recently upgraded some of their systems to SQL Server 2012, gaining significant in-memory benefit using xVelocity Column Store. Here, bwin takes their systems one step further by using the technology preview of SQL Server 2014 In-memory OLTP (formerly known as Project “Hekaton”). Prior to using In-memory OLTP their online gaming systems were handling about 15,000 requests per second. Using In-memory OLTP the fastest tests so far have scaled to 250,000 transactions per second.

                  Recently I posted a video about how the SQL Server Community was looking into emerging trends in BI and Database technologies – one of the key technologies mentioned in that video was in-memory.

                  Many Microsoft customers have been using in-memory technologies as part of SQL Server since 2010 including xVelocity Analytics, xVelocity Column Store and Power Pivot, something we recently covered in a blog post following the ‘vaporware’ outburst from Oracle SVP of Communications, Bob Evans. Looking forward, Ted Kummert recently announced project codenamed “Hekaton,” available in the next major release of SQL Server. “Hekaton” will provide a full in-memory transactional engine, and is currently in private technology preview with a small set of customers. This technology will provide breakthrough performance gains of up to 50 times.

                  For those who are keen to get a first view of customers using the technology, below is the video of online gaming company bwin using “Hekaton”.

                  Bwin is the largest regulated online gaming company in the world, and their success depends on positive customer experiences. They had recently upgraded some of their systems to SQL Server 2012 – a story you can read here. Bwin had already gained significant in-memory benefit using xVelocity Column Store, for example – a large report that used to take 17 minutes to render now takes only three seconds.

                  Given the benefits they had seen with in-memory technologies, they were keen to trial the technology preview of “Hekaton”. Prior to using “Hekaton”, their online gaming systems were handling about 15,000 requests per second, a huge number for most companies. However, bwin needed to be agile and stay ahead of the competition, and so they wanted access to the speed of the latest technology.

                  Using “Hekaton”, bwin hoped they could at least double the number of transactions. They were ‘pretty amazed’ to see that the fastest tests so far have scaled to 250,000 transactions per second.

                  So how fast is “Hekaton” – just ask Rick Kutschera, the Database Engineering Manager at bwin – in his words it’s ‘Wicked Fast’! However, this is not the only point that Rick highlights, he goes on to mention that “Hekaton” integrates seamlessly into the SQL Server engine, so if you know SQL Server, you know “Hekaton”.

                  — David Hobbs-Mallyon, Senior Product Marketing Manager

                  Quentin Clark
                  Corporate Vice President, Data Platform Group

                  This morning, during my keynote at the Professional Association of SQL Server (PASS) Summit 2013, I discussed how customers are pushing the boundaries of what’s possible for businesses today using the advanced technologies in our data platform. It was my pleasure to announce the second Community Technology Preview (CTP2) of SQL Server 2014 which features breakthrough performance with In-Memory OLTP and simplified backup and disaster recovery in Windows Azure.

                  Pushing the boundaries

                  We are pushing the boundaries of our data platform with breakthrough performance, cloud capabilities and the pace of delivery to our customers. Last year at PASS Summit, we announced our In-Memory OLTP project “Hekaton” and since then released SQL Server 2012 Parallel Data Warehouse and public previews of Windows Azure HDInsight and Power BI for Office 365. Today we have SQL Server 2014 CTP2, our public and production-ready release shipping a mere 18 months after SQL Server 2012. 

                  Our drive to push the boundaries comes from recognizing that the world around data is changing.

                  • Our customers are demanding more from their data – higher levels of availability as their businesses scale and globalize, major advancements in performance to align to the more real-time nature of business, and more flexibility to keep up with the pace of their innovation. So we provide in-memory, cloud-scale, and hybrid solutions. 
                  • Our customers are storing and collecting more data – machine signals, devices, services and data from outside even their organizations. So we invest in scaling the database and a Hadoop-based solution. 
                  • Our customers are seeking the value of new insights for their business. So we offer them self-service BI in Office 365 delivering powerful analytics through a ubiquitous product and empowering users with new, more accessible ways of gaining insights.

                  In-memory in the box for breakthrough performance

                  A few weeks ago, one of our competitors announced plans to build an in-memory column store into their database product some day in the future. We shipped similar technology two years ago in SQL Server 2012, and have continued to advance that technology in SQL Server 2012 Parallel Data Warehouse and now with SQL Server 2014. In addition to our in-memory columnar support in SQL Server 2014, we are also pushing the boundaries of performance with in-memory online transaction processing (OLTP). A year ago we announced project “Hekaton,” and today we have customers realizing performance gains of up to 30x. This work, combined with our early investments in Analysis Services and Excel, means Microsoft is delivering the most complete in-memory capabilities for all data workloads – analytics, data warehousing and OLTP. 

                  We do this to allow our customers to make breakthroughs for their businesses. SQL Server is enabling them to rethink how they can accelerate and exceed the speed of their business.

                  image

                  • TPP is a clinical software provider managing more than 30 million patient records – half the patients in England – including 200,000 active registered users from the UK’s National Health Service.  Their systems handle 640 million transactions per day, peaking at 34,700 transactions per second. They tested a next-generation version of their software with the SQL Server 2014 in-memory capabilities, which has enabled their application to run seven times faster than before – all of this done and running in half a day. 
                  • Ferranti provides solutions for the energy market worldwide, collecting massive amounts of data using smart metering. With our in-memory technology they can now process a continuous data flow up to 200 million measurement channels making the system fully capable of meeting the demands of smart meter technology.
                  • SBI Liquidity Market in Japan provides online services for foreign currency trading. By adopting SQL Server 2014, the company has increased throughput from 35,000 to 200,000 transactions per second. They now have a trading platform that is ready to take on the global marketplace.

                  A closer look into In-memory OLTP

                  Previously, I wrote about the journey of the in-memory OLTP project Hekaton, where a group of SQL Server database engineers collaborated with Microsoft Research. Changes in the ratios between CPU performance, IO latencies and bandwidth, cache and memory sizes as well as innovations in networking and storage were changing assumptions and design for the next generation of data processing products. This gave us the opening to push the boundaries of what we could engineer without the constraints that existed when relational databases were first built many years ago. 

                  Challenging those assumptions, we engineered for dramatically changed latencies and throughput for so-called “hot” transactional tables in the database. Lock-free, row-versioning data structures and the compilation of T-SQL and queries into native code, combined with programming semantics kept consistent with SQL Server, mean our customers can apply the performance benefits of extreme transaction processing without application rewrites or the adoption of entirely new products. 

                  image

                  The continuous data platform

                  Windows Azure fulfills new scenarios for our customers – transcending what is on-premises or in the cloud. Microsoft is providing a continuous platform from our traditional products that are run on-premises to our cloud offerings. 

                  With SQL Server 2014, we are bringing the cloud into the box. We are delivering high availability and disaster recovery on Windows Azure built right into the database. This enables customers to benefit from our global datacenters: AlwaysOn Availability Groups that span on-premises and Windows Azure Virtual Machines, database backups directly into Windows Azure storage, and even the ability to store and run database files directly in Windows Azure storage. That last scenario really does something interesting – now you can have an infinitely-sized hard drive with incredible disaster recovery properties with all the great local latency and performance of the on-premises database server. 
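                  The “database files directly in Windows Azure storage” scenario amounts to pointing the data and log file paths at blob URLs. A sketch with hypothetical names, assuming a credential matching the container URL (holding a Shared Access Signature) has already been created on the instance:

```sql
-- Data and log files live as blobs in Windows Azure storage; the local
-- SQL Server instance streams them, so on-premises applications keep
-- their usual connection strings and local compute.
CREATE DATABASE HybridDB
ON
    (NAME = HybridDB_data,
     FILENAME = 'https://mystorageacct.blob.core.windows.net/data/HybridDB.mdf')
LOG ON
    (NAME = HybridDB_log,
     FILENAME = 'https://mystorageacct.blob.core.windows.net/data/HybridDB.ldf');
```

This is what the post means by an “infinitely-sized hard drive”: storage capacity and off-site durability come from Azure blobs, while query processing stays on the local server.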

                  We’re not just providing easy backup in SQL Server 2014; today we announced that backup to Windows Azure would be available for all our currently supported SQL Server releases. Together, the backup to Windows Azure capabilities in SQL Server 2014 and via the standalone tool offer customers a single, cost-effective backup strategy for secure off-site storage with encryption and compression across all supported versions of SQL Server.

                  By having a complete and continuous data platform we strive to empower billions of people to get value from their data. It’s why I am so excited to announce the availability of SQL Server 2014 CTP2, hot on the heels of the fastest-adopted release in SQL Server’s history, SQL Server 2012. Today, more businesses solve their data processing needs with SQL Server than any other database. It’s about empowering the world to push the boundaries.


                  4.3 Unlock Insights from any Data / Big Data – Microsoft SQL Server Parallel Data Warehouse (PDW) and Windows Azure HDInsights:

                  Data is being generated faster than ever before, so what can it do for your business? Learn how to unlock insights on any data by empowering people with BI and big data tools to go from raw data to business insights faster and easier. Learn more: http://www.microsoft.com/datainsights
                  With the abundance of information available today, BI shouldn’t be confined to analysts or IT. Learn how to empower all with analytics through familiar Office tools, and how to manage all your data needs with a powerful and scalable data platform. Learn more: http://www.microsoft.com/BI
                  With data volumes exploding by 10x every five years, and much of this growth coming from new data types, data warehousing is at a tipping point. Learn how to evolve your data warehouse infrastructure to support variety, volume, and velocity of data. Learn more: http://www.microsoft.com/datawarehousing
                  Hear from HP, Dell and Hortonworks how Microsoft SQL Server Parallel Data Warehouse and Windows Azure HDInsights can unlock data insights and respond to business opportunities through big data analytics. Find solutions and services from partners that span the entire stack of Microsoft Cloud OS products and technologies: http://www.microsoft.com/en-us/server-cloud/audience/partner.aspx#fbid=631zRfiT0WJ
                  The idea that big data will transform businesses and the world is indisputable, but are there enough resources to fully embrace this opportunity? Join Quentin Clark, Microsoft Corporate Vice President, who will share Microsoft’s bold goal to consumerize big data — simplifying the data science process and providing easy access to data with everyday tools. This keynote is sponsored by Microsoft
                  Quentin Clark discusses the ever-changing big data market and how Microsoft is meeting its demands.

                  Announcing Windows Azure HDInsight: Where big data meets the cloud [The Official Microsoft Blog, Oct 28, 2013]

                  post is from Quentin Clark, Corporate Vice President of the Data Platform Group at Microsoft

                  I am pleased to announce that Windows Azure HDInsight – our cloud-based distribution of Hadoop – is now generally available on Windows Azure. The GA of HDInsight is an important milestone for Microsoft, as it’s part of our broader strategy to bring big data to a billion people.

                  On Tuesday at Strata + Hadoop World 2013, I will discuss the opportunity of big data in my keynote, “Can Big Data Reach One Billion People?” Microsoft’s perspective is that embracing the new value of data will lead to a major transformation as significant as when line of business applications matured to the point where they touched everyone inside an organization. But how do we realize this transformation? It happens when big data finds its way to everyone in business – when anyone with a question that can be answered by data, gets their answer. The impact of this is beyond just making businesses smarter and more efficient. It’s about changing how business works through both people and data-driven insights. Data will drive the kinds of changes that, for example, allow personalization to become truly prevalent. People will drive change by gaining insights into what impacts their business, enabling them to change the kinds of partnerships and products they offer.

                  Our goal to empower everyone with insights is the reason why Microsoft is investing, not just in technology like Hadoop, but the whole circuit required to get value from big data. Our customers are demanding more from the data they have – not just higher availability, global scale and longer histories of their business data, but that their data works with business in real time and can be leveraged in a flexible way to help them innovate. And they are collecting more signals – from machines and devices and sources outside their organizations.

                  Some of the biggest changes to businesses driven by big data are created by the ability to reason over data previously thought unmanageable, as well as data that comes from adjacent industries. Think about the use of equipment data to do better operational cost and maintenance management, or a loan company using shipping data as part of the loan evaluation. All of this data needs all forms of analytics and the ability to reach the people making decisions. Organizations that complete this circuit, thereby creating the capability to listen to what the data can tell them, will accelerate.

                  Bringing Hadoop to the enterprise

                  Hadoop is a cornerstone of how we will realize value from big data. That’s why we’ve engineered HDInsight as 100 percent Apache Hadoop offered as an Azure cloud service. The service has been in public production preview for a number of months now – the reception has been tremendous and we are excited to bring it to full GA status in Azure. 

                  Microsoft recognizes Hadoop as a standard and is investing to ensure that it’s an integral part of our enterprise offerings. We have invested through real contributions across the project – not just to make Hadoop work great on Windows, but even in projects like Tez, Stinger and Hive. We have put in thousands of engineering hours and tens of thousands of lines of code. We have been doing this in partnership with Hortonworks, who will make HDP (Hortonworks Data Platform) 2.0 for Windows Server generally available next month, giving the world access to a supported Apache-pure Hadoop v2 distribution for Windows Server. Working with Hortonworks, we will support Hadoop v2 in a future update to HDInsight.

                  Windows Azure HDInsight combines the best of Hadoop open source technology with the security, elasticity and manageability that enterprises require. We have built it to integrate with Excel and Power BI – our business intelligence offering that is part of Office 365 – allowing people to easily connect to data through HDInsight, then refine and do business analytics in a turnkey fashion. For the developer, HDInsight also supports choice of languages: .NET, Java and more.

                  We have key customers currently using HDInsight, including:

                  • The City of Barcelona uses Windows Azure HDInsight to pull in data about traffic patterns, garbage collection, city festivals, social media buzz and more to make critical decisions about public transportation, security and overall spending.
                  • A team of computer scientists at Virginia Tech developed an on-demand, cloud-computing model using the Windows Azure HDInsight Service, enabling  easier, more cost-effective access to DNA sequencing tools and resources.
                  • Christian Hansen, a developer of natural ingredients for several industries, collects electronic data from a variety of sources, including automated lab equipment, sensors and databases. With HDInsight in place, they are able to collect and process data from trials 100 times faster than before.

                  End-to-end solutions for big data

                  These kinds of uses of Hadoop are examples of how big data is changing what’s possible. Our Hadoop-based solution HDInsight is a building block – one important piece of the end-to-end solutions required to get value from data.

                  All this comes together in solutions where people can use Excel to pull data directly from a range of sources, including SQL Server (the most widely-deployed database product), HDInsight, external Hadoop clusters and publicly available datasets. They can then use our business intelligence tools in Power BI to refine that data, visualize it and just ask it questions. We believe that by putting widely accessible and easy-to-deploy tools in everyone’s hands, we are helping big data reach a billion people. 

                  I am looking forward to tomorrow. The Hadoop community is pushing what’s possible, and we could not be happier that we made the commitment to contribute to it in meaningful ways.

                  Quentin Clark, Microsoft, at Big Data NYC 2013 with John Furrier and Dave Vellante

                  “We’re here because we’re super committed to Hadoop,” Clark said, explaining that Microsoft is dedicated to helping its customers embrace the benefits Big Data can provide them with. “Hadoop is the cornerstone of Big Data but not the entire infrastructure,” he added. Microsoft is focusing on adding security and tool integration, with thousands of hours of development put into Hadoop, to make it ready for the enterprise. “There’s a foundational piece where customers are starting,” which they can build upon, and Microsoft focuses on helping them embrace Hadoop as part of the IT giant’s business goals.

                  Asked to compare the adoption of traditional Microsoft products with the company’s Hadoop products, Clark said, “a big part of our effort was to get to that enterprise expectations.” Security and tools integration, getting Hadoop to work on Windows is part of that effort. Microsoft aims to help people “have a conversation and dialogue with the data. We make sure we funnel all the data to help them get the BI and analytics” they need.

                  Commenting on Microsoft’s statement of bringing Big Data to its one billion Office users, Vellante asked if the company’s strategy was to put the power of Big Data into Excel. Clark explained it was about putting Big Data in the Office suite, going on to explain that there are already more than a billion people who are passively using Big Data. Microsoft focuses on those actively using it.

                  Clark mentions Microsoft has focused on the sports arena, helping major sports leagues use Big Data to power fantasy teams. “We actually have some models, use some data sets. I have a fantasy team that I’m doing pretty well with, partly because of my ability to really have a conversation with the data. On the business side, it’s transformational. Our ability to gain insight in real time and interact is very different using these tools,” Clark stated.

                  Why not build its own Hadoop distro?

                  Asked why Microsoft decided not to have its own Hadoop distribution, Clark explained that “primarily our focus has been in improving the Apache core, make Hadoop work on Windows and work great. Our partnership with Hortonworks just made sense. They are able to continue to push and have that cross platform capability, we are able to offer our customers a solution.”

                  Explaining that there were great discrepancies in how different companies in the same industries made use of the benefits of Big Data, he advised our viewers to “look at what the big companies are doing” embracing the data, and to look at what they are achieving with it.

                  As far as the future of the Big Data industry is concerned, Clark stated: “There’s a consistent meme of how is this embraced by business for results. Sometimes with the evolution of technology, everyone is exploring what it’s capable of.” Now there’s a focus shift of the industry towards what greater purpose it leads to, what businesses can accomplish.

                  @thecube

                  #BigDataNYC


                  4.4 Empower people-centric IT – Microsoft Virtual Desktop Infrastructure (VDI):

                  Microsoft Virtual Desktop Infrastructure (VDI) enables IT to deliver desktops and applications that employees can access from anywhere, on both personal and corporate devices. Centralizing and controlling applications and data through a virtual desktop enables your people to get their work done on the devices they choose while helping maintain compliance. Learn more: http://www.microsoft.com/msvdi
                  With dramatic growth in the number of mobile users and personal devices at work, and mounting pressure to comply with governmental regulations, IT organizations are increasingly turning to Microsoft Virtual Desktop Infrastructure (VDI) solutions. This session will provide an overview of Microsoft’s VDI solutions and will drill into some of the new, exciting capabilities that Windows Server 2012 R2 offers for VDI solutions.

                  In October, we announced Windows Server 2012 R2 which delivers several exciting improvements for VDI solutions. Among the benefits, Windows Server 2012 R2 reduces the cost per seat for VDI as well as enhances your end user’s experience. The following are just some of the features and benefits of Windows Server 2012 R2 for VDI:

                  • Online data deduplication on actively running VMs reduces storage capacity requirements by up to 90% on persistent desktops.
                  • Tiered storage spaces manage your tiers of storage (fast SSDs vs. slower HDDs) intelligently so that the most frequently accessed data blocks are automatically moved onto faster-tier drives. Likewise, older or seldom-accessed files are moved onto the cheaper and slower hard drives.
                  • The Microsoft Remote Desktop App provides easy access from a variety of devices and platforms including Windows, Windows RT, iOS, Mac OS X and Android. This is good news for your end users and your mobility/BYOD strategy!
                  • Your user experience is also enhanced due to improvements on several fronts including RemoteFX, DirectX 11.1 support, RemoteApp, quick reconnect, session shadowing, dynamic monitor and resolution changes.
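
                  A rough sketch of why deduplication pays off so dramatically for pooled VDI desktops: clones of one golden image share almost all of their blocks, so a store that hashes fixed-size blocks and keeps each unique block only once approaches the savings quoted above. This toy Python model illustrates the general block-dedup technique, not Windows Server's implementation:

```python
import hashlib

BLOCK_SIZE = 4096  # bytes per block, a common dedup granularity

def dedup_ratio(images):
    """Fraction of block storage saved by keeping each unique block once."""
    total, unique = 0, set()
    for image in images:
        for i in range(0, len(image), BLOCK_SIZE):
            block = image[i:i + BLOCK_SIZE]
            total += 1
            unique.add(hashlib.sha256(block).digest())
    return 1 - len(unique) / total

# 50 desktops cloned from one "golden" OS image: blocks are all identical
golden = bytes(BLOCK_SIZE * 100)
vms = [golden for _ in range(50)]
print(f"saved: {dedup_ratio(vms):.0%}")  # near-total savings for identical clones
```

                  Real desktops diverge as users work, which is why the headline figure is "up to 90%" rather than the near-100% of this idealized case.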

                  If your VDI solutions run on Dell servers or if you are looking at deploying new VDI infrastructure, we are excited to let you know about the work we have been doing in partnership with Dell around VDI. Dell recently updated their Desktop Virtualization Solution (DVS) for Windows Server to support Windows Server 2012 R2, and DVS now delivers all of the benefits mentioned above. Dell is also delivering additional enhancements into Dell DVS for Windows Server so it will also support:

                  • Windows 8.1 with touch screen devices and new Intel Haswell processors
                  • Unified Communication with Lync 2013, via an endpoint plug-in that enables P2P audio and video. (Dell Wyse has certified selected Windows thin clients to this effect, such as the D90 and Z90.)
                  • Virtualized shared graphics on NVidia GRID K1/K2 and AMD FirePro cards using Microsoft RemoteFX technology
                  • Affordable persistent desktops
                  • Highly-secure and dual/quad core Dell Wyse thin clients, for a true end-to-end capability, even when using high-end server graphics cards or running UC on Lync 2013
                  • Optional Dell vWorkspace software, also supporting Windows Server 2012 R2, that brings scalability to tens of thousands of seats, advanced VM provisioning, IOPS efficiency to reduce storage requirement and improve performance, diagnostics and monitoring, flexible resource assignments, support for multi-tenancy and more.
                  • Availability in more than 30 countries

                  Depending on where you stand in the VDI deployment cycle in your organization, Dell DVS for Windows Server is already supported today on multiple Dell PowerEdge server platforms:

                  • The T110 for a pilot/POC up to 10 seats
                  • The VRTX for implementation in a remote or branch office of up to about 500 users
                  • The R720 for a traditional enterprise-like, flexible and scalable implementation to several thousand seats. It supports flexible deployments such as application virtualization, RDSH, pooled and persistent VMs.

                  This week, Microsoft and Dell will present a technology showcase at Dell World in Austin (TX), USA. If you happen to be at the show, you will be able to see for yourself how well Windows Server 2012 R2 and Windows 8.1 integrate into Dell DVS. We will show:

                  • The single management console of Windows Server 2012 installed on a Dell PowerEdge VRTX, demonstrating how easy it can be for an IT administrator to manage VDI workloads based on Hyper-V in a remote or branch office environment
                  • How users can chat, talk, share, meet, transfer files and conduct video conferencing within virtualized desktops set up for unified communication
                  • That you can watch HD multimedia and 3D graphics files on multiple virtual desktops sharing a graphic card installed remotely in a server
                  • How affordable it is to run persistent desktops with DVS and Windows Server 2012 R2

                  We are excited about the work that we are doing with Dell around VDI and hope you have a chance to come visit our joint VDI showcase in Austin. We will be located in the middle of the Dell booth in show expo hall. Also, we will show a VDI demo as part of the Microsoft Cloud OS breakout session at noon on Thursday (December 12th ) in room 9AB. Finally, we will show a longer VDI demo in the show expo theater (next to the Microsoft booth) at 10am on Friday (December 13th ) morning. We are looking forward to seeing you there.

                  With the Microsoft Remote Desktop app, you can connect to a remote PC and your work resources from almost anywhere. Experience the power of Windows with RemoteFX in a Remote Desktop client designed to help you get your work done wherever you are.

                  Post from Brad Anderson,
                  Corporate Vice President of Windows Server & System Center at Microsoft.

                  As of yesterday afternoon, the Microsoft Remote Desktop App is available in the Android, iOS, and Mac stores (see screen shots below). There was a time, in the very recent past, when many thought something like this would never happen.

                  If your company has users who work on iPads, Android, and Windows RT devices, you also likely have a strategy (or at least a point of view) for how you will deliver Windows applications to those devices. With the Remote Desktop App and the 2012 R2 platforms made available earlier today, you now have a great solution from Microsoft to deliver Windows applications to your users across all the devices they are using.

                  As I have written about before, one of the things I am actively encouraging organizations to do is to step back and look at their strategy for delivering applications and protecting data across all of their devices. Today, most enterprises are using different tools for enabling users on PCs, and then they deploy another tool for enabling users on their tablets and smart phones. This kind of overhead and the associated costs are unnecessary – but, even more important (or maybe I should say worse), is that your end-users therefore have different and fragmented experiences as they transition across their various devices. A big part of an IT team’s job must be to radically simplify the experience end users have in accomplishing their work – and users are doing that work across all their devices.

                  I keep bolding “all” here because I am really trying to make a point:  Let’s stop thinking about PCs and devices in a fragmented way. What we are trying to accomplish is pretty straightforward: Enable users to access the apps and data they need to be productive in a way that can ensure the corporate assets are secure. Notice that nowhere in that sentence did I mention devices. We should stop talking about PC Lifecycle Management, Mobile Device Management and Mobile Application Management – and instead focus our conversation on how we are enabling users. We need a user-enablement Magic Quadrant!

                  OK – stepping off my soapbox. Smile

                  Delivering Windows applications in a server-computing model, through solutions like Remote Desktop Services, is a key requirement in your strategy for application access management. But keep in mind that this is only one of many ways applications can be delivered – and we should consider and account for all of them.

                  For example, you also have to consider Win32 apps running in a distributed model, modern Windows apps, iOS native apps (side-loaded and deep-linked), Android native apps (side-loaded and deep-linked), SaaS applications, and web applications.

                  Things have really changed from just 5 years ago when we really only had to worry about Windows apps being delivered to Windows devices.

                  As you are rethinking your application access strategy, you need solutions that enable you to intelligently manage all these applications types across all the devices your workforce will use.

                  You should also consider that the Remote Desktop Apps released yesterday are proof of Microsoft’s commitment to enable you to have a single solution to manage all the devices your users will use.

                  Microsoft describes itself as a “devices and services company.” Let me provide a little more insight into this.

                  Devices: We will do everything we can to earn your business on Windows devices.

                  Services: We will light up those Windows devices with the cloud services that we build, and these cloud services will also light up all (there’s that bold again) your other devices.

                  The funny thing about cloud services is that they want every device possible to connect to them – we are working to make sure the cloud services that we are building for the enterprise will bring value to all (again!) the devices your users will want to use – whether those are Windows, iOS, or Android.

                  The RDP clients that we released into the stores yesterday are not v1 apps. Back in June, we acquired IP assets from an organization in Austria (HLW Software Development GmbH) that had been building and delivering RDP clients for a number of years. In fact, there were more than 1 million downloads of their RDP clients from the Apple and Android stores.  The team has done an incredible job using them as a base for development of our Remote Desktop App, creating a very simple and compelling experience on iOS, Mac OS X and Android. You should definitely give them a try!

                  Also: Did I mention they are free?

                  To start using the Microsoft Remote Desktop App for any of these platforms, simply follow these links:

                  Setup:
                  • Windows 8.1 Pro running on a slow netbook, a BenQ Joybook Lite U101 with an Atom N270!
                  • HTC One X running Android 4.2.2
                  • HTC Fly running Android 3.2.1
                  How to: http://android-er.blogspot.com/2013/10/basic-setup-for-microsoft-remote.html

                  Satya Nadella’s (?the next Microsoft CEO?) next ten years’ vision of “digitizing everything”, Microsoft opportunities and challenges seen by him with that, and the case of Big Data

                  … as one of the crucial issues for that (in addition to the cloud, mobility and Internet-of-Things), via the current tipping point as per Microsoft, and the upcoming revolution in that as per Intel

                  Satya Nadella, Cloud & Enterprise Group, Microsoft and Om Malik, Founder & Senior Writer, GigaOM [LeWeb YouTube channel, Dec 10, 2013]

                  Satya is responsible for building and running Microsoft’s computing platforms, developer tools and cloud services. He and his team deliver the “Cloud OS.” Rumored to be on the short list for CEO, he shares his views on the future. [Interviewed during the “Plenary I” devoted to “The Next 10 years” at Day 1 on Dec 10, 2013.]

                  And why will I present Big Data after that? For a very simple reason: IMHO it is exactly in Big Data that Microsoft’s innovations have come to a point at which its technology has the best chance to become dominant and subsequently define the standard for the IT industry—resulting in “winner-take-all” economies of scale and scope. Whatever Intel is going to add to that in terms of “technologies for the next Big Data revolution” will only help Microsoft’s currently achieved innovative position even more. For this reason I will include the upcoming Intel innovations for Big Data here as well.

                  In this next-gen regard it is highly recommended to read also: Disaggregation in the next-generation datacenter and HP’s Moonshot approach for the upcoming HP CloudSystem “private cloud in-a-box” with the promised HP Cloud OS based on the 4-year-old OpenStack effort with others [‘Experiencing the Cloud’, Dec 12, 2013]!

                  Now the detailed discussion of Big Data:

                  Microsoft® makes Big Data work for you! [HP Discover YouTube channel, recorded on Dec 11; published on Dec 12, 2013]

                  [Doug Smith, Director, Emerging Technologies, Microsoft] Come and join our Innovation Theatre session to hear how customers are solving Big Data challenges in big ways jointly with HP!

                  The Garage Series: Unleashing Power BI for Office 365 [Office 365 technology blog, Nov 20, 2013]

                  In this week’s show, host Jeremy Chapman is joined by Michael Tejedor from the SQL Server team to discuss Power BI and show it in action. Power BI for Office 365 is a cloud-based solution that reduces the barriers to deploying a self-service Business Intelligence environment for sharing live Excel-based reports and data queries, with new features and services that enable easy data discovery and information access from anywhere. Michael draws up the self-service approach to Power BI as well as how public data can be queried and combined in a unified view within Excel. Then they walk through an end-to-end demo of Excel and Power BI components–Power Query [formerly known as “Data Explorer”], Power Pivot, Power View, Power Map [formerly known as product codename “Geoflow”] and Q&A–as they optimize the profitability of a bar and rein in bartenders with data.

                  Last week Mark Kashman and I went through the administrative controls of managing user access and mobile devices, but this week I’m joined by Michael Tejedor and we shift gears completely to talk data, databases and business intelligence. Back in July we announced Power BI for Office 365, a new service that, using the familiar tools within Excel, enables you to discover, analyze, visualize and share data in powerful ways. The solution includes Power Query, Power Pivot, Power View and Power Map, as well as a host of Power BI features including Q&A.

                  • Power Query [formerly known as “Data Explorer”] is a data search engine allowing you to query data from within your company and from external data sources on the Internet, all within Excel.
                  • Power Pivot lets you create flexible models within Excel that can process large data sets quickly using SQL Server’s in-memory database.
                  • Power View allows you to manipulate data and compile it into charts, graphs and other visualizations. It’s great for presentations and reports.
                  • Power Map [formerly known as product codename “Geoflow”] is a 3D data visualization tool for mapping, exploring and interacting with geographic and temporal data.
                  • Q&A is a natural language query engine that lets users easily query data using common terms and phrases.
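
                  The self-service pattern behind these tools – pulling data from several sources into one model, then querying it – can be sketched outside Excel too. Here is a hedged, stdlib-only Python illustration (the file contents and column names are invented) of joining an internal table with an external dataset on a shared key, the kind of merge Power Query performs behind the scenes:

```python
import csv
import io

internal = "sku,sales\nA1,120\nB2,80\n"           # e.g. exported from SQL Server
external = "sku,category\nA1,gin\nB2,vermouth\n"  # e.g. a public dataset

def load(text):
    """Parse CSV text into a list of row dicts."""
    return list(csv.DictReader(io.StringIO(text)))

# Join the two sources on the shared 'sku' key.
categories = {row["sku"]: row["category"] for row in load(external)}
merged = [{**row, "category": categories.get(row["sku"], "unknown")}
          for row in load(internal)]

print(merged[0])  # {'sku': 'A1', 'sales': '120', 'category': 'gin'}
```

                  Power Query does this interactively and at much larger scale, but the underlying operation is the same lookup-and-combine shown here.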

                  In many cases, the process to get custom reports and dashboards from the people running your databases, sales or operations systems is something like submitting a request to your database administrator and a few phone calls or meetings to get what you want. Coming from a logistics and operations management background, I know it could easily take 2 or 3 weeks to make even minor tweaks to an operational dashboard. Now you can use something familiar – Excel – in a self-service way to hook into your local databases, Excel flat files, modern data sources like Hadoop, or public data sources via Power Query and the data catalogue. All of these data sources can be combined to create powerful insights and data visualizations, all of which can be easily and securely shared with the people you work with through the Power BI for Office 365 service.

                  Of course all of this sounds great, but you can’t really get a feel for it until you see it. Michael and team built out a great demo themed after a bar and using data to track alcohol profitability, pour precision per bartender and Q&A to query all of this using normal query terms. You’ll want to watch the show to see how everything turns out and of course to see all of these power tools in action. Of course if you want to kick the tires and try Power BI for Office 365, you can register for the preview now.

                  Intel: technologies for the next Big Data revolution [HP Discover YouTube channel, recorded on Dec 11; published on Dec 12, 2013]

                  [Patrick Buddenbaum, Director, Enterprise Segment, Intel Corporation at HP Discover Barcelona 2013 on Dec 11, 11:40 AM – 12:00 PM] HP and Intel share the belief that every organization and individual should be able to unlock intelligence from the world’s ever increasing set of data sources—the Internet of Things.

                   

                  Related “current tipping point” announcements from Microsoft:

                  From: Organizations Speed Business Results With New Appliances From HP and Microsoft [joint press release, Jan 18, 2011]

                  New solutions for business intelligence, data warehouse, messaging and database consolidation help increase employee productivity and reduce IT complexity.

                  … The HP Business Decision Appliance is available now to run business intelligence services ….

                  Delivering on the companies’ extended partnership announced a year ago, the new converged application appliances from HP and Microsoft are the industry’s first systems designed for IT, as well as end users. They deliver application services such as business intelligence, data warehousing, online transaction processing and messaging. The jointly engineered appliances, and related consulting and support services, enable IT to deliver critical business applications in as little as one hour, compared with potentially months needed for traditional systems.3 One of the solutions already offered by HP and Microsoft — the HP Enterprise Data Warehouse Appliance — delivers up to 200 times faster queries and 10 times the scalability of traditional Microsoft SQL Server deployments.4

                  With the HP Business Decision Appliance, HP and Microsoft have greatly reduced the time and effort it takes for IT to configure, deploy and manage a comprehensive business intelligence solution, compared with a traditional business intelligence solution where applications, infrastructure and productivity tools are not pre-integrated. This appliance is optimized for Microsoft SQL Server and Microsoft SharePoint and can be installed and configured by IT in less than one hour.

                  The solution enables end users to share data analyses built with Microsoft’s award-winning5 PowerPivot for Excel 2010 and collaborate with others in SharePoint 2010. It allows IT to centrally audit, monitor and manage solutions created by end users from a single dashboard.

                  Availability and Pricing6

                  • The HP Business Decision Appliance with three years of HP 24×7 hardware and software support services is available today from HP and HP/Microsoft Frontline channel partners for less than $28,000 (ERP). Microsoft SQL Server 2008 R2 and Microsoft SharePoint 2010 are licensed separately.

                  • The HP Enterprise Data Warehouse Appliance with services for site assessment, installation and startup, as well as three years of HP 24×7 hardware and software support services, is available today from HP and HP/Microsoft Frontline channel partners starting at less than $2 million. Microsoft SQL Server 2008 R2 Parallel Data Warehouse is licensed separately.

                  3 Based on HP’s experience with customers using HP Business Decision Appliance.
                  4 SQL Server Parallel Data Warehouse (PDW) has been evaluated by 16 early adopter customers in six different industries. Customers compared PDW with their existing environments and saw typically 40x and up to 200x improvement in query times.
                  5 Messaging and Online Collaboration Reviews, Nov. 30, 2010, eWEEK.com.
                  6 Estimated retail U.S. prices. Actual prices may vary.

                  From: HP Delivers Enterprise Agility with New Converged Infrastructure Solutions [press release, June 6, 2011]

                  HP today announced several industry-first Converged Infrastructure solutions that improve enterprise agility by simplifying deployment and speeding IT delivery.

                  Converged Systems accelerate time to application value

                  HP Converged Systems speed solution deployment by providing a common architecture, management and security model across virtualization, cloud and dedicated application environments. They include:

                  • HP AppSystem maximizes performance while simplifying deployment and application management. These systems offer best practice operations with a standard architecture that lowers total cost of ownership. Among the new systems are HP Vertica Analytics System, as well as HP Database Consolidation Solution and HP Business Data Warehouse Appliance, which are both optimized for Microsoft SQL Server 2008 R2.

                  From: Microsoft Expands Data Platform With SQL Server 2012, New Investments for Managing Any Data, Any Size, Anywhere [press release, Oct 12, 2011]

                  New technologies will give businesses a universal platform for data management, access and collaboration.

                  … Kummert described how SQL Server 2012, formerly code-named “Denali,” addresses the growing challenges of data and device proliferation by enabling customers to rapidly unlock and extend business insights, both in traditional datacenters and through public and private clouds. Extending on this foundation, Kummert also announced new investments to help customers manage “big data,” including an Apache Hadoop-based distribution for Windows Server and Windows Azure and a strategic partnership with Hortonworks Inc. …

                  The company also made available final versions of the Hadoop Connectors for SQL Server and Parallel Data Warehouse. Customers can use these connectors to integrate Hadoop with their existing SQL Server environments to better manage data across all types and forms.
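
                  Conceptually, such a connector serializes relational result sets into the delimited text files Hadoop jobs consume (and back again for the import direction). A hedged, stdlib-only sketch of the export direction, with sqlite3 standing in for SQL Server and an invented 'sales' table; the real connectors are bulk tools with far more machinery:

```python
import csv
import io
import sqlite3

# Stand-in for a SQL Server table (sqlite3 used purely for illustration).
db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE sales (sku TEXT, qty INTEGER)")
db.executemany("INSERT INTO sales VALUES (?, ?)", [("A1", 3), ("B2", 7)])

def export_tsv(conn, table):
    """Dump a table to tab-separated text, a format Hadoop jobs consume.

    The table name is trusted here; a real tool would validate it.
    """
    out = io.StringIO()
    writer = csv.writer(out, delimiter="\t", lineterminator="\n")
    for row in conn.execute(f"SELECT * FROM {table}"):
        writer.writerow(row)
    return out.getvalue()

print(export_tsv(db, "sales"))  # two tab-separated rows, ready for HDFS ingestion
```

                  The import direction reverses this: parse delimited HDFS output and bulk-insert it back into the relational store.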

                  SQL Server 2012 delivers a powerful new set of capabilities for mission-critical workloads, business intelligence and hybrid IT across traditional datacenters and private and public clouds. Features such as Power View (formerly Project “Crescent”) and SQL Server Data Tools (formerly “Juneau”) expand the self-service BI capabilities delivered with PowerPivot, and provide an integrated development environment for SQL Server developers.

                  From: Microsoft Releases SQL Server 2012 to Help Customers Manage “Any Data, Any Size, Anywhere” [press release, March 6, 2012]

                  Microsoft’s next-generation data platform releases to manufacturing today.

                  REDMOND, Wash. — March 6, 2012 — Microsoft Corp. today announced that the latest version of the world’s most widely deployed data platform, Microsoft SQL Server 2012, has released to manufacturing. SQL Server 2012 helps address the challenges of increasing data volumes by rapidly turning data into actionable business insights. Expanding on Microsoft’s commitment to help customers manage any data, regardless of size, both on-premises and in the cloud, the company today also disclosed additional details regarding its plans to release an Apache Hadoop-based service for Windows Azure.

                  Tackling Big Data

                  IT research firm Gartner estimates that the volume of global data is growing at a rate of 59 percent per year, with 70 to 85 percent in unstructured form.* Furthering its commitment to connect SQL Server and rich business intelligence tools, such as Microsoft Excel, PowerPivot for Excel 2010 and Power View, with unstructured data, Microsoft announced plans to release an additional limited preview of an Apache Hadoop-based service for Windows Azure in the first half of 2012.

                  To help customers more cost-effectively manage their enterprise-scale workloads, Microsoft will release several new data warehousing solutions in conjunction with the general availability of SQL Server 2012, slated to begin April 1. This includes a major software update and new half-rack form factors for Microsoft Parallel Data Warehouse appliances, as well as availability of SQL Server Fast Track Data Warehouse reference architectures for SQL Server 2012.

                  Microsoft Simplifies Big Data for the Enterprise [press release, Oct 24, 2012]

                  New Apache Hadoop-compatible solutions for Windows Azure and Windows Server enable customers to easily extract insights from big data.

                  NEW YORK — Oct. 24, 2012 — Today at the O’Reilly Strata Conference + Hadoop World, Microsoft Corp. announced new previews of Windows Azure HDInsight Service and Microsoft HDInsight Server for Windows, the company’s Apache Hadoop-based solutions for Windows Azure and Windows Server. The new previews, available today at http://www.microsoft.com/bigdata, deliver Apache Hadoop compatibility for the enterprise and simplify deployment of Hadoop-based solutions. In addition, delivering these capabilities on the Windows Server and Azure platforms enables customers to use the familiar tools of Excel, PowerPivot for Excel and Power View to easily extract actionable insights from the data.

                  “Big data should provide answers for business, not complexity for IT,” said David Campbell, technical fellow, Microsoft. “Providing Hadoop compatibility on Windows Server and Azure dramatically lowers the barriers to setup and deployment and enables customers to pull insights from any data, any size, on-premises or in the cloud.”

                  The company also announced today an expanded partnership with Hortonworks, a commercial vendor of Hadoop, to give customers access to an enterprise-ready distribution of Hadoop with the newly released solutions.

                  “Hortonworks is the only provider of Apache Hadoop that ensures a 100 percent open source platform,” said Rob Bearden, CEO of Hortonworks. “Our expanded partnership with Microsoft empowers customers to build and deploy on platforms that are fully compatible with Apache Hadoop.”

                  More information about today’s news and working with big data can be found at http://www.microsoft.com/bigdata.

                  Choose the Right Strategy to Reap Big Value From Big Data [feature article for the press, Nov 13, 2012]

                  From devices to storage to analytics, technologies that work together will be key for business’ next information age.

                  REDMOND, Wash. — Nov. 13, 2012 — It seems the gigabyte is going the way of the megabyte — another humble unit of computational measurement that is becoming less and less relevant. Long live the terabyte, impossibly large, increasingly common.
                  Consider this: Of all the data that’s been collected in the world, more than 90 percent has been gathered in the last two years alone. According to a June 2011 report from the McKinsey Global Institute, 15 out of 17 industry sectors of the U.S. have more data stored — per company — than the U.S. Library of Congress.
                  The explosion in data has been catalyzed by several factors. Social media sites such as Facebook and Twitter are creating huge streams of unstructured data in the form of opinions, comments, trends and demographics arising from a vast and growing worldwide conversation.
                  And then there’s the emerging world of machine-generated information. The rise of intelligent systems and the Internet of Things means that more and more specialized devices are connected to information technology — think of a national retail chain that is connected to every one of its point-of-sale terminals across thousands of locations or an automotive plant that can centrally monitor hundreds of robots on the shop floor.
                  Combine it all and some industry observers are predicting that the amount of data stored by organizations across industries will increase ten-fold every five years, much of it coming from new streams that haven’t yet been tapped.
It truly is a new information age, and the opportunity is huge. The McKinsey Global Institute estimates that the U.S. health care system, for example, could save as much as $300 billion from more effective use of data. In Europe, public sector organizations alone stand to save 250 billion euros.
                  In the ever-competitive world of business, data strategy is becoming the next big competitive advantage. According to analyst firm Gartner Group,* “By tapping a continual stream of information from internal and external sources, businesses today have an endless array of new opportunities for: transforming decision-making; discovering new insights; optimizing the business; and innovating their industries.”
                  According to Microsoft’s Ted Kummert, corporate vice president of the Business Platforms Division, companies addressing this challenge today may wonder where to start. How do you know which data to store without knowing what you want to measure? But then again, how do you know what insights the data holds without having it in the first place?
                  “There is latent value in the data itself,” Kummert says. “The good news is storage costs are making it economical to store the data. But that still leaves the question of how to manage it and gain value from it to move your business forward.”
With new data services in the cloud, such as Windows Azure HDInsight Service, Microsoft HDInsight Server for Windows, and Microsoft’s Apache Hadoop-based solutions for Windows Azure and Windows Server, organizations can afford to capture valuable data streams now, while they develop their strategy, without making a huge financial bet on a six-month, multimillion-dollar datacenter project.
                  Just having access to the data, says Kummert, can allow companies to start asking much more complicated questions, combining information sources such as geolocation or weather information with internal operational trends such as transaction volume.
                  “In the end, big data is not just about holding lots of information,” he says. “It’s about how you harness it. It’s about insight, allowing end users to get the answers they need and doing so with the tools they use every day, whether that’s desktop applications, devices at the network edge or something else.”
                  His point is often overlooked with all the abstract talk of big data. In the end, it’s still about people, so making it easier for information workers to shift to a new world in which data is paramount is just as important as the information itself. Information technology is great at providing answers, but it still doesn’t know how to ask the right questions, and that’s where having the right analytics tools and applications can help companies make the leap from simply storing mountains of data to actually working with it.
                  That’s why in the Windows 8 world, Kummert says, the platform is designed to extend from devices and phones to servers and services, allowing companies to build a cohesive data strategy from end to end with the ultimate goal of empowering workers.
                  “When we talk about the Microsoft big data platform, we have all of the components to achieve exactly that,” Kummert says. “From the Windows Embedded platform to the Microsoft SQL Server stack through to the Microsoft Office stack. We have all the components to collect the data, store it securely and make it easier for information workers to find it — and, more importantly, understand what it means.”
                  For more information on building intelligent systems to get the most out of business data, please visit the Windows Embedded home page.
                  * Gartner, “Gartner Says Big Data Creates Big Jobs: 4.4 Million IT Jobs Globally to Support Big Data By 2015,” October 2012

                  Which data management solution delivers against today’s top six requirements? [The HP Blog Hub, March 25, 2013]

                  By Manoj Suvarna – Director, Product Management, HP AppSystems

In my last post I talked about the six key requirements I believe a data management solution should deliver against today, namely:

                  1.      High performance

                  2.      Fast time to value

                  3.      Built with Big Data as a priority

4.      Simplified management

5.      Low cost

                  6.      Proven expertise

Today, 25th March 2013, HP has announced the HP AppSystem for Microsoft SQL Server 2012 Parallel Data Warehouse, a comprehensive data warehouse solution jointly engineered with Microsoft, with a wide array of complementary tools, to effectively manage, store, and unlock valuable business insights.

                  Let’s take a look at how the solution delivers against each of the key requirements in turn:

                  1  High performance

With its MPP (Massively Parallel Processing) engine and ‘shared nothing’ architecture, the HP AppSystem for Parallel Data Warehouse can deliver linear scale, from configurations supporting small terabyte requirements all the way up to configurations supporting six petabytes of data.

                  The solution features the latest HP ProLiant Gen8 servers, with InfiniBand FDR networking, and uses the xVelocity in-memory analytics engine and the xVelocity memory-optimized columnstore index feature in Microsoft SQL Server 2012 to greatly enhance query performance. 

                  The combination of Microsoft software with HP Converged Infrastructure means HP AppSystem for Parallel Data Warehouse offers leading performance for complex workloads, with up to 100x faster query performance and a 30% faster scan rate than previous generations.

                  2  Fast time to value

HP AppSystem for Parallel Data Warehouse is a factory-built, turnkey system, delivered complete from HP’s factory as an integrated set of hardware and software including servers, storage, networking, tools, services, and support. Not only is the solution pre-integrated, but it’s backed by unique, collaborative HP and Microsoft support, with onsite installation and deployment services to smooth implementation.

                  3  Built with Big Data as a priority

                  Designed to integrate with Hadoop, HP AppSystem for Parallel Data Warehouse is ideally suited for “Big Data” environments. This integration allows customers to perform comprehensive analytics on unstructured, semi-structured and structured data, to effectively gain business insights and make better, faster decisions.

                  4  Simplified management

                  Providing the optimal management environment has been a critical element of the design, and is delivered through HP Support Pack Utility Suite.  This set of tools simplifies updates and several other maintenance tasks across the system to ensure that it is continually running at optimal performance.  Unique in the industry, HP Support Pack Utility Suite can deliver up to 2000 firmware updates with the click of a button.  In addition, the HP AppSystem for Parallel Data Warehouse is manageable via the Microsoft System Center console, leveraging deep integration with HP Insight Control.

                  5  Low cost

                  The HP AppSystem for Parallel Data Warehouse has been designed as part of an end to end stack for data management, integrating data warehousing seamlessly with BI solutions to minimize the cost of ownership.

It has also been redesigned with a new form factor to minimize space and maximize ease of expansion, which means the entry point for a quarter-rack system is approximately 35% less expensive than the previous-generation solution. It is expandable in modular increments up to 64 nodes, so there is no need for the kind of fork-lift upgrade a proprietary solution might require, and it is targeted to be approximately half the cost per TB of comparable offerings in the market from Oracle, IBM, and EMC*.

                  6 Proven expertise

Together, HP and Microsoft have over 30 years’ experience delivering integrated solutions from desktop to datacenter. HP AppSystem for Parallel Data Warehouse completes the portfolio of HP Data Management solutions, which give customers the ability to deliver insights on any data, of any size, combining best-in-class Microsoft software with HP Converged Infrastructure.

                  For customers, our ability to deliver on the requirements above ultimately provides agility for faster, lower risk deployment of data management in the enterprise, helping them make key business decisions more quickly and drive more value to the organization.

                  If you’d like to find out more, please go to www.hp.com/solutions/microsoft/pdw.

                  http://www.valueprism.com/resources/resources/Resources/PDW%20Compete%20Pricing%20FINAL.pdf

                  HP AppSystem for SQL 2012 Parallel Data Warehouse [HP product page, March 25, 2013]

                  Overview

                  Rapid time-to-value data warehouse solution

The HP AppSystem for Microsoft SQL Server 2012 Parallel Data Warehouse, jointly engineered, built and supported with Microsoft, is for customers who have run up against the limitations and inefficiencies of their legacy data warehouse infrastructure. This converged system solution delivers significant advances over the previous-generation solution, including:

                  Enhanced performance and massive scalability

                  • Up to 100x faster query performance and a 30% faster scan rate
                  • Ability to start from small terabyte requirements that can  linearly scale out to 6 Petabytes for mission critical needs

                  Minimize costs and management complexity

• Redesigned form factor minimizes space and allows ease of expansion, with significant up-front acquisition savings as well as reduced OPEX for heating, cooling and real estate costs
                  • Appliance  solution is pre-built and tested as a complete, end-to-end stack — easy to deploy and minimal technical resources required
• Extensive integration of Microsoft and third-party tools allows users to work with familiar tools like Excel as well as within heterogeneous BI environments
• The unique HP Support Pack Utility Suite set of tools significantly simplifies updates and other maintenance tasks to ensure the system is running at optimal performance

                  Reduce risks and manage change

                  • Services delivered jointly under a unique collaborative support agreement, integrated across hardware and software, to help avoid IT disruptions and deliver faster resolution to issues
                  • Backed by more than 48,000 Microsoft professionals—with more than 12,000 Microsoft Certified—one of the largest, most specialized forces of consultants and support professionals for Microsoft environments in the world

Solution Components

• HP Products
• HP Services
• HP Software
• Partner’s Software
• HP Support

                        [also available with Dell Parallel Data Warehouse Appliance]
                        Appliance: Parallel Data Warehouse (PDW) [Microsoft PDW Software product page, Feb 27, 2013]

                        PDW is a massively parallel processing data warehousing appliance built for any volume of relational data (with up to 100x performance gains) and provides the simplest integration to Hadoop.

                        Unlike other vendors who opt to provide their high-end appliances for a high price or provide a relational data warehouse appliance that is disconnected from their “Big Data” and/or BI offerings, Microsoft SQL Server Parallel Data Warehouse provides both a high-end massively parallel processing appliance that can improve your query response times up to 100x over legacy solutions as well as seamless integration to both Hadoop and with familiar business intelligence solutions. What’s more, it was engineered to lower ongoing costs resulting in a solution that has the lowest price/terabyte in the market.

                        What’s New in SQL Server 2012 Parallel Data Warehouse

                        Key Capabilities

                        • Built For Big Data with PolyBase

                          SQL Server 2012 Parallel Data Warehouse introduces PolyBase, a fundamental breakthrough in data processing used to enable seamless integration between traditional data warehouses and “Big Data” deployments.

                          • Use standard SQL queries (instead of MapReduce) to access and join Hadoop data with relational data.
                          • Query Hadoop data without IT having to pre-load data first into the warehouse.
                          • Native Microsoft BI Integration allowing analysis of relational and non-relational data with familiar tools like Excel.
                        • Next-Generation Performance at Scale

Scale and perform beyond your traditional SQL Server deployment with PDW’s massively parallel processing (MPP) appliance, which can handle the performance and scale extremes of your largest mission-critical requirements.

                          • Up to 100x faster than legacy warehouses with xVelocity updateable columnstore.
                          • Massively Parallel Processing (MPP) architecture that parallelizes and distributes computing for high query concurrency and complexity.
                          • Rest assured with built-in hardware redundancies for fault tolerance.
                          • Rely on Microsoft as your single point of contact for hardware and software support.
                        • Engineered For Optimal Value

                          Unlike other vendors in the data warehousing space who deliver a high-end appliance at a high price, Microsoft engineered PDW for optimal value by lowering the cost of the appliance.

                          • Resilient, scalable, and high performance storage features built into software lowering hardware costs.
                          • Compress data up to 15x with the xVelocity updateable columnstore saving up to 70% of storage requirements.
                          • Start small with a quarter rack allowing you to right-size the appliance rather than over-acquiring capacity.
• Use the same tools and knowledge as SQL Server, without having to learn new tools or skills for scale-out DW or Big Data.
• Co-engineered with hardware partners for the highest level of product integration, and shipped to your door for the fastest time to value.
• The lowest price per terabyte in the overall appliance market (and 2.5x lower than SQL Server 2008 R2 PDW).
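The MPP, shared-nothing architecture behind these claims boils down to a scatter-gather pattern: rows are hash-distributed across compute nodes, each node scans only its own partition, and a control node merges the partial results. A minimal single-process sketch of the idea (in a real appliance the partitions live on separate physical servers; the function and column names here are purely illustrative):

```python
def partial_aggregate(partition):
    """Each 'compute node' scans only its own partition (shared nothing)."""
    return sum(row["amount"] for row in partition)

def mpp_sum(rows, nodes=4):
    # Scatter: hash-distribute rows across the compute nodes.
    partitions = [[] for _ in range(nodes)]
    for row in rows:
        partitions[hash(row["key"]) % nodes].append(row)
    # Gather: the control node merges the partial aggregates.
    return sum(partial_aggregate(p) for p in partitions)

rows = [{"key": i, "amount": float(i)} for i in range(100)]
print(mpp_sum(rows))  # 4950.0
```

Because no partition is shared, adding nodes adds scan capacity roughly linearly, which is the basis of the "linear scale" claims above.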

                          PolyBase [Microsoft page, Feb 26, 2013]

                          PolyBase is a fundamental breakthrough in data processing used in SQL Server 2012 Parallel Data Warehouse to enable truly integrated query across Hadoop and relational data.

                          Complementing Microsoft’s overall Big Data strategy, PolyBase is a breakthrough new technology on the data processing engine in SQL Server 2012 Parallel Data Warehouse designed as the simplest way to combine non-relational data and traditional relational data in your analysis. While customers would normally burden IT to pre-populate the warehouse with Hadoop data or undergo an extensive training on MapReduce in order to query non-relational data, PolyBase does this all seamlessly giving you the benefits of “Big Data” without the complexities.

                          Key Capabilities

                          • Unifies Relational and Non-relational Data

PolyBase is one of the most exciting technologies to emerge in recent times because it unifies the relational and non-relational worlds at the query level. Instead of learning a new query model like MapReduce, customers can leverage what they already know (T-SQL).

                            • Integrated Query: Accepts a standard T-SQL query that joins tables containing a relational source with tables in a Hadoop cluster without needing to learn MapReduce.
                            • Advanced query options: Apart from simple SELECT queries, users can perform JOINs and GROUP BYs on data in the Hadoop cluster.
                          • Enables In-place Queries with Familiar BI Tools

                            Microsoft Business Intelligence (BI) integration enables users to connect to PDW with familiar tools such as Microsoft Excel, to create compelling visualizations and make key business decisions from structured or unstructured data quickly.

• Integrated BI tools: End users can connect to both relational and Hadoop data with Excel, which abstracts the complexities of both.
                            • Interactive visualizations: Explore data residing in HDFS using Power View for immersive interactivity and visualizations.
                            • Query in-place: IT doesn’t have to pre-load or pre-move data from Hadoop into the data warehouse and pre-join the data before end users do the analysis.
                          • Part of an Overall Microsoft Big Data Story

                            PolyBase is part of an overall Microsoft “Big Data” solution that already includes HDInsight (a 100% Apache Hadoop compatible distribution for Windows Server and Windows Azure), Microsoft Business Intelligence, and SQL Server 2012 Parallel Data Warehouse.

• Integrated with HDInsight: PolyBase can source the non-relational analysis from Microsoft’s 100% Apache Hadoop-compatible distribution, HDInsight.
                            • Built into PDW: PolyBase is built into SQL Server 2012 Parallel Data Warehouse to bring “Big Data” benefits within the power of a traditional data warehouse.
                            • Integrated BI tools: PolyBase has native integration with familiar BI tools like Excel (through Power View and PowerPivot).

                          Announcing Power BI for Office 365 [Office News, July 8, 2013]

                          Today, at the Worldwide Partner Conference, we announced a new offering–Power BI for Office 365. Power BI for Office 365 is a cloud-based business intelligence (BI) solution that enables our customers to easily gain insights from their data, working within Excel to analyze and visualize the data in…

                          Exciting new BI features in Excel [Excel Blog, July 9, 2013]

Yesterday during Microsoft’s Worldwide Partner Conference we announced some exciting new Business Intelligence (BI) features available for Excel. Specifically, we announced the expansion of the BI offerings available as part of Power BI, a cloud-based BI solution that enables our customers to easily gain insights from their data, working within Excel to analyze and visualize the data in a self-service way.

                          Power BI for Office 365 now includes:

• Power Query, enabling customers to easily search and access public data and their organization’s data, all within Excel (formerly known as “Data Explorer”).  Download details here
• Power Map, a 3D data visualization tool for mapping, exploring and interacting with geographic and temporal data (formerly known by the product codename “GeoFlow”).  Download details here.
                          • Power Pivot for creating and customizing flexible data models within Excel. 
                          • Power View for creating interactive charts, graphs and other visual representations of data.

                          Head on over to the Office 365 Technology Blog, Office News Blog, and Power BI site to learn more.

                          Clearing up some confusion around the Power BI “Release” [A.J. Mee’s Business Intelligence and Big Data Blog, Aug 13, 2013]

                          Hey folks.  Thanks again for checking out my blog.
                          Yesterday (8/12/2013), Power BI received some attention from the press.  Here’s one of the articles that I had seen talking about the “release” of Power BI:
                          http://www.neowin.net/news/microsoft-releases-power-bi-office-365-for-windows-8rt

                          Some of us inside Microsoft had to address all sorts of questions around this one.  For the most part, the questions revolved around the *scope* of what was actually released.  You have to remember that Power BI is a broad brand name that takes into account:

                          * Power Pivot/View/Query/Map (which is available now, for the most part)

                          * The Office 365 hosting of Power BI applications with cloud-to-on-premise data refresh, Natural Language query, data stewardship, etc..

                          * The Mobile BI app for Windows and iOS devices

                          Net-net: we announced the availability of the Mobile app (in preview form).  At present, it is only available on Windows 8 devices (x86 or ARM) – no iOS just yet.  The rest of the O365 / Power BI offering is yet to come.  Check out this article to find out how to sign up.
                          http://blogs.msdn.com/b/ajmee/archive/2013/07/17/how-can-i-check-out-power-bi.aspx
                          So, the headline story is really all around the Mobile app.  You can grab it today from the Store – just search on “Power BI” and it should be the first app that shows up.

                          From: Power Map for Excel earns new name with significant updates to 3D visualizations and storytelling [Excel Blog, Sept 25, 2013]

                          We are announcing a significant update to Power Map Preview for Excel (formerly Project codename “GeoFlow” Preview for Excel) on the Microsoft Download Center. Just over five months ago, we launched the preview of Project codename “GeoFlow” amidst a passionately announced “tour” of global song artists through the years by Amir Netz (see 1:17:00 in the keynote) at the first ever PASS Business Analytics conference in April. The 3D visualization add-in has now become a centerpiece visualization (along with Power View) within the business intelligence capabilities of Microsoft Power BI in Excel, earning the new name Power Map to align with other Excel features (Power Query, Power Pivot, and Power View).

Information workers with their data in Excel have realized the potential of Power Map to identify insights in their geospatial and time-based data that traditional 2D charts cannot. Digital marketers can better target and time their campaigns, while environmentally-conscious companies can fine-tune energy-saving programs across peak usage times. These are just a few examples of how location-based data is coming alive for customers using Power Map, distancing them from competitors who are still staring blankly at a flat table, chart, or map. Feedback from customers like this led us to introduce Power Map with some new features across the experience of mapping data, discovering insights, and sharing stories.

                          From: Microsoft unleashes fall wave of enterprise cloud solutions [press release, Oct 7, 2013]

                          New Windows Server, System Center, Visual Studio, Windows Azure, Windows Intune, SQL Server, and Dynamics solutions will accelerate cloud benefits for customers.

                          REDMOND, Wash. — Oct. 7, 2013 — Microsoft Corp. on Monday announced a wave of new enterprise products and services to help companies seize the opportunities of cloud computing and overcome today’s top IT challenges. Complementing Office 365 and other services, these new offerings deliver on Microsoft’s enterprise cloud strategy.

                          Data platform and insights

                          As part of its vision to help more people unlock actionable insights from big data, Microsoft next week will release a second preview of SQL Server 2014. The new version offers industry-leading in-memory technologies at no additional cost, giving customers 10 times to 30 times performance improvements without application rewrites or new hardware. SQL Server 2014 also works with Windows Azure to give customers built-in cloud backup and disaster recovery.

                          For big data analytics, later this month Microsoft will release Windows Azure HDInsight Service, an Apache Hadoop-based service that works with SQL Server and widely used business intelligence tools, such as Microsoft Excel and Power BI for Office 365. With Power BI, people can combine private and public data in the cloud for rich visualizations and fast insights.

                          How to take full advantage of Power BI in Excel 2013 [News from Microsoft Business UK, Oct 14, 2013]

                          The launch of Power BI features in Excel 2013 gives users an added range of options for data analysis and gaining business intelligence (BI). Power Query, Power Pivot, Power View, and Power Map work seamlessly together, making it much simpler to discover and visualise data. And for small businesses looking to take advantage of self-service intelligence solutions, this is a major stride forwards.

                          Power Query

                          With Power Query, users can search the entire cloud for data – both public and private. With access to multiple data sources, users can filter, shape, merge, and append the information, without the need to physically bring it in to Excel.

                          Once your query is shaped and filtered how you want it, you can download it into a worksheet in Excel, into the Data Model, or both. When you have the dataset you need, shaped and formed and properly merged, you can save the query that created it, and share it with other users.
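Power Query expresses such steps in its own formula language inside Excel; purely to illustrate the filter-merge-shape pipeline described above, here is a rough analogue in Python with made-up data (the sources and column names are hypothetical):

```python
# Two hypothetical sources: one "public" lookup table, one internal.
public_rates = [{"country": "UK", "vat": 0.20}, {"country": "DE", "vat": 0.19}]
orders = [
    {"id": 1, "country": "UK", "net": 100.0},
    {"id": 2, "country": "DE", "net": 50.0},
    {"id": 3, "country": "UK", "net": 0.0},
]

# Filter (drop empty orders), merge (look up VAT), shape (add a column).
rates = {r["country"]: r["vat"] for r in public_rates}
shaped = [
    {**o, "gross": round(o["net"] * (1 + rates[o["country"]]), 2)}
    for o in orders if o["net"] > 0
]
print(shaped)
```

In Power Query each of those steps is recorded declaratively, so the saved query can be re-run or shared, exactly as the paragraph above describes.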

                          Power Pivot

                          Power Pivot enables users to create their own data models from various sources, structured to meet individual needs. You can customise, extend with calculations and hierarchies, and manage the powerful Data Model that is part of Excel.

The solution works seamlessly and automatically with Power Query, and with other features of Power BI, allowing you to manage and extend your own custom database in the familiar environment of Excel. The entire Data Model in Power Pivot – including tables, columns, calculations and hierarchies – exists as report-ready elements in Power View.

                          Power View

                          Power View allows users to create engaging, interactive, and insightful visualisations with just a few clicks of their mouse. The tool brings the Data Model alive, turning queries into visual analysis and answers. Data can be presented in a variety of different forms, with the reports easily shareable and open for interactive analysis.

                          Power Map

A relatively new addition to Excel, Power Map is a geocentric and temporal mapping feature of Power BI. It brings location data into powerful, engaging 3D map visualisations. This allows users to create location-based reports, visualised over a time continuum, that tour the available data.

                          Using the features together

                          Power BI offers a collection of services which are designed to make self-service BI intuitive and collaborative. The solution combines the power and familiarity of Excel with collaboration and cloud-based functionality. This vastly increases users’ capacity to gather, manage and draw insights from data, ensuring they can make the most of business intelligence.

The various features of Power BI can add value independently, but the real value is in integration. When used in conjunction with one another, rather than in silos, the services become more than the sum of their parts. They are designed to work seamlessly together in Excel 2013, supporting users as they look to find data, process it and create visualisations which add value to the decision-making process.

                          Posted by Alex Boardman

                          Related upcoming technology announcements from Intel:

                          GraphBuilder: Revealing hidden structure within Big Data [Intel Labs blog, Dec 6, 2012]

                          By Ted Willke, Principal Engineer with Intel and the General Manager of the Graph Analytics Operation in Intel Labs.

                          Big Data.  Big.  Data.  We hear the term frequently used to describe data of unusual size or generated at spectacular velocity, like the amount of social data that Facebook has amassed on us (30 PB in one cluster) or the rate at which sensors at the Large Hadron Collider collect information on subatomic particles (15 PB/year).  And it’s often deemed “unstructured or semi-structured” to describe its lack of apparent, well, structure.  What’s meant is that this data isn’t organized in a way that can directly answer questions, like a database can if you ask it how many widgets you sold last week.

                          But Big Data does have structure; it just needs to be discovered from within the raw text, images, video, sensor data, etc., that comprise it.  And, companies, led by pioneers like Google, have been doing this for the better part of a decade, using applications that churn through the information using data-parallel processing and convenient frameworks for it, like Hadoop MapReduce.  Their systems chop the incoming data into slices, farm it out to masses of machines, which subsequently filter it, order it, sum it, transform it, and do just about anything you’d want to do with it, within the practical limits of the readily available frameworks.
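The data-parallel pattern described above can be sketched in miniature. The toy word count below shows the map, shuffle, and reduce phases that a framework like Hadoop MapReduce runs across masses of machines; here everything runs in one process, so it is only a conceptual sketch:

```python
from collections import defaultdict

def map_phase(slice_of_lines):
    """Map: emit (word, 1) pairs from one slice of the input."""
    return [(word.lower(), 1) for line in slice_of_lines for word in line.split()]

def shuffle(mapped_pairs):
    """Shuffle: group values by key, as the framework does between phases."""
    groups = defaultdict(list)
    for key, value in mapped_pairs:
        groups[key].append(value)
    return groups

def reduce_phase(groups):
    """Reduce: sum the counts for each word."""
    return {word: sum(counts) for word, counts in groups.items()}

# Input "chopped into slices", as a cluster would farm it out to machines.
slices = [["big data big"], ["data big"]]
mapped = [pair for s in slices for pair in map_phase(s)]
counts = reduce_phase(shuffle(mapped))
print(counts)  # {'big': 3, 'data': 2}
```

Filtering, ordering, summing, and transforming at scale are all variations on these same map and reduce steps.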

                          But until recently, only the wizards of Big Data were able to rapidly extract knowledge from a different type of structure within the data, a type that is best modeled by tree or graph structures.  Imagine the pattern of hyperlinks connecting Wikipedia pages or the connections between Tweeters and Followers on Twitter.  In these models, a line is drawn between two bits of information if they are related to each other in some way.  The nature of the connection can be less obvious than in these examples and made specifically to serve a particular algorithm.  For example, a popular form of machine learning called Latent Dirichlet Allocation (a mouthful, I know) can create “word clouds” of topics in a set of documents without being told the topics in advance. All it needs is a graph that connects word occurrences to the filenames.  Another algorithm can accurately guess the type of noun (i.e., person, place, or thing) if given a graph that connects noun phrases to surrounding context phrases.
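The word-occurrence-to-filename graph mentioned above is just a bipartite edge list, which is easy to sketch. The filenames and text below are made up; the point is the shape of the input such an algorithm consumes:

```python
# Hypothetical documents: filename -> raw text.
documents = {
    "doc1.txt": "cloud data cloud",
    "doc2.txt": "data warehouse",
}

# Bipartite edges: one (word, filename) edge per occurrence. This is the
# kind of graph a topic-modeling algorithm such as LDA takes as input.
edges = [(word, fname)
         for fname, text in documents.items()
         for word in text.split()]

print(sorted(edges))
```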

                          Many of these graphs are very large, with tens of billions of vertices (i.e., things being related) and hundreds of billions of edges (i.e., the relationships).  And, many that model natural phenomena possess power-law degree distributions, meaning that many vertices connect to a handful of others, but a few may have edges to a substantial portion of the vertices.  For instance, a graph of Twitter relationships would show that many people only have a few dozen followers while only a handful of celebrities have millions. This is all very problematic for parallel computation in general and MapReduce in particular.  As a result, Carlos Guestrin and his crack team at the University of Washington in Seattle have developed a new framework, called GraphLab, that is specifically designed for graph-based parallel machine learning.  In many cases, GraphLab can process such graphs 20-50X faster than Hadoop MapReduce.  Learn more about their exciting work here.
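That skew is easy to see even in a toy follower graph. The sketch below, with invented users, measures in-degree: the count that explodes for celebrity vertices and makes naive parallel partitioning so hard:

```python
from collections import Counter

# Toy follower graph: each edge points from follower to followee.
# One "celebrity" vertex attracts most of the edges, as in a power-law graph.
edges = [("u1", "celeb"), ("u2", "celeb"), ("u3", "celeb"),
         ("u4", "celeb"), ("u2", "u1"), ("u3", "u1")]

in_degree = Counter(followee for _, followee in edges)
print(in_degree.most_common())  # [('celeb', 4), ('u1', 2)]
```

In a real Twitter-scale graph, a handful of vertices like "celeb" can touch a substantial fraction of all edges, which is why frameworks such as GraphLab partition work differently than MapReduce does.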

                          Carlos is a member of the Intel Science and Technology Center for Cloud Computing, and we started working with him on graph-based machine learning and data mining challenges in 2011.  Quickly it became clear that no one had a good story about how to construct large-scale graphs that frameworks like GraphLab could digest.  His team was constantly writing scripts to construct different graphs from various unstructured data sources.  These scripts ran on a single machine and would take a very long time to execute.  Essentially, they were using a labor-intensive, low-performance method to feed information to their elegant high-performance GraphLab framework.  This simply would not do.

                          Scanning the environment, we identified a more general hole in the open source ecosystem: A number of systems were out there to process, store, visualize, and mine graphs but, surprisingly, not to construct them from unstructured sources.  So, we set out to develop a demo of a scalable graph construction library for Hadoop.  Yes, for Hadoop.  Hadoop is not good for graph-based machine learning but graph construction is another story.  This work became GraphBuilder, which was demonstrated in July at the First GraphLab Workshop on Large-Scale Machine Learning and open sourced this week at 01.org (under Apache 2.0 licensing).

                          GraphBuilder not only constructs large-scale graphs fast but also offloads many of the complexities of graph construction, including graph formation, cleaning, compression, partitioning, and serialization.  This makes it easy for just about anyone to build graphs for interesting research and commercial applications.  In fact, GraphBuilder makes it possible for a Java programmer to build an internet-scale graph for PageRank in about 100 lines of code and a Wikipedia-sized graph for LDA in about 130.
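For readers unfamiliar with the PageRank computation mentioned above, here is a minimal single-machine sketch of the algorithm itself (an illustration under my own assumptions, not the GraphBuilder job, which constructs the input graph rather than ranking it).

```python
def pagerank(adj, iters=20, d=0.85):
    """Minimal PageRank over an adjacency dict {node: [out-neighbors]}."""
    nodes = list(adj)
    n = len(nodes)
    rank = {v: 1.0 / n for v in nodes}
    for _ in range(iters):
        new = {v: (1 - d) / n for v in nodes}
        for v, outs in adj.items():
            if outs:
                share = d * rank[v] / len(outs)
                for w in outs:
                    new[w] += share
            else:  # dangling node: spread its rank evenly
                for w in nodes:
                    new[w] += d * rank[v] / n
        rank = new
    return rank

adj = {"a": ["b", "c"], "b": ["c"], "c": []}
r = pagerank(adj)
print(max(r, key=r.get))  # "c" receives the most rank
```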

                          This is only the beginning for GraphBuilder but it has already made a lot of connections.  We will continually update it with new capabilities, so please try it out and let us know if you’d value something in particular.  And, let us know if you’ve got an interesting graph problem for us to grind through.  We are always looking for new revelations.

                          Intel, Facebook Collaborate on Future Data Center Rack Technologies  [press release, Jan 16, 2013]

                          New Photonic Architecture Promises to Dramatically Change Next Decade of Disaggregated, Rack-Scale Server Designs

                          NEWS HIGHLIGHTS

                          • Intel and Facebook* are collaborating to define the next generation of rack technologies that enables the disaggregation of compute, network and storage resources.
                          • Quanta Computer* unveiled a mechanical prototype of the rack architecture to show the total cost, design and reliability improvement potential of disaggregation.
                          • The mechanical prototype includes Intel Silicon Photonics Technology, distributed input/output using Intel Ethernet switch silicon, and supports the Intel® Xeon® processor and the next-generation system-on-chip Intel® Atom™ processor code named “Avoton.”
                          • Intel has moved its silicon photonics efforts beyond research and development, and the company has produced engineering samples that run at speeds of up to 100 gigabits per second (Gbps).

                          OPEN COMPUTE SUMMIT, Santa Clara, Calif., Jan. 16, 2013 – Intel Corporation announced a collaboration with Facebook* to define the next generation of rack technologies used to power the world’s largest data centers. As part of the collaboration, the companies also unveiled a mechanical prototype built by Quanta Computer* that includes Intel’s new, innovative photonic rack architecture to show the total cost, design and reliability improvement potential of a disaggregated rack environment.

                          “Intel and Facebook are collaborating on a new disaggregated, rack-scale server architecture that enables independent upgrading of compute, network and storage subsystems that will define the future of mega-datacenter designs for the next decade,” said Justin Rattner, Intel’s chief technology officer during his keynote address at Open Compute Summit in Santa Clara, Calif. “The disaggregated rack architecture [since renamed RSA (Rack Scale Architecture)] includes Intel’s new photonic architecture, based on high-bandwidth, 100Gbps Intel® Silicon Photonics Technology, that enables fewer cables, increased bandwidth, farther reach and extreme power efficiency compared to today’s copper based interconnects.”

                          Rattner explained that the new architecture is based on more than a decade’s worth of research to invent a family of silicon-based photonic devices, including lasers, modulators and detectors using low-cost silicon to fully integrate photonic devices of unprecedented speed and energy efficiency. Silicon photonics is a new approach to using light (photons) to move huge amounts of data at very high speeds with extremely low power over a thin optical fiber rather than using electrical signals over a copper cable. Intel has spent the past two years proving its silicon photonics technology was production-worthy, and has now produced engineering samples.

                          Silicon photonics made with inexpensive silicon rather than expensive and exotic optical materials provides a distinct cost advantage over older optical technologies in addition to providing greater speed, reliability and scalability benefits. Businesses with server farms or massive data centers could eliminate performance bottlenecks and ensure long-term upgradability while saving significant operational costs in space and energy.

                          Silicon Photonics and Disaggregation Efficiencies

                          Businesses with large data centers can significantly reduce capital expenditure by disaggregating or separating compute and storage resources in a server rack. Rack disaggregation refers to the separation of those resources that currently exist in a rack, including compute, storage, networking and power distribution, into discrete modules. Traditionally, each server within a rack has its own group of resources. When disaggregated, resource types can be grouped together and distributed throughout the rack, improving upgradability, flexibility and reliability while lowering costs.

                          “We’re excited about the flexibility that these technologies can bring to hardware and how silicon photonics will enable us to interconnect these resources with less concern about their physical placement,” said Frank Frankovsky, chairman of the Open Compute Foundation and vice president of hardware design and supply chain at Facebook. “We’re confident that developing these technologies in the open and contributing them back to the Open Compute Project will yield an unprecedented pace of innovation, ultimately enabling the entire industry to close the utilization gap that exists with today’s systems designs.”

                          By separating critical components from one another, each computer resource can be upgraded on its own cadence without being coupled to the others. This provides increased lifespan for each resource and enables IT managers to replace just that resource instead of the entire system. This increased serviceability and flexibility drives improved total cost for infrastructure investments as well as higher levels of resiliency. Disaggregation also opens thermal-efficiency opportunities by allowing more optimal component placement within a rack.

                          The mechanical prototype is a demonstration of Intel’s photonic rack architecture for interconnecting the various resources, showing one of the ways compute, network and storage resources can be disaggregated within a rack. Intel will contribute a design for enabling a photonic receptacle to the Open Compute Project (OCP) and will work with Facebook*, Corning*, and others over time to standardize the design. The mechanical prototype includes distributed input/output (I/O) using Intel Ethernet switch silicon, and will support the Intel® Xeon® processor and the next generation, 22 nanometer system-on-chip (SoC) Intel® Atom™ processor, code named “Avoton” available this year.

                          The mechanical prototype shown today is the next evolution of rack disaggregation with separate distributed switching functions.

                          Intel and Facebook: A History of Collaboration and Contributions

                          Intel and Facebook have long been technology collaboration partners on hardware and software optimizations to drive more efficiency and scale for Facebook data centers. Intel is also a founding board member of the OCP, along with Facebook. Intel has several OCP engagements in flight, including working with the industry to design OCP boards for Intel Xeon and Intel Atom-based processors, support for cold storage with the Intel Atom processor, common hardware management, and future rack definitions, including enabling today’s photonic receptacle.

                          Disruptive technologies to unlock the power of Big Data [Intel Labs blog, Feb 26, 2013]

                          By Ted Willke, Principal Engineer with Intel and the General Manager of the Graph Analytics Operation in Intel Labs.

                          This week’s announcement by Intel that it’s expanding the availability of the Intel® Distribution for Apache Hadoop* software to the US market is seriously exciting for the employees of this semiconductor giant, especially researchers like me.  Why?  Why would I say this given the amount of overexposure that Hadoop has received?  I mean, isn’t this technology nearly 10 years old already??!!  Well, because the only thing I hear more than people touting Hadoop’s promise is people venting frustration in implementing it.  Rest assured that Intel is listening.  We get that users don’t want to make a career out of configuring Hadoop… debugging it…  managing it… and trying to figure out why the “insight” it’s supposed to be delivering often looks like meaningless noise.

                          Which brings me back to why this is a seriously exciting event for me.  With our product teams doing the heavy lifting of making the Hadoop framework less rigid and easier to use while keeping it inexpensive, Intel Labs gets a landing zone for some cool disruptive technologies. In December, I blogged about the launch of our open source scalable graph construction library for Hadoop, called Intel® Graph Builder for Apache Hadoop software (f.k.a. GraphBuilder), and explained how it makes it easy to construct large scale graphs for machine learning and data mining. These structures can yield insights from relationships hidden within a wide range of big data sources, from social media and business analytics to medicine and e-science. Today I’ll delve a bit more into Graph Builder technology and introduce the Intel® Active Tuner for Apache Hadoop software, an auto-tuner that uses Artificial Intelligence (AI) to configure Hadoop for optimal performance.  Both technologies will be available in the Intel Distribution.

                          So, Intel® Graph Builder leverages Hadoop MapReduce to turn large unstructured (or semi-structured) datasets into structured output in graph form.  This kind of graph may be mined using graph search of the sort that Facebook recently announced.  Many companies would like to construct such graphs out of unstructured datasets and Graph Builder makes it possible.  Beyond search, analysis may be applied to an entire graph to answer questions of the type shown in the figure below.  The analysis may be performed using distributed algorithms implemented in frameworks like GraphLab, which I also discussed in my previous post.

                          [Figure: examples of the questions that whole-graph analysis can answer]

                          Intel® Graph Builder performs extract, transform, and load operations, terms borrowed from databases and data warehousing.  And, it does so at Hadoop MapReduce scale.  Text is parsed and tokenized to extract interesting features.  These operations are described in a short map-reduce program written by the data scientist.  This program also defines when two vertices (i.e., features) in the graph are related by an edge.  The rule is applied repeatedly to form the graph’s topology (i.e., the pattern of edge relationships between vertices), which is stored via the library.  In addition, most applications require that additional tabulated information, or “network information,” be associated with each vertex/edge and the library provides a number of distributed algorithms for these tabulations.
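As a rough illustration of the extract-and-tokenize step (a single-machine stand-in under my own assumptions, not Graph Builder's actual API), the map phase can emit an edge for each token/filename pair and the reduce phase can aggregate duplicates into edge weights:

```python
from collections import defaultdict

def map_phase(filename, text):
    """Map step: tokenize the text and emit ((word, filename), 1) pairs,
    one per occurrence, defining an edge between feature and document."""
    for token in text.lower().split():
        yield (token, filename), 1

def reduce_phase(pairs):
    """Reduce step: sum counts per edge, yielding weighted edges."""
    edges = defaultdict(int)
    for key, count in pairs:
        edges[key] += count
    return edges

records = [("doc1", "big data big graphs"), ("doc2", "graphs connect data")]
pairs = (kv for name, text in records for kv in map_phase(name, text))
edges = reduce_phase(pairs)
print(edges[("big", "doc1")])  # 2
```

In a real MapReduce job the reduce phase runs per key across the cluster, but the shape of the computation is the same.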

                          At this point, we have a large-scale graph ready for HDFS, HBase, or another distributed store.  But we need to do a few more things to ensure that queries and computations on the graph will scale up nicely, like:

                          • Cleaning the graph’s structure and checking that it is reasonable
                          • Compressing the graph and network information to conserve cluster resources
                          • Partitioning the graph in a way that will minimize cluster communications while load balancing computational effort
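Of these steps, partitioning is the simplest to sketch. A naive approach (my toy example; the library ships more sophisticated distributed partitioners) hashes each edge's source vertex so that all out-edges of a vertex land on the same worker:

```python
import zlib

def partition_edges(edges, k):
    """Assign each edge to one of k partitions by hashing its source
    vertex, so every out-edge of a vertex lands on the same worker.
    zlib.crc32 is used because it is deterministic across runs."""
    parts = [[] for _ in range(k)]
    for u, v in edges:
        parts[zlib.crc32(u.encode()) % k].append((u, v))
    return parts

edges = [("a", "b"), ("a", "c"), ("b", "c")]
parts = partition_edges(edges, 2)
# Both of "a"'s out-edges share one partition.
```

This simple scheme ignores the power-law skew discussed earlier (a celebrity vertex overloads one worker), which is why production partitioners have to balance load as well as locality.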

                          The Intel Graph Builder library provides efficient distributed algorithms for all of the above, and more, so that data scientists can spend more of their time analyzing data and less of their time preparing it.  Enough said. The library will be included in the Intel Distribution shortly and we look forward to your feedback.  We are constantly on the hunt for new features as we look to the future of big data.

                          Whereas Intel® Graph Builder was developed to simplify the programming of emerging applications, Intel® Active Tuner was developed to simplify the deployment of today’s applications by automating the selection of configuration settings that will result in optimal cluster performance. In fact, we initially codenamed this technology “Gunther,” after a well-known circus elephant trainer, because of its ability to train Hadoop to run faster :-).  It’s cruelty-free to boot, I promise.  Anyway, many Hadoop configuration parameters need to be tuned for the characteristics of each particular application, such as web search, medical image analysis, audio feature analysis, fraud detection, semantic analysis, etc.  This tuning significantly reduces both job execution and query time but is time consuming and requires domain expertise. If you use Hadoop, you know that the common practice is to tune it up using rule-of-thumb settings published by industry leaders.  But these recommendations are too general and fail to capture the specific requirements of a given application and cluster resource constraints.  Enter the Active Tuner.

                          Intel® Active Tuner implements a search engine that uses a small number of representative jobs to identify the best configuration from among millions or billions of possible Hadoop configurations.  It uses a form of AI known as a genetic algorithm to search out the best settings for the number of maps, buffer sizes, compression settings, etc., constantly striving to derive better settings by combining those from pairs of trials that show the most promise (this is where the genetic part comes in) and deriving future trials from these new combinations.  And, the Active Tuner can do this faster and more effectively than a human can using the rules-of-thumb.  It can be controlled from a slick GUI in the new Intel Manager for Apache Hadoop, so take it for a test run when you pick up a copy of the Intel Distribution.  You may see your cluster performance improve by up to 30% without any hassle.
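The genetic search described above can be sketched in a few lines. Everything here is a toy stand-in: `run_job()` fakes the benchmark score, and the two-parameter "configuration" (map count, buffer size) is hypothetical, but the keep-the-fittest / crossover / mutate loop is the same idea the Active Tuner applies to real Hadoop settings.

```python
import random

random.seed(0)  # deterministic toy run

def run_job(cfg):
    """Stand-in benchmark: higher is better. Pretend the optimum is
    16 maps and a 256 MB sort buffer (a made-up fitness landscape)."""
    maps, buf = cfg
    return -(abs(maps - 16) + abs(buf - 256) / 16)

def mutate(cfg):
    maps, buf = cfg
    return (max(1, maps + random.randint(-2, 2)),
            max(16, buf + random.choice([-32, 0, 32])))

def crossover(a, b):
    return (a[0], b[1])  # maps from one parent, buffer from the other

# Random initial "configurations", then evolve for 30 generations.
pop = [(random.randint(1, 32), random.choice(range(64, 513, 32)))
       for _ in range(8)]
for _ in range(30):
    pop.sort(key=run_job, reverse=True)
    parents = pop[:4]  # keep the trials that show the most promise
    children = [mutate(crossover(random.choice(parents),
                                 random.choice(parents)))
                for _ in range(4)]
    pop = parents + children

best = max(pop, key=run_job)
```

A real tuner searches dozens of interacting parameters and times representative jobs instead of evaluating a closed-form score, but the selection/crossover/mutation cycle is the same.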

                          To wrap, these are one-of-a-kind technologies that I think you’ll have fun playing with.  And, despite offering quite a lot, Intel® Graph Builder and Intel® Active Tuner are just the beginning.  I am very excited by what’s coming next.  Intel is moving to unlock the power of Big Data and Intel Labs is preparing to blow it wide open.

                          *Other names and brands may be claimed as the property of others

                          Intel Unveils New Technologies for Efficient Cloud Datacenters [press release, Sept 4, 2013]

                          From New SoCs to Optical Fiber, Intel Delivers Cloud-Optimized Innovations Across Network, Storage, Microservers, and Rack Designs

                          NEWS HIGHLIGHTS

                          • The Intel® Atom™ C2000 processor family is the first based on Silvermont micro-architecture, has 13 customized configurations and is aimed at microservers, entry-level networking and cold storage.
                          • New 64-bit, system-on-chip family for the datacenter delivers up to six times the energy efficiency and up to seven times the performance compared to the previous generation.
                          • The first live demonstration of a Rack Scale Architecture-based system with high-speed Intel® Silicon Photonics components including a new MXC connector and ClearCurve* optical fiber developed in collaboration with Corning*, enabling data transfer speeds of up to 1.6 terabits per second at distances up to 300 meters for greater rack density.

                          SAN FRANCISCO, Calif., September 4, 2013 – Intel Corporation today introduced a portfolio of datacenter products and technologies for cloud service providers looking to drive greater efficiency and flexibility into their infrastructure to support a growing demand for new services and future innovation.

                          Server, network and storage infrastructure is evolving to better suit an increasingly diverse set of lightweight workloads, driving the emergence of microserver, cold storage and entry networking segments. By optimizing technologies for specific workloads, Intel will help cloud providers significantly increase utilization, drive down costs and provide compelling and consistent experiences to consumers and businesses.

                          The portfolio includes the second generation 64-bit Intel® Atom™ C2000 product family of system-on-chip (SoC) designs for microservers and cold storage platforms (code named “Avoton”) and for entry networking platforms (code named “Rangeley”). These new SoCs are the company’s first products based on the Silvermont micro-architecture, a new design built on its leading 22nm Tri-Gate SoC process that delivers significant increases in performance and energy efficiency, and they arrive only nine months after the previous generation.

                          “As the world becomes more and more mobile, the pressure to support billions of devices and users is changing the very composition of datacenters,” said Diane Bryant, senior vice president and general manager of the Datacenter and Connected Systems Group at Intel. “From leadership in silicon and SoC design to rack architecture and software enabling, Intel is providing the key innovations that original equipment manufacturers, telecommunications equipment makers and cloud service providers require to build the datacenters of the future.”

                          Intel also introduced the Intel® Ethernet Switch FM5224 silicon which, when combined with the Wind River Open Network Software suite, brings Software Defined Networking (SDN) solutions to servers for improved density and lower power.

                          Intel also demonstrated the first operational Intel Rack Scale Architecture (RSA)-based rack with Intel® Silicon Photonics Technology in combination with the disclosure of a new MXC connector and ClearCurve* optical fiber developed by Corning* with requirements from Intel. This demonstration highlights the speed with which Intel and the industry are moving from concept to functionality.

                          Customized, Optimized Intel® Atom™ SoCs for New and Existing Market Segments
                          Manufactured using Intel’s leading 22nm process technology, the new Intel Atom C2000 product family features up to eight cores, TDPs ranging from 6 to 20 watts, integrated Ethernet and support for up to 64 gigabytes (GB) of memory, eight times the previous generation. OVH* and 1&1, leading global web-hosting services companies, have tested Intel Atom C2000 SoCs and plan to deploy them in their entry-level dedicated hosting services next quarter. The 22 nanometer process technology delivers superior performance and performance per watt.

                          Intel is delivering 13 specific models with customized features and accelerators that are optimized for particular lightweight workloads such as entry dedicated hosting, distributed memory caching, static web serving and content delivery to ensure greater efficiency. The designs allow Intel to expand into new markets like cold storage and entry-level networking.

                          For example, the new Intel Atom configurations for entry networking address the specialized needs of securing and routing Internet traffic more efficiently. These configurations feature a set of hardware accelerators, called Intel® QuickAssist Technology, that improve cryptographic performance, making them ideally suited for routers and security appliances.

                          By consolidating three communications workloads – application, control and packet processing – on a common platform, providers now have tremendous flexibility. They will be able to meet the changing network demands while adding performance, reducing costs and improving time-to-market.

                          Ericsson, a world-leading provider of communications technology and services, announced that its blade-based switches used in the Ericsson Cloud System, a solution that enables service providers to add cloud capabilities to their existing networks, will soon include the Intel Atom C2000 SoC product family.

                          Microserver-Optimized Switch for Software Defined Networking
                          Network solutions that manage data traffic across microservers can significantly impact the performance and density of the system. The unique combination of the Intel Ethernet Switch FM5224 silicon and the Wind River Open Network Software suite will enable the industry’s first 2.5GbE, high-density, low-latency, SDN Ethernet switch solutions specifically developed for microservers. The solution enhances system-level innovation, and complements the integrated Intel Ethernet controller within the Intel Atom C2000 processor. Together, they can be used to create SDN solutions for the datacenter.

                          Switches using the new Intel Ethernet Switch FM5224 silicon can connect up to 64 microservers, providing up to 30 percent higher node density. They are based on the Intel Open Network Platform reference design announced earlier this year.

                          First Demonstration of Silicon Photonics-Powered Rack
                          Maximum datacenter efficiency requires innovation at the silicon, system and rack level. Intel’s RSA design helps industry partners to re-architect datacenters for modularity of components (storage, CPU, memory, network) at the rack level. It provides the ability to provision or logically compose resources based on application specific workload requirements. Intel RSA also will allow for the easier replacement and configuration of components when deploying cloud computing, storage and networking resources.

                          Intel today demonstrated the first operational RSA-based rack equipped with the newly announced Intel Atom C2000 processors, Intel® Xeon® processors, a top-of-rack Intel SDN-enabled switch and Intel Silicon Photonics Technology. As part of the demonstration, Intel also disclosed the new MXC connector and ClearCurve* fiber technology developed by Corning* with requirements from Intel. The fiber connections are specifically designed to work with Intel Silicon Photonics components.

                          The collaboration underscores the tremendous need for high-speed bandwidth within datacenters. By sending photons over a thin optical fiber instead of electrical signals over a copper cable, the new technologies are capable of transferring massive amounts of data at unprecedented speeds over greater distances. The transfers can be as fast as 1.6 terabits per second at lengths up to 300 meters throughout the datacenter.

                          To highlight the growing range of Intel RSA implementations, Microsoft and Intel announced a collaboration to innovate on Microsoft’s next-generation RSA rack design. The goal is to bring even better utilization, economics and flexibility to Microsoft’s datacenters.

                          The Intel Atom C2000 product family is shipping to customers now with more than 50 designs for microservers, cold storage and networking. The products are expected to be available in the coming months from vendors including Advantech*, Dell*, Ericsson*, HP*, NEC*, Newisys*, Penguin Computing*, Portwell*, Quanta*, Supermicro*, WiWynn*, ZNYX Networks*.

                          Intel Brings Supercomputing Horsepower to Big Data Analytics [press release, Nov 19, 2013]

                          NEWS HIGHLIGHTS

                          • Intel discloses form factors and memory configuration details of the CPU version of the next generation Intel® Xeon Phi™ processor (code named “Knights Landing”), to ease programmability for developers while improving performance.
                          • Intel® Xeon® processor-based systems power more than 82 percent of all supercomputers on the recently announced 42nd edition of the Top500 list.
                          • New Intel® HPC Distribution for Apache Hadoop* and Intel® Cloud Edition for Lustre* software tools bring the benefits of Big Data analytics and HPC together.
                          • Collaboration with HPC community designed to deliver customized products to meet the diverse needs of customers.

                          SUPERCOMPUTING CONFERENCE, Denver, Nov. 19, 2013 – Intel Corporation unveiled innovations in HPC and announced new software tools that will help propel businesses and researchers to generate greater insights from their data and solve their most vital business and scientific challenges.

                          “In the last decade, the high-performance computing community has created a vision of a parallel universe where the most vexing problems of society, industry, government and research are solved through modernized applications,” said Raj Hazra, Intel vice president and general manager of the Technical Computing Group. “Intel technology has helped HPC evolve from a technology reserved for an elite few to an essential and broadly available tool for discovery. The solutions we enable for ecosystem partners for the second half of this decade will drive the next level of insight from HPC. Innovations will include scale through standards, performance through application modernization, efficiency through integration and innovation through customized solutions.”

                          Accelerating Adoption and Innovation
                          From Intel® Parallel Computing Centers to Intel® Xeon Phi™ coprocessor developer kits, Intel provides a range of technologies and expertise to foster innovation and adoption in the HPC ecosystem. The company is collaborating with partners to take full advantage of technologies available today, as well as create the next generation of highly integrated solutions that are easier to program for and are more energy-efficient. As a part of this collaboration Intel also plans to deliver customized HPC products to meet the diverse needs of customers. This initiative is aimed at extending Intel’s continued value of standards-based scalable platforms to include optimizations that will accelerate the next wave of scientific, industrial, and academic breakthroughs.

                          During the Supercomputing Conference (SC’13), Intel unveiled how the next generation Intel Xeon Phi product (codenamed “Knights Landing”), available as a host processor, will fit into standard rack architectures and run applications entirely natively instead of requiring data to be offloaded to the coprocessor. This will significantly reduce programming complexity and eliminate “offloading” of the data, thus improving performance and decreasing latencies caused by memory, PCIe and networking.

                          Knights Landing will also offer developers three memory options to optimize performance. Unlike other Exascale concepts requiring programmers to develop code specific to one machine, new Intel Xeon Phi processors will provide the simplicity and elegance of standard memory programming models.

                          In addition, Intel and Fujitsu recently announced an initiative that could potentially replace a computer’s electrical wiring with fiber optic links to carry Ethernet or PCI Express traffic over an Intel® Silicon Photonics link. This enables Intel Xeon Phi coprocessors to be installed in an expansion box, separated from host Intel Xeon processors, but function as if they were still located on the motherboard. This allows for much higher density of installed coprocessors and scaling the computer capacity without affecting host server operations.

                          Several companies are already adopting Intel’s technology. For example, Fovia Medical*, a world leader in volume rendering technology, created high-definition, 3D models to help medical professionals better visualize a patient’s body without invasive surgery. A demonstration from the University of Oklahoma’s Center for Analysis and Prediction of Storms (CAPS) showed a 2D simulation of an F4 tornado, and addressed how a forecaster will be able to experience an immersive 3D simulation and “walk around a storm” to better pinpoint its path. Both applications use Intel® Xeon® technology.

                          High Performance Computing for Data-Driven Discovery
                          Data intensive applications including weather forecasting and seismic analysis have been part of the HPC industry from its earliest days, and the performance of today’s systems and parallel software tools have made it possible to create larger and more complex simulations. However, with unstructured data accounting for 80 percent of all data, and growing 15 times faster than other data1, the industry is looking to tap into all of this information to uncover valuable insight.

                          Intel is addressing this need with the announcement of the Intel® HPC Distribution for Apache Hadoop* software (Intel® HPC Distribution) that combines the Intel® Distribution for Apache Hadoop software with Intel® Enterprise Edition of Lustre* software to deliver an enterprise-grade solution for storing and processing large data sets. This powerful combination allows users to run their MapReduce applications, without change, directly on shared, fast Lustre-powered storage, making it fast, scalable and easy to manage.

                          The Intel® Cloud Edition for Lustre* software is a scalable, parallel file system that is available through the Amazon Web Services Marketplace* and allows users to pay-as-you go to maximize storage performance and cost effectiveness. The software is ideally suited for dynamic applications, including rapid simulation and prototyping. In the case of urgent or unplanned work that exceeds a user’s on-premise compute or storage performance, the software can be used for cloud bursting HPC workloads to quickly provision the infrastructure needed before moving the work into the cloud.

With numerous vendors announcing pre-configured and validated hardware and software solutions featuring the Intel Enterprise Edition for Lustre at SC’13, Intel and its ecosystem partners are bringing turnkey solutions to market to make big data processing and storage more broadly available, cost effective and easier to deploy. Partners announcing these appliances include Advanced HPC*, Aeon Computing*, ATIPA*, Boston Ltd.*, Colfax International*, E4 Computer Engineering*, NOVATTE* and System Fabric Works*.

                          Intel Tops Supercomputing Top 500 List
                          Intel’s HPC technologies are once again featured throughout the 42nd edition of the Top500 list, demonstrating how the company’s parallel architecture continues to be the standard building block for the world’s most powerful supercomputers. Intel-based systems account for more than 82 percent of all supercomputers on the list and 92 percent of all new additions. Within a year after the introduction of Intel’s first Many Core Architecture product, Intel Xeon Phi coprocessor-based systems already make up 18 percent of the aggregated performance of all Top500 supercomputers. The complete Top500 list is available at www.top500.org.


[1] From IDC Digital Universe 2020 (2013)

                          Software and workloads used in performance tests may have been optimized for performance only on Intel microprocessors. Performance tests, such as SYSmark and MobileMark, are measured using specific computer systems, components, software, operations and functions. Any change to any of those factors may cause the results to vary. You should consult other information and performance tests to assist you in fully evaluating your contemplated purchases, including the performance of that product when combined with other products.
                          Optimization Notice
Intel’s compilers may or may not optimize to the same degree for non-Intel microprocessors for optimizations that are not unique to Intel microprocessors. These optimizations include SSE2, SSE3, and SSSE3 instruction sets and other optimizations. Intel does not guarantee the availability, functionality, or effectiveness of any optimization on microprocessors not manufactured by Intel. Microprocessor-dependent optimizations in this product are intended for use with Intel microprocessors. Certain optimizations not specific to Intel microarchitecture are reserved for Intel microprocessors. Please refer to the applicable product User and Reference Guides for more information regarding the specific instruction sets covered by this notice.
                          Intel does not control or audit the design or implementation of third party benchmark data or Web sites referenced in this document. Intel encourages all of its customers to visit the referenced Web sites or others where similar performance benchmark data are reported and confirm whether the referenced benchmark data are accurate and reflect performance of systems available for purchase.

                          Fujitsu Lights up PCI Express with Intel Silicon Photonics [The Data Stack blog of Intel, Nov 5, 2013]

                          Victor Krutul is the Director of Marketing for the Silicon Photonics Operation at Intel.  He shares the vision and passion of Mario Paniccia that Silicon Photonics will one day revolutionize the way we build computers and the way computers talk to each other.  His other passions are tennis and motorcycles (but not at the same time)!

I am happy to report that Fujitsu announced at its annual Fujitsu Forum on November 5, 2013, that it has worked with Intel to build and demonstrate the world’s first Intel® Optical PCI Express (OPCIe) based server. This OPCIe server was enabled by Intel® Silicon Photonics technology. Fujitsu has done some good work here, recognizing that OPCIe-based servers offer several advantages over conventional ones. Rack servers, especially 1U and 2U servers, are space and power constrained. Sometimes OEMs and end users want to add capabilities such as more storage and CPUs to these servers, but are limited because there is simply not enough space for the components, or because packing too many components too close together increases the heat density and prevents the system from being able to cool them.

                          Fujitsu found a way to fix these limitations!

The solution to the power and space density problems is to locate the storage and compute components on a remote blade or tray in a way that makes them appear to the CPU to be on the main motherboard. The other way to do this is to have a pool of hard drives managed by a second server – but this approach requires messages to be sent between the two servers, and this adds latency – which is bad. It is possible to do this with copper cables; however, the distance copper cables can span is limited by electromagnetic interference (EMI). One could use amplifiers and signal conditioners, but these obviously add power and cost. Additionally, PCI Express cables can be heavy and bulky. I have one of these PCI Express Gen 3, 16-lane cables, and it feels like it weighs 20 lbs. Compare this to an MXC cable that carries 10x the bandwidth and weighs one to two pounds depending on length.

Fujitsu took two standard Primergy RX200 servers and added an Intel® Silicon Photonics module into each, along with an Intel-designed FPGA. The FPGA did the necessary signal conditioning to make PCI Express “optical friendly”. Using Intel® Silicon Photonics, they were able to send PCI Express protocol optically through an MXC connector to an expansion box (see picture below). In this expansion box were several solid-state disks (SSDs) and Xeon Phi co-processors – and of course a Silicon Photonics module, along with the FPGA to make PCI Express optical friendly. The beauty of this approach was that the SSDs and Xeon Phis appeared to the RX200 server as if they were on the motherboard. With photons traveling at 186,000 miles per second, the extra latency of traveling down a few meters of cable cannot reliably be measured (it can be calculated to be ~5 ns per meter, or 5 billionths of a second per meter of cable).

So what are the benefits of this approach? Basically there are four. First, Fujitsu was able to increase the storage capacity of the server, because it could now utilize the additional disk drives in the expansion box; the number of drives is determined only by the physical size of the box. Second, it was able to increase the effective CPU capacity of the Xeon E5s in the RX200 server, because the Xeon E5s could now utilize the CPU capacity of the Xeon Phi co-processors – in a standard 1U rack it would be hard, if not impossible, to incorporate Xeon Phis. The third benefit is cooling: putting the SSDs in an expansion box allows one to burn more power, because the cooling is divided between the fans in the 1U rack and those in the expansion box. The fourth benefit concerns what is called cooling density, or how much heat needs to be cooled per cubic centimeter. Let me make up an example. For simplicity’s sake, let’s say the volume of a 1U rack is 1 cubic meter, and that there are three fans cooling that rack, each able to cool 333 watts, for a total capacity of roughly 1000 watts of cooling. If I space components evenly in the rack, each fan does its share and I can cool the full 1000 watts. Now assume I place all the components so that just one fan is cooling them, because there is no room in front of the other two fans: if those components dissipate more than 333 watts, they can’t be cooled. That’s cooling density. The Fujitsu approach solves the SSD expansion problem, the CPU expansion problem, and both the total cooling and cooling density problems.
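The made-up cooling-density figures above can be worked through in a few lines (these are the post’s illustrative numbers, not real server specifications):

```python
# Illustrative cooling-density arithmetic from the Fujitsu example.
# All numbers are the post's made-up figures, not real server specs.

RACK_FANS = 3
WATTS_PER_FAN = 333  # each fan can cool ~333 W

# If heat sources are spread evenly across the rack, every fan contributes.
total_cooling_w = RACK_FANS * WATTS_PER_FAN  # ~1000 W

# If all components crowd in front of a single fan, only that fan's
# capacity is usable, regardless of the rack's total cooling budget.
usable_when_crowded_w = WATTS_PER_FAN

print(f"evenly spread: ~{total_cooling_w} W of cooling available")
print(f"crowded at one fan: only {usable_when_crowded_w} W usable")
```

Moving the SSDs and Xeon Phis into the expansion box is, in effect, a way of keeping the heat sources spread across more fans.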

                          image

Go to https://www-ssl.intel.com/content/dam/www/public/us/en/images/research/pci-express-and-mxc-2.jpg if you want to see the PCI Express copper cable vs. the MXC optical cable (you will also see we had a little fun with the whole optical-vs-copper thing).

                          Besides Intel® Silicon Photonics the Fujitsu demo also included Xeon E5 microprocessors and Xeon Phi co-processors.

                          Why does Intel want to put lasers in and around computers?

Photonic signaling (a.k.a. fiber optics) has two fundamental advantages over copper signaling. First, when electric signals go down a wire or PCB trace they emit electromagnetic interference (EMI), and when the EMI from one wire or trace couples into an adjacent one, it causes noise, which limits the bandwidth-distance product. For example, 10G Ethernet copper cables have a practical limit of 10 meters. Yes, you can put amplifiers or signal conditioners on the cables and make an “active copper cable”, but these add power and cost. Active copper cables are made for 10G Ethernet, and they have a practical limit of 20 meters.

Photons don’t emit EMI the way electrons do, so fiber-based cables can span much longer distances. For example, with the lower-cost lasers used in data centers today, at 10G you can build 500-meter cables. You can go as far as 80 km if you use a more expensive laser, but those are needed only a fraction of the time in the data center (usually when you are connecting the data center to the outside world).
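For a rough sense of what this extra reach costs in latency, here is a small sketch; the refractive index of ~1.5 for silica fiber is an assumption, chosen because it reproduces the ~5 ns/meter figure cited earlier in the post:

```python
# Rough one-way propagation delay in optical fiber.
# N_FIBER ~ 1.5 (silica) is an assumption; it yields the ~5 ns/m
# figure quoted in the post.

C = 3.0e8      # speed of light in vacuum, m/s
N_FIBER = 1.5  # approximate refractive index of silica fiber

def fiber_delay_ns(length_m: float) -> float:
    """One-way propagation delay over length_m meters of fiber, in ns."""
    return length_m * N_FIBER / C * 1e9

print(fiber_delay_ns(1.0))    # ~5 ns per meter
print(fiber_delay_ns(500.0))  # ~2500 ns across a 500 m data-center link
```

Even a 500-meter run adds only a couple of microseconds, which is why relocating components a few meters away over fiber is effectively invisible to the CPU.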

The other benefit of optical communication is lighter cables. Optical fibers are thin – typically about 120 microns in diameter – and light. I have heard of situations where large data centers had to reinforce their raised floors because, with all the copper cable, the floor-loading limits would have been exceeded.

                          So how come optical communications is not used more in the data center today? The answer is cost!

Optical devices made for data centers are expensive. They are made out of expensive, exotic materials like lithium niobate or gallium arsenide – difficult to pronounce, even more difficult to manufacture. The state of the art for these exotic materials is 3-inch wafers with very low yields. Manufacturing these optical devices is expensive: they are built inside gold-lined cans, and sometimes manual assembly is required as technicians “light up” the lasers and align them to the thin fibers. A special index-matching epoxy is used that can sometimes cost as much per ounce as gold. The bottom line is that while optical communications can go farther and uses lighter fiber cables, it costs a lot more.

Enter silicon photonics! Silicon photonics is the science of making photonic devices out of silicon in a CMOS fab. (We use the word “photonics” rather than “optical” because “optical” is also used when describing eyeglasses or telescopes.) Silicon is the second most common element in the Earth’s crust, so it’s not expensive. Intel has 40+ years of CMOS manufacturing experience and has worked over those 40 years to drive costs down and manufacturing speed up. In fact, Intel currently has over $65 billion of capital investment in CMOS fabs around the world. In short, the vision of Intel® Silicon Photonics is to combine the natural advantages of optical communications with the low-cost advantage of making devices out of silicon in a CMOS fab.

Intel has been working on Intel® Silicon Photonics (SiPh) for over ten years and has begun productizing SiPh. Earlier this year, at the OCP Summit, Intel announced that we have begun the long process of building up our manufacturing capabilities for silicon photonics. We also announced that we had sampled early parts to customers.

People often ask me when we will ship our products and how much they will cost. They also ask me for all sorts of technical details about our SiPh modules. I tell them that Intel is focusing on a full line of solutions – not a single component technology. What our customers want are complete silicon-photonics-based solutions that will make computing easier, faster or less costly. Let me cite our record of delivering end-to-end solutions:

                          Summary of Intel Solution Announcements

January 2013: We did a joint announcement with Facebook at the Open Compute Project (OCP) meeting that we had worked together to design a disaggregated rack architecture (since renamed RSA [Rack Scale Architecture]). This architecture used Intel® Silicon Photonics and allowed the storage and networking to be disaggregated, or moved away from the CPU motherboard. The benefit is that users can now choose which components they want to upgrade and are not forced to upgrade everything at the same time.

April 2013: At the Intel Developer Forum we gave the first-ever public demonstration of Intel® Silicon Photonics at 100G.

                          September 2013: We demonstrated a live working Rack Scale Architecture solution using Intel® Silicon Photonics links carrying Ethernet protocol.

September 2013: Joint announcement with Corning of a new MXC and ClearCurve fiber solution capable of transmission over 300 m with Intel® Silicon Photonics at 25G. This reinforced our strategy of delivering a complete solution, including cables and connectors that are optimized for Intel® Silicon Photonics.

September 2013: Updated demonstration of a solution using Silicon Photonics to send data at 25G over more than 800 meters of multimode fiber – a new world record.

                          Today: Intel has extended its Silicon Photonics solution leadership with a joint announcement with Fujitsu demonstrating the world’s first Intel® Silicon Photonics link carrying PCI Express protocol.

                          I hope you will agree with me that Intel is focusing on more than just CPUs or optical modules and will deliver a complete Silicon Photonics solution!