
Category Archives: Enterprise computing

Windows Embedded is now an enterprise business, like the Windows business as a whole, with the Handheld and Compact versions also positioned to lead in the overall Internet of Things market

OR Windows Embedded: Recommitting to x86 across all of the edge devices of the future intelligent systems of enterprise customers and consumers, while pushing ARM along its current positions in mobile and real-time, which essentially corresponds to the Windows 7 licensing and pricing described by this source
OR Windows Embedded enterprise solutions strategy based on creating actionable operation intelligence extended to edge devices in retail and hospitality, healthcare, manufacturing, and automotive industries
OR Capitalizing on the Internet of Things [WindowsEmbedded YouTube channel, March 20, 2013] and Transforming Business 
OR Building Edge Devices & Intelligent Solutions [WindowsEmbedded YouTube channel, March 20, 2013]
OR (as stemming from The future of Windows Embedded: from standalone devices to intelligent systems [this same ‘Experiencing the Cloud’ blog, March 9-29, 2012]; note, however, that ARM architecture support was delivered only in the Handheld and Compact versions, despite the original hint included in that post)
An intelligent system built on Windows Embedded—with the expertise of the extensive community of established Windows Embedded partners—extends the power of Windows and Microsoft technologies to edge devices. Our portfolio of products powers solutions that meet unique industry needs and span enterprises of any size and complexity.
coinciding with:
1.  Microsoft betting on boosting Windows RT demand with top level ARM SoCs from its SoC partners, Windows 8.1 enhancements, Outlook addition to the Office 2013 RT and very deep tactical discounts to its OEM partners for tablet offerings of more value and capability [‘Experiencing the Cloud’, June 6, 2013] and
2. “Cloud first” from Microsoft is ready to change enterprise computing in all of its facets [‘Experiencing the Cloud’, June 4, 2013], as well as
3. Visual Studio 2012 Update 2 is here [The Visual Studio Blog, April 4, 2013], which according to ANNOUNCING VISUAL STUDIO 2012 UPDATE 2 CTP 2 [BlendInsider, Jan 30, 2013] provides the utmost developer productivity, finally achieving uniformity in XAML-based embedded user experience design as well, with one version of Blend for everything (highlighted inserts are mine):
image

as Windows Azure provides a leading cloud application platform for all of that (download): image
and an excellent testimony to that is given in Discovering Intelligent Systems at work in Manufacturing [Windows Embedded Blog, Nov 27, 2012], from which it is important to include the basic story (just substitute “‘intelligent’ screwdriver” with “any enterprise or consumer device enhanced with ‘intelligence’”, “larger network in the factory” with “classic and mobile Internet”, and the “backend” with “Windows Azure” to understand the enormous potential becoming available to Microsoft in the Internet of Things market):

Hey everyone, recently our Windows Embedded team was on a customer site visit in Europe, and we came across a fantastic example of Intelligent Systems in action. While we were touring an automobile manufacturing plant, we observed the line using electric screwdrivers like the one pictured below. They had two cables running into them. Power and Ethernet. We asked the tour director about the network cable, and they explained that the screwdriver was actually an ‘intelligent’ screwdriver.
We smiled at the thought of this basic piece of hardware actually being able to think about what it was doing. Then he explained it and we were amazed. The screwdriver was hung off a manufacturing line Windows Embedded Compact PC that was connected to a larger network in the factory. The backend provided the screwdriver engineering specs about the screw going into that location on the car, including the required torque and even the number of revolutions that Class 1 screw should take to achieve the desired torque. So, when the technician popped the screw into the chassis, all they had to do was fire the trigger, and everything was automatic. They even had some scenarios where this was done using robotic arms instead of people.
When the screw was installed in the car, a data point was generated that came back down the network cable and registered in the factory database. Basically, an ‘OK’, or ‘NOT OK’ was registered, and in the case of either the torque being missed, or that torque being achieved in an unexpected number of revolutions, a flag was popped to investigate further. In summary, the car would not get off the production line if the quality bar wasn’t met.
We have learned since this visit that a number of our partners, and several large automotive manufacturers have deployed this technology in their factories both here in North America and in Europe.
The volume of parts going into just one car is massive, a true big data story, and the business doesn’t necessarily want to know about the hundreds of thousands of screws installed in their factory. What they do want to know is when a threshold like an engineering spec is missed. This type of approach enables business critical data to be presented, relevant, and not washed out in the volumes of activities/events happening minute by minute on the factory floor. …
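The quality-check logic in this story can be sketched in a few lines (a minimal, hypothetical illustration; the spec fields, tolerances and values below are all invented, not the manufacturer's actual implementation):

```python
# Hypothetical sketch of the 'intelligent' screwdriver check described above:
# the backend supplies an engineering spec per screw location, the line PC
# compares the measured result against it, and only violations are flagged.

def check_screw(spec, measured):
    """Return ('OK' | 'NOT OK', reason) for one installed screw."""
    torque_ok = abs(measured["torque"] - spec["torque"]) <= spec["torque_tolerance"]
    revs_ok = abs(measured["revolutions"] - spec["revolutions"]) <= spec["rev_tolerance"]
    if not torque_ok:
        return ("NOT OK", "required torque missed")
    if not revs_ok:
        return ("NOT OK", "torque reached in an unexpected number of revolutions")
    return ("OK", "")

# Example spec for one Class 1 screw location (all numbers invented):
spec = {"torque": 12.0, "torque_tolerance": 0.5, "revolutions": 9, "rev_tolerance": 1}

print(check_screw(spec, {"torque": 12.2, "revolutions": 9}))   # within spec
print(check_screw(spec, {"torque": 12.1, "revolutions": 14}))  # suspicious revolutions
```

The point of the design, as the story explains, is that only the threshold violations travel up for human attention, not every one of the hundreds of thousands of screws.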

In the IDC iView, sponsored by Microsoft, The Rise of Intelligent Systems: Connecting Enterprises and Smart Devices in Seamless Networks [April 18, 2012], you can find the following market forecast:

embedded and intelligent systems represent a much larger opportunity than the PC, tablet, or even the smartphone market. IDC estimates that the intelligent systems market will grow from 19% of all major electronic system unit shipments in 2010, to more than 27% of all systems by 2016. Revenue for the intelligent systems market will grow from more than $649 billion in 2012, to more than $1.4 trillion in 2016 (PCs and smartphones excluded from market-size numbers).
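As a quick sanity check, the revenue figures quoted above imply a compound annual growth rate of roughly 21% (a back-of-the-envelope calculation using only the numbers in the quote):

```python
# Implied compound annual growth rate (CAGR) from the IDC revenue forecast:
# more than $649 billion in 2012 growing to more than $1.4 trillion in 2016.
def cagr(start, end, years):
    return (end / start) ** (1 / years) - 1

revenue_cagr = cagr(649e9, 1.4e12, 2016 - 2012)
print(f"implied revenue CAGR: {revenue_cagr:.1%}")  # about 21% per year
```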

On the market for more than five years, with more than 5 million cars sold already, and in joint development since 2005, Ford SYNC, based on Microsoft embedded technology, is the best showcase of both the market potential and the level of achievement possible for Microsoft in this post-PC market:

The Ford Focus now comes with the optional Ford Sync system, which provides voice control and smartphone app integration. Jason Johnson, of Ford, takes us through a detailed demonstration of the system, showing off its ability to recognise navigation and phone book commands, as well as its wireless hot spot feature.

Soon we will have further advancements: Ford to Show the Smarter Way to Get There at Computex 2013 [press release, May 23, 2013], which you can follow on a special Ford Motor Company – Computex FB site that currently contains teaser videos about Future Technology Trends, Open Innovation and Device Interaction, featuring Microsoft as well (note that those things are quite necessary as competition is getting stronger)

  • Ford will be the only automaker at Computex 2013, the largest computer exhibition in Asia
  • Ford will make several major announcements on its smart technologies for both Taiwan and Asia Pacific and Africa markets
  • Ford to showcase its most advanced class-leading technologies, designed to take the driving experience for customers to a new level
Inserted later: Ford Press Conference Highlights at Computex 2013  [FordAPA YouTube channel, June 6, 2013]
Ford Motor Company today announced it will bring its Ford Developer Program to markets in Asia Pacific and Africa to allow developers to create voice-activated apps for the car, further reinforcing its position as a global leader in technology.
    TAIPEI, Taiwan, May 23, 2013 – Ford Motor Company will show the Smarter Way to Get There at Computex 2013 with the most advanced class-leading technologies to further enhance the driving experience for customers.
    Ford will be the only automaker at Asia’s biggest computer exhibition from June 4-8, 2013, where it will make several major announcements for both Taiwan and Asia Pacific.
    “As one of the world’s largest and most influential technology shows, Computex is the ideal platform for Ford to showcase how our smart technologies are improving the driving experience for our customers in the digital age,” said John Lawler, chairman and CEO, Ford Motor China.
    “In a world where consumers want to be connected all the time, be it at home, in the office or in their cars, we have a great opportunity to provide driver-connect technologies in our vehicles which enable drivers to stay connected through voice commands while keeping their hands on the wheel and eyes on the road. The technologies we are bringing to our vehicles not only give our customers a connected driving experience, they also make that experience simple, safe and personalized.”
    The Ford stand at Computex will not only feature the company’s most advanced technology developments, but also the all-new Ford Kuga, dubbed by Ford as the “Smarter SUV” because of its fuel economy, versatility and new technology that makes driving easier and more fun.
    Inserted later: Ford at Computex 2013 – Panel Discussion
    [FordAPA YouTube channel, June 5, 2013]
    On June 4, Ford will be part of the Computex Smart Living Industry Forum at which Edward Pleet, Connected Services Director, Ford Motor Company, Asia Pacific and Africa, and Europe will discuss The Smart Living, Networked Society.
    The Ford booth will be located at Taipei World Trade Center Nangang Exhibition Hall, 4th Floor, booth number M2005.

    image

    OR Windows Embedded: Recommitting to x86 across all of the edge devices of the future intelligent systems of enterprise customers and consumers, while pushing ARM along its current positions in mobile and real-time, which essentially corresponds to the Windows 7 licensing and pricing described by this source (here only WIN7 COMPACT (CE) has ARM support as well): image
    (Click here or on the above image to see the full table. Note also that the true enterprise licensing via the even cheaper SELECT and EA (Enterprise Agreement) programs is not shown in the table; for explanations see also WES 7 “E” & “P”, WES SKU Differences, FES, FES 7 Pro, FES 7 Ultimate, WES vs FES, FES Pro & Ultimate SKU Differences, Win7 Compact (CE), Win7 Compact (CE) SKU Differences, Win7 Compact (CE) OS Components and “SKU rationale” from Microsoft.) On this table I overlaid the corresponding Windows Embedded 8 products and their SKUs, both already known (like General Embedded / NR / Entry for Windows Embedded Compact 2013, to be generally available on June 13) and supposed (like Standard ?…? / Standard ?Enterprise? for Windows Embedded 8 Standard).

    Note that the above table could be misleading, since it represents only low-volume purchases, while Microsoft also uses License Packs in which the per-unit price decreases non-linearly with the number of licenses in the Pack. Fortunately, I’ve found current trade data records for WINDOWS EMBEDDED STANDARD 8 EMB ESD OEI RUNTIME -7WT-00094(N-77P-3153) [April 9, 2013] and WIN PRO EMBEDDED 8 EMB ESD OEI -42C-00051(N-77P-3154) [April 9, 2013] from Taiwan to India, which I could use as Model 1 and Model 2 for the supposed pricing of Windows Embedded 8 Standard; see the results on the above right. The real curve may well be less steep (e.g. the declared values may have been decreased further for the larger License Packs, which represent higher absolute value, in order to reduce the absolute tax even more), so this gives only a rough idea for License Packs.
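The non-linear License Pack discounting can be illustrated with a toy model (the base price and discount exponent below are invented for illustration only; they are not the Model 1 / Model 2 figures derived from the trade records):

```python
# Toy model of License Pack pricing: the per-unit price falls non-linearly
# as the pack size grows. Base price and exponent are invented numbers.
def per_unit_price(base_price, pack_size, exponent=0.15):
    return base_price * pack_size ** -exponent

for size in (1, 10, 100, 1000):
    print(f"{size:>5}-license pack: ${per_unit_price(90.0, size):6.2f} per unit")
```

Any such power-law curve reproduces the qualitative behavior described above: each order of magnitude in pack size cuts the per-unit price by a roughly constant factor.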

    It is also important to include here the answer to the question Isn’t a Linux or Android solution cheaper? [one of the FAQs answered by Avnet Embedded, May 1, 2013]:

    Linux or Android solutions may seem cheaper initially. However, the Total Cost of Ownership (TCO) should be taken into account as a useful metric for assessing the overall cost impact of your investment. For example:

    • Acquisition costs—Inexpensive comparable products can cost as much as or more than Windows to acquire and support.
    • Total costs—Acquisition costs are a very small component of TCO. Even when the costs of different operating systems are comparable, research shows that Windows often offers a lower TCO because of cost advantages in the other, larger components, such as staffing and downtime.
    • Cost vs. Value—In addition to what you must pay for, if you are making an investment in IT you should also consider what you will get in return; including features or capabilities that improve productivity and deliver additional value.   

    To find out more about the TCO of Windows Embedded, read ‘The Total Cost of Ownership (TCO) benefits of Windows Embedded software’ ebook.
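The TCO argument can be made concrete with a toy comparison (every figure below is invented purely to show the structure of the calculation; acquisition cost is only one component next to the recurring ones):

```python
# Toy five-year TCO: acquisition plus recurring staffing, downtime and
# support costs. All dollar figures are invented for illustration.
def tco(acquisition, annual_staffing, annual_downtime, annual_support, years=5):
    return acquisition + years * (annual_staffing + annual_downtime + annual_support)

option_a = tco(acquisition=90, annual_staffing=40, annual_downtime=10, annual_support=15)
option_b = tco(acquisition=0,  annual_staffing=55, annual_downtime=15, annual_support=20)
print(f"Option A (paid OS) 5-year TCO: ${option_a}")
print(f"Option B (free OS) 5-year TCO: ${option_b}")
```

With these invented inputs the free-to-acquire option ends up more expensive over five years, which is exactly the shape of the argument Avnet makes: whether it holds in a given deployment depends entirely on the recurring-cost figures.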

    If the runtime license still looks too expensive, then it is important to consider that with the x86-based Windows Embedded 8 we are talking about very special types of devices. Here is how Microsoft represents that x86-only focus on top of the “edge devices of the future intelligent systems of enterprise customers”: image
    This even has a very strong industry focus: Retail (from kiosks to ATMs), Manufacturing and Health Care. So we can proceed to the other post titles, which are equally important to properly represent the redefined Windows Embedded positioning:

    OR Windows Embedded enterprise solutions strategy based on creating actionable operation intelligence extended to edge devices in retail and hospitality, healthcare, manufacturing, and automotive industries

    OR Capitalizing on the Internet of Things [WindowsEmbedded YouTube channel, March 20, 2013] and watch also: Transforming Business

    The Internet of Things is prompting businesses to re-think how they use their digital assets. Kevin Dallas, GM of Windows Embedded at Microsoft, tells GigaOM Research’s Adam Lesser how companies can build intelligent systems to take advantage of the data their devices are already generating, for better business intelligence.

    OR Building Edge Devices & Intelligent Solutions [WindowsEmbedded YouTube channel, March 20, 2013]

    To be a part of the Internet of Things, businesses need the right kinds of devices. Kevin Dallas, GM of Windows Embedded at Microsoft, tells GigaOM Research’s Adam Lesser what OEM/ODMs should think about as they help their customers build intelligent systems to take advantage of the data their devices are already generating.

    Other videos in the “Building Edge Devices & Intelligent Solutions” series:
    Dell Wyse, HP, Omnicell and ParTech, Inc. I will even embed here Bravo Outdoor Advertising Reaches Greater Heights With Intelligent System [WindowsEmbedded YouTube channel, Feb 11, 2013], as it shows very well how the range of edge devices could be hugely extended over the years (here with digital signage on public transport in Ireland):

    Adrian O’Farrell, former marketing director for Bravo Outdoor Advertising, describes the many benefits — flexibility, customization and cost — of digital signage as opposed to traditional advertising on the Dublin bus system.

    OR
    (as stemming from The future of Windows Embedded: from standalone devices to intelligent systems [this same ‘Experiencing the Cloud’ blog, March 9-29, 2012]; note, however, that ARM architecture support was delivered only in the Handheld and Compact versions, despite the original hint included in that post)

    An intelligent system built on Windows Embedded—with the expertise of the extensive community of established Windows Embedded partners—extends the power of Windows and Microsoft technologies to edge devices. Our portfolio of products powers solutions that meet unique industry needs and span enterprises of any size and complexity.

    From the Learn more about intelligent systems subpage linked on the Microsoft > Windows Embedded > Products and Solutions page [May 6, 2013], which also contains:

    Unlock intelligence with the full breadth of Microsoft technologies
    What happens when devices at the edge of enterprise networks are connected to software and services in the back end or the cloud? Suddenly, a rich new source of information is available. The data has always been there—but today, an integrated stack of Microsoft technologies, extending from the server room to the customer’s fingertips, can help evolve business intelligence to operational intelligence by enabling enterprises to identify and act on opportunities that would otherwise be out of reach. For OEMs, the ability to harness the power of Microsoft technologies to capitalize on data gathered from edge devices translates to new and expanded potential for creating solutions for customers.

    [The Big Shift From Software to Cloud Services video of Nov 13, 2012 from WindowsEmbedded YouTube channel is quite important to embed here, since it clearly shows that Microsoft is shifting from being a software company to a hardware & services company:]

    Windows Azure Marketing General Manager Eron Kelly discusses Microsoft Corp.’s focus on delivering software through the cloud and the opportunity it creates for devices and intelligent systems.
    One Microsoft, everything you need
    When connecting industry devices powered by Windows Embedded to back-end systems running SQL Server on-premise—or secured by Azure in the cloud–business data is without boundaries. Those building intelligent system solutions will shorten development time, and simplify implementation and management by harnessing the full breadth of Microsoft technologies, from the rich, familiar experience of Windows, to simplified management with System Center and security with Forefront. Device manufacturers, evaluate your intelligent systems business capabilities with Microsoft.
    Devices at the network edge: critical infrastructure for intelligent systems
    Intelligent systems are revolutionizing business, and Microsoft is focused on driving innovation in a number of industries, including retail and hospitality, healthcare, manufacturing and automotive. Whether streamlining inventory management with industry handheld devices, securely handling medical records using a thin-client solution, reinventing the customer experience with point-of-sale devices, transforming factory efficiencies with embedded robots, or reimagining the driving experience with an in-car infotainment system, edge devices are all around us. Powering these devices with Windows Embedded harnesses Microsoft technologies to create customized solutions that address specific industry needs and drive innovation—and profits—forward.
    According to IDC, unit shipments of IP-connected embedded systems, excluding mobile phones and PCs, will more than double by 2015, growing from approximately 1.4 billion in 2010 to over 3.3 billion.

    Source: IDC, “Smart Tech Market Forecast and 2020 Vision.”

    Specialized devices in the marketplace

    Select an industry [with a latest video of May 6, 2013 embedded here for each from WindowsEmbedded YouTube channel, in order to let you see how Microsoft and Windows Embedded are providing the technology, strategic leadership and partner ecosystem that are driving innovation]

    As far as Windows Embedded 8 is concerned, we have a pretty clear picture now:
    Windows Embedded 8 [Microsoft > Windows Embedded > Products and Solutions > Windows Embedded Products page, May 6, 2013]

    From this page the basic offerings (based on Windows 8) are the following ones:

    The Windows Embedded 8 family of platforms and tools helps companies extend their operational intelligence [by harnessing the flow of data across industry devices on the edge and back-end systems], using their existing IT infrastructure and industry devices that securely exchange data with back-end systems. Offering the same rich, multi-touch experience as Windows 8 and Windows Phone 8, Windows Embedded 8 delivers compelling user experiences on a range of industry devices.

    Windows Embedded 8 Pro

    The power and flexibility of Windows 8 in a platform designed specifically for building edge devices [digital signs or point-of-service terminals in a store environment, handheld devices, robots on the manufacturing floor, or thin client devices in hospitals to transform business intelligence to actionable operation intelligence] and intelligent systems solutions [such as kiosks, medical devices, digital signage and HMI (human machine interface)].

    • Deliver a user experience that’s identical to Windows 8.
    • Design custom apps that feature the fast, fluid behavior of Windows 8.
    • Security features such as Bitlocker and Trusted Boot.
    • Compatible with line-of-business and productivity apps.

    Learn more

    [The Next Generation Digital Signage on Display at Computex 2012 video of June 25, 2012 from WindowsEmbedded YouTube channel is quite important to embed here, since it clearly discusses the direction for digital signage systems where full Windows compatibility is essential:]

    Windows Embedded’s John Boladian and Intel’s Gark Tan show off today’s interactive digital signage that create an engaging and connected experience for customers through combined technologies from Kinect for Windows, Windows Embedded and Intel’s Core processors.

    Windows Embedded 8 Standard

    Offers flexibility for purpose-built devices, such as thin clients, kiosks [digital signage] and automated manufacturing solutions.

    • Compelling UI, powerful app support, security and manageability of Windows 8.
    • Modular format allows you to use only the components needed.
    • Ensure consistent configuration with embedded specific lockdown features.
    • Custom branding feature.

    Learn more 

    [The Demo: Windows 8 on Embedded Devices video of Nov 13, 2012 from WindowsEmbedded YouTube channel is quite important to embed here, since it clearly shows the best current example of a purpose-built ruggedized device (from long-time partner Motion Computing) based on Windows 8, which is a kind of prototype of similar “custom branded” devices based on Windows Embedded 8 Standard:]

    Embedded Group Manager John Coyne shows off an industry application on a PC running Windows Embedded 8.

    Windows Embedded 8 Industry

    A consistent, streamlined application platform that shortens development cycles for specific industry device scenarios in retail, manufacturing and other industries [such as POS terminals, ATMs, automated manufacturing solutions and medical devices].

    • Compelling UI, powerful app support, security and manageability of Windows 8.
    • Ensure consistent configuration with lockdown features.
    • Fixed platform provides a consistent development experience.
    • Plug and play peripheral capabilities with POS for .NET.

    Learn more

    [The Intelligent Systems Making Vending Machines Fun at Computex 2012 video of June 25, 2012 from WindowsEmbedded YouTube channel is quite important to embed here, since it demonstrates an interactive smart vending machine where retail peripheral support is essential:]

    Windows Embedded’s John Boladian and Intel’s Gark Tan discuss the value and growth of intelligent systems across devices and the cloud. By highlighting an interactive smart vending machine, they show that intelligent systems not only make the purchase experience fun, but give the vendor a competitive advantage through increased connectivity, data collection, manageability and business analytics.

    [Read also: Windows Embedded 8 Industry: A Modern OS for Industry Devices [Windows Embedded blog, April 2, 2013] “On the heels of our recent release of the Windows Embedded 8 platform, we’re making another member of the Windows Embedded family available today — Windows Embedded 8 Industry. David Wurster, Microsoft Windows Embedded’s senior product manager, has details on how Windows Embedded has evolved beyond point-of-service (POS) systems in retail to do much more in the Windows 8 era.”]

    Compare Windows Embedded 8 products

    Features | Pro | Standard | Industry
    Rich multitouch, multi-user interface | ✔ | ✔ | ✔
    Connectivity features, including connected standby, mobile broadband and WiFi | ✔ | ✔ | ✔
    Powerful security features, including anti-malware support, BitLocker and Trusted Boot | ✔ | ✔ | ✔
    Lockdown support, including unified write filter, gesture and keyboard filters | – | ✔ | ✔
    Retail peripheral support | – | – | ✔
    Custom branding | – | ✔ | –
    Full Windows compatibility | ✔ | – | –
    Fixed image | ✔ | – | ✔
    Easy end-to-end device management with Microsoft System Center | ✔ | ✔ | ✔
    Modular OS | – | ✔ | –

    In addition, the following complementary offerings, which are not based on Windows 8, are shown on the same page as well:

    Windows Embedded 8 Handheld

    Built on Windows Phone 8 to offer intuitive line-of-business applications [such as package delivery, mobile point-of-service, communication and collaboration, and scanning and data capture], with proven integration and security for industry handheld devices.

    • Common application programming interfaces so that devices easily integrate.
    • Manage devices across the network through the use of Windows Intune and SCCM 2012.
    • Benefit from a large selection of Windows Phone 8 apps.
    • Use the Windows Phone 8 SDK and Visual Studio 2012 to create custom apps.

    Learn more

    [Read also: Windows Embedded 8 Handheld joins the Windows Embedded 8 family [Windows Embedded blog, Jan 14, 2013] “Windows Embedded 8 Handheld is more than just the successor to Windows Embedded Handheld 6.5. It’s a complete re-imagination of the enterprise mobile device. With Windows Embedded 8 Handheld, the platform is now based on the Windows Phone 8, which itself is built on Windows 8. In addition to the highly-praised Windows Phone 8 user interface, both Windows Phone 8 and Windows Embedded 8 Handheld now share a common kernel with Windows.”]

    Windows Server 2012 for Embedded Systems

    Binary identical to Windows Server, a proven, highly reliable operating system for embedded applications in server appliances [such as in telecommunications, medical imaging, industrial automation and corporate headquarters].

    • Enable informed, real-time decisions that keep your enterprise ahead of the competition.
    • New storage features optimize the reliability and efficiency of data stores and scale to meet demand and reduce costs.
    • Equip employees with insightful analysis and reporting services.

    Learn more

    Microsoft SQL Server 2012 for Embedded Systems

    A database management tool, binary identical to Microsoft SQL Server, for use with purpose-built hardware running the Windows Embedded Server operating system [such as in telecommunications, medical imaging, industrial automation and corporate headquarters].

    • Glean new business insights from data, and harness it in real time.
    • Provide access to powerful data analysis and visualization tools.
    • Flexibility and usability for auditing and security manageability across SQL Server environment.

    Learn more

    Windows Embedded 8 Mission and Vision (from  Microsoft’s Intelligent Systems [Microsoft > Windows Embedded > Intelligent Systems page, May 7, 2013])

    Actionable data fueled by intelligent systems is the new currency for business, and its value is expected to increase exponentially, improving how people live, learn and conduct business. Gartner predicts that big data will “deliver transformational benefits to enterprises” in the coming 2-5 years, and that by 2015, enterprises that employ big data strategies “will begin to outperform their unprepared competitors within their industry sectors by 20% in every available financial metric.” (Source: Hype Cycle for Cloud Computing, August 2012.) With intelligent systems, Microsoft is helping organizations access and transform critical data into operational intelligence by providing a wide range of operating systems, tools, and systems and services.

    Our mission is to drive business growth and competitive advantage for our enterprise customers and partners through technology innovations that capitalize on the vast potential of data. Your investment in Windows Embedded is backed by Microsoft’s proven commitment to intelligent systems through more than 15 years of experience in the market.

    Industry Focus (from Microsoft’s Intelligent Systems [Microsoft > Windows Embedded > Intelligent Systems page, May 7, 2013])

    Intelligent systems are revolutionizing business and Microsoft is focused on driving innovation in retail and hospitality, healthcare, manufacturing, and automotive industries. Customized solutions built with Windows Embedded harness Microsoft technologies to address specific industry needs by connecting devices on the edge of enterprise networks with existing IT infrastructures on a single platform. The resulting intelligent systems help retailers deliver personal, seamless and differentiated experiences to customers; manufacturers increase efficiencies at every level of the operation to deliver innovative services, implement best-practice operations and enhance planning and decision-making processes; healthcare institutions optimize patient care and outcomes by bringing people, processes and information together; and automakers evolve “intelligent car” experiences, allowing drivers to access innovative in-car communication, infotainment, navigation and fuel-efficiency features.

    Our Solutions Approach (from Microsoft’s Intelligent Systems [Microsoft > Windows Embedded > Intelligent Systems page, May 7, 2013])

    Microsoft’s tools and technologies for intelligent system solutions extend beyond a software package or device; the great power and flexibility of industry devices running the Windows Embedded platform is that it works in concert with Microsoft’s cloud products and services, and with existing IT infrastructure to customize a complete connected system.

    Windows Embedded minimizes risk and complexity by providing one trusted platform with which to build solutions and broaden business opportunity. Windows Embedded fits with your needs, connecting data across a diverse set of technologies, providing compatibility across your existing systems, and enabling customization through a worldwide network of partners, to increase ease of use and drive efficiency. And a Microsoft solution extends the intelligence of your organization, increasing opportunities for your workforce to act on data and insights that would otherwise be out of reach.

    Adjacent to the above “Windows Embedded Products” page there is a “Product Lifecycles” page [May 7, 2013] which contains the following

    Road map for intelligent systems

    With Windows Embedded 8, Microsoft extends Windows 8 to intelligent systems, creating the next wave of enterprise tools and technology. The release schedule includes the Windows Embedded 8 family of device operating systems, each with a distinct feature set that includes the building blocks for an intelligent system across hardware, software and services.

    image

    This means that from the whole portfolio only “Windows Embedded Compact 2013” was missing on May 7, as it was to be delivered in Q2 2013. Clicking on its “+” sign yields the following description (corrections came in the first week of June: the v.3 mark after Blend and “sensory input and Kinect for Windows” were deleted, while XAML for Windows Embedded, multi-core support and Snapshot Boot were added; this also coincided with the availability of “Microsoft Windows Embedded Compact 2013 ISO-TBE” for download):

    A streamlined, componentized device operating system, Windows Embedded Compact 2013 gives developers all the tools they need to create the next generation of intelligent systems solutions. Compact 2013 provides the flexibility and real-time support to reduce time to market, while creating an easy-to-use, multi-touch experience that helps enterprise customers improve worker productivity.

    • Access to up-to-date tools such as Platform Builder, Visual Studio 2012 and Expression Blend v.3 helps developers to streamline development.
    • Support for XAML for Windows Embedded, multi-touch, sensory input and Kinect for Windows, and multi-core support enables the creation of immersive applications.
    • Leverage the power of cloud computing through Windows Azure Application Services, giving customers a greater ability to extend their intelligence.
    • Improved file system performance and Snapshot Boot give companies the confidence that their devices will always be available, whatever their current state.

    The first description of it was given in Windows Embedded Compact v.Next uncovered [Windows Embedded Blog, Nov 14, 2012] as follows

    Posted By David Campbell
    Program Manager

    Woo hoo, it’s finally time to share more information about the upcoming release! First, the release now officially has a name: Windows Embedded Compact 2013. (I know that folks probably have questions around why we chose this name. We thoroughly considered a long list of potential names, including Windows CE again, and Windows Embedded Compact 2013 really did receive the best response.)

    I’ll be doing a number of posts about the various key features and changes in Windows Embedded Compact 2013 over the coming weeks, but I want to start with arguably the most interesting of the new features: the investments made for Visual Studio 2012 support, both ISV/app development via Visual Studio directly, and the OEM/device development experience with Platform Builder, now hosted in Visual Studio 2012!

    With all development now in Visual Studio 2012, there is no longer a need for multiple versions of Visual Studio to support Compact development alongside other Windows platforms. Plus, you’ll get many of the new features and productivity improvements available in Visual Studio 2012 when developing for Compact! We now have the same C++ toolset and standards supported everywhere. (And of course Visual Studio 2012 includes the new features from Visual Studio 2010, which were not previously available to Compact developers.)

    We also have a new CRT, with key new functionality aligned as well. (The existing CRT on Compact hasn’t been updated in some time.) And the new optimizer supports functionality like auto-parallelization and auto-vectorization of your code, so if your processor has FP registers, the optimizer will automatically generate code to use vector FP. The 2012 C++ compiler also includes many of the language features from the new C++11 standard.

    C++11 has new language features that allow you to write better performing, safer code and code it faster than ever before. For example, RValue references let you operate on data without having to copy it. And C++11 brings in functional semantics to make writing code more efficient, like having anonymous functions. We also support range based loops, letting you iterate over members of a list directly. More information is available on the Visual Studio team blog.

    .NET CF has also been upgraded to 3.9, which inherits the Windows Phone updates while still being app-compatible with 3.5. This upgrade improves performance significantly in a number of ways. .NET CF 3.9 has greatly improved overall performance, as well as memory allocation and garbage collection using the generational garbage collector. This not only improves performance, but also provides more predictability in the execution of applications. The memory footprint of the runtime is also smaller for both the framework and applications, using what is known as “the sharing server,” allowing loaded code to be reused across applications. The runtime itself is also multi-core enabled, which can improve the performance of all your applications. More information on the updated .NET CF is available on the .NET Framework blog.

    The embedded developer experience improvements of bringing the new features of Visual Studio 2012 to Windows Embedded Compact are amazing, and I’m sure you’ll be as excited as I am to get started using the new features of Visual Studio 2012, Platform Builder and the new Compact OS.

    For information on the upcoming Windows Embedded Compact release, visit www.windowsembedded.com.

    Previous versions with some important new features (my own judgement + Windows CE Wikipedia article + other inputs):

    CE7: Windows Embedded Compact 7 (March 2011)
    – Silverlight for Windows Embedded (UX C++ XAML API): application development made easy, synching designers and developers.
    – Windows Phone 7 IE with Flash 10.1 support: panning, zooming, multitouch and viewing bookmarks using thumbnails, etc
    – Multi-core support
    CE6: Windows Embedded CE 6.0 (September 2006)
    – Significant change in architecture over previous versions of CE (process address space is increased from 32 MB to 2 GB, number of processes has been increased from 32 to 32,768 etc.)
    – Incremental updates to features as R1, R2 and R3 releases
    – Silverlight introduced, along with Microsoft Office and PDF viewer support.
    CE5: Windows CE 5.0 (August 2004)
    – Remote Desktop Protocol (RDP) introduced
    – Updates to Graphics and Multimedia support
    CE4: Windows CE 4.x (Jan 7, 2002)
    – .Net Compact Framework introduced
    – Since Windows CE .NET 4.2 the system uses a new shell with integrated Internet Explorer
    CE3: Windows CE 3.0 (June 15, 2000)
    – Major recode that made CE hard real time down to the microsecond level
    – Base for the Pocket PC 2000, Handheld PC 2000, Pocket PC 2002 and Smartphone 2002
    CE2: Windows CE 2.x (Sept 29, 1997)
    – Real-time deterministic task scheduling
    – Architectures: ARM, MIPS, PowerPC, StrongARM, SuperH and x86
    CE1: Windows CE 1.0 (November 1996)

    Related post: Introducing NETCF 3.9 in Windows Embedded Compact 2013 – a faster, leaner and multi-core runtime! [.NET Framework blog, Nov 16, 2012] 

    Ever since .NET Compact Framework was introduced at the PDC conference in 2001, programming with .NET has scaled from some of the smallest devices to the largest servers. With C# and Visual Basic, developers can apply the same skills to program both devices and servers to form a complete end-to-end solution. As devices become more prevalent in our daily lives, .NET is evolving too. Abhishek Mondal, the program manager for .NET Compact Framework [note that Abhishek Mondal was the program manager for GC as well], shares the following highlights of the latest version. –Brandon

    NETCF 3.9 advances the Windows Embedded Compact Platform

    We are happy to announce that we will be including the .NET Compact Framework 3.9 in Windows Embedded Compact 2013, as part of its upcoming release. We have made major updates in this version of the .NET Compact Framework, which deliver benefits in startup time, application responsiveness and memory utilization. You should see significantly better performance characteristics of your applications on both x86 and ARM devices, running Windows Embedded Compact 2013. You can read more about the Windows Embedded Compact 2013 release at the Microsoft News Center.
    The .NET Compact Framework is a version of the .NET Framework for embedded devices. It provides .NET development support for low-end embedded devices that run the Windows Embedded Compact 2013 OS. NETCF provides a familiar and rich development platform for embedded application development, with a small footprint and an extensive set of .NET functionality. For clarity, the other Windows Embedded OSes use the desktop .NET Framework, the same version that is included with desktop Windows.
    NETCF 3.9 is based on the NETCF version that shipped with Windows Phone 7.5. The following features are the key advances in NETCF 3.9, all big steps forward for app performance:
      • New Generational Garbage Collector for more responsive apps
      • NETCF runtime is now multi-core safe to take advantage of multi-core hardware  
      • Sharing Server feature that reduces working set and improves app launch
        Another major benefit of NETCF 3.9 is Visual Studio 2012 support! You will be able to use the same tools for Windows Embedded Compact 2013 development as you use for Windows, Windows Phone and Windows Azure development. Visual C++ development for this new Windows Embedded Compact version will also be supported in Visual Studio 2012, as reported on the Visual C++ team blog.

        Applications run (a lot) faster with NETCF 3.9

        NETCF 3.9 is a faster and leaner runtime for Windows Embedded Compact 2013. We have made many changes that should enable your apps to run much faster. NETCF is also multi-core safe, enabling you to take advantage of multiple cores on embedded devices. Multiple cores are increasingly available on today’s devices, and can be an important part of delivering a compelling experience to your customers. Let’s take a more in-depth look at some of the additional improvements that are part of NETCF 3.9.
        Faster app performance
        NETCF 3.9 has greatly improved performance overall. There are three key features that will speed up your apps. Let’s start with the new garbage collector in NETCF. We have observed app performance in the lab that shows 50-60% drops in GC time. We no longer see GC pauses significantly affecting app responsiveness in our lab apps, which was a problem reported in the past. The new GC is a lot faster!
        For apps that use floating point arithmetic code, you may notice an additional performance boost, since NETCF takes advantage of ARM VFP instructions.
        Last, we’ll look at the new Sharing Server feature. Sharing Server enables a significant improvement in the warm start-up time of your app, particularly in scenarios where multiple applications run on a device. It is able to achieve this benefit by sharing loaded assemblies and JITed machine code across apps (including re-launching the same app).
        Efficient memory utilization of managed application
        The Sharing Server feature also enables lower memory use for NETCF 3.9 apps. As already discussed, the Sharing Server allows code to be reused across applications. In addition to benefiting app launch performance, this feature significantly lowers the aggregate memory use of devices in scenarios where multiple apps are in use.

        Developing apps with NETCF 3.9

        You will find that NETCF is a great addition to a modern development environment. You can use Visual Studio 2012 for development, including features such as Team Foundation Server for source control and feature management.
        Visual Studio 2012 will support Windows Embedded Compact development
        The single most compelling attraction of this release for many of you is the support for embedded development in Visual Studio 2012. This support will simplify development if you are already developing for Windows or Windows Phone as well as Windows Embedded Compact, since you can do all of your work in a single Visual Studio environment.
        If you develop exclusively for the embedded platform, then Visual Studio 2012 support will enable you to use ALM tools and TFS in your development environment. There are also other benefits to Visual Studio 2012 such as performance improvements and other tools, which you can explore and enjoy.
        Here is a snapshot of a sample managed application developed using NETCF 3.9 with VS2012:
        [image: A simple “Hello World” application on NETCF 3.9]
        You can see this same app, running in Hyper-V, stopped at a breakpoint in Visual Studio 2012, using remote debugging:
        [image: Debugging a NETCF 3.9 app on Windows, using Hyper-V]
        NETCF 3.9 is source compatible with NETCF 3.5
        NETCF 3.9 is a big step forward for performance, however, the APIs that you’ve used in NETCF 3.5 stay the same. As a result, you can move existing .NETCF source code and investments forward to NETCF 3.9.
        You may wonder why NETCF 3.9 is source compatible with NETCF 3.5 and not binary compatible, since .NET assemblies are compiled to the machine-independent MSIL format. There is no change in our assembly format, nor are there any compatibility checks in NETCF 3.9.
        We chose a single compatibility message for Windows Embedded Compact 2013, for both native and .NET development, which is source compatibility. The single biggest driver of this support policy is C++, which needs to be re-compiled for the ARM Thumb-2 instruction set, with the new Visual Studio 2012 Visual C++ compiler. We have also found that many of you pair your managed code with native implementations. As part of re-compiling that native code, we expect that you may make API changes that would affect your P/Invoke declarations. As a result, we decided that a single compatibility policy for the whole release was the best choice.

        Wrapping Up

        If you are an embedded developer, I’m sure that you are excited that we are making NETCF 3.9 available to you for your embedded apps. We have already talked to a set of developers, who are looking forward to this big update, to significantly improve the runtime performance of the apps that you run on your devices. We look forward to seeing your new devices and the rich experiences that they deliver, after Windows Embedded Compact 2013 is released.
        NETCF 3.9 will be made available with the Windows Embedded Compact 2013 OS, when it is released. It isn’t available at this time. It will also be included in the SDK for the OS, for use with Visual Studio 2012. Watch the Windows Embedded Newsroom for future updates on release plans.
        Follow us or talk to us on Twitter: http://twitter.com/dotnet.

        Then here is the Windows Embedded Compact 2013 presentation @ Embedded World 2013 [kojtp2 YouTube channel, March 3, 2013]

        [2:55] The “Silverlight for Windows Embedded” name was changed to “XAML for Windows Embedded” because Silverlight was associated with a browser plug-in technology in developers’ minds while here we have nothing like that.

        Since this video has poor voice recording quality, it is also worth watching the Windows Embedded Compact 2013 Technical Overview of what’s new [Microsoft Webinar Live Meeting record, April 30, 2013], from which I will include the following slide screenshots and some transcripts of my own:

        [slide screenshots from the webinar]

        [21:35] Very cool news: „an entirely rewritten and upgraded .NET Compact runtime”


        [25:25] „XAML for [Windows] Embedded” [the name was changed from Silverlight to XAML] allows UI developers to write using Silverlight, in Expression Blend 5.0 with this release [vs. Blend 3 in the previous one], which will generate XAML describing the user interface and the user interactions. We link that in with native C++ code in the back-end, and that allows for extremely powerful interfaces while still allowing for high performance, since we use native code and there is nothing between us and the operating system, and nothing between us and the hardware, so we have much better performance, from a real-time perspective as well, not just a general performance perspective. … There is increased functionality [in this release] in terms of data binding and data context. … We’ve got new triggers that are supported. … This is still a very, very important area for Microsoft, and frankly, from an embedded perspective, XAML gives you in many ways a superior user-interface description environment compared to HTML5. … [27:30]

        [36:00] … all-up general SKU … NR SKU for personal navigation devices … and we are coming with a brand new SKU, „Windows Embedded Compact 2013 Entry”. And this is the SKU for smaller devices that don’t need XAML capability. … We haven’t announced our pricing for the 3 SKUs yet. That will be announced around the general availability [GA] timeline. … Windows Embedded Compact 2013 is still on schedule to ship in the first half of 2013. That means June. So we will be shipping and announcing the product in June. What we are giving now is a kind of sneak preview which will give you a technical introduction to the product. [27:53]

        From Q&A

        Using the same rendering engine as before

        XAML for Windows Embedded does not support C#

        More information:
        Windows Embedded Compact 2013 [MSDN Library, April 26, 2013]
        from which of particular interest are:
        What’s New (Compact 2013) [MSDN Library, April 26, 2013]
        Expression Blend and XAML for Windows Embedded (Compact 2013) [MSDN Library, April 26, 2013]
        XAML for Windows Embedded Application Development (Compact 2013) [MSDN Library, April 26, 2013]
        Developer Guides (Compact 2013) [MSDN Library, April 26, 2013]
        from which of particular interest is:
        .NET Compact Framework (Compact 2013) [MSDN Library, April 26, 2013]

        Windows Embedded Compact 7 [Windows Embedded Products Overview > Windows Embedded Compact 7 Product Details page, May 7, 2013]

        Windows Embedded Compact 7 is a componentized, real-time operating system designed for small-footprint devices at the edge of enterprise networks. With support for x86 and ARM architectures, Windows Embedded Compact 7 allows devices to leverage the latest innovations in hardware, and equips developers and device manufacturers with the tools they need to create nimble, enterprise-class intelligent system solutions, while reducing time to market.

        Top features 

        Rich user interface
        Includes XAML for Windows Embedded, a powerful technology that allows you to build interfaces that incorporate touch and gesture support.
        Flexible architecture
        Real-time operating system supports an array of hardware requirements and key processor architectures, including x86 and ARM, to power everything from tiny controls to fully automated factories.
        Secure and reliable
        One-tier security model feature is SDL compliant and helps ensure that only authenticated applications can run on an industry device, with reliable wireless connectivity and networking performance.
        Ease of development
        Familiar tools like Visual Studio and Expression Blend allow you to create attractive and intuitive user interfaces, and bring differentiated devices to market faster than ever before.

        Things you can do

        For Enterprises

          One trusted platform
          Devices running on Windows Embedded Compact 7 are covered under a 10-year support program from Microsoft. You can deploy industry devices with the assurance that technical support will be there, when and if it’s needed. And because you can continue using your existing applications based on Windows Embedded CE 6.0, there is a smooth upgrade path for using current applications while moving to the next generation of touch-enabled apps that provide an easy-to-use experience for getting things done more quickly.
          Meets your needs
          Arm your employees with a new breed of business applications harnessing touch and gesture input that showcase your company’s work and give employees better tools to get things done with intuitive access to information. Windows Embedded Compact 7 also provides a flexible device platform that can run on the smallest of devices, or power rich device experiences. And with the capabilities of a real-time operating system, you can be confident of its ability to meet the most exacting of industry requirements.
          Extend business intelligence
          Windows Embedded Compact 7 supports a variety of connectivity options, providing more flexibility for connecting industry devices to your company’s network. Support for enhanced WiFi, Ethernet, Bluetooth and USB enables you to deploy devices across your corporate network, where they can help automate business processes and generate data that leads to greater insight. Collectively, these devices can provide you with greater visibility into what’s happening throughout your company. As critical components of an intelligent system, these devices can help you make decisions in real-time, as well as formulate long-term plans for the growth of your business.

          For OEMs

          One trusted platform
          With Windows Embedded Compact 7, you can develop industry devices within the integrated environment of Platform Builder, to allow adjustments on the hard, real-time operating system while working on specific projects simultaneously. And support for Visual Studio 2008, Expression Blend 3.0 and the .NET Compact Framework 3.5 provides access to the tools that OEMs rely on. Windows Embedded Compact 7 also ensures a consistency of APIs and SDKs, making it possible to leverage past investments and current skillsets to create products that are supported by a 10-year support program from Microsoft, along with the assurance that Windows Embedded Compact 7 will be available for 15 years from the time it was first released.
          Create differentiated devices
          Windows Embedded Compact 7 includes a development framework based on XAML and supports a range of architecture options, including ARM, MIPS and x86. As a result, you have greater flexibility to create devices that match your customers’ specifications. Creating these experiences is simplified with tools such as 3D transformation and Pixel/Shader effects. Your devices will give customers the ability to seamlessly share content on business networks, as well as network devices. And the introduction of touch gesture interface allows developers to create a more natural, interactive experience.
          Extend business intelligence
          Create an experience that helps companies get more done. With Windows Embedded Compact 7, you can design a solution that’s more seamless, making it easier for companies to synchronize content with their Windows PCs. And with the Connection Manager feature and multiple connectivity options, you can ensure that businesses have the optimum level of connectivity across the workplace. Support for enhanced WiFi, Ethernet, USB and Bluetooth virtually guarantees that your device will connect with the other devices, PCs and servers already running in the enterprise. With this connectivity in place, employees will be able to remotely access Microsoft Outlook via Microsoft Air Sync. And the ability to view Adobe and Microsoft Office files will help them stay current on business developments.

            Sample device types

            Human machine interface (HMI)
            The devices provide the ability to monitor automated processes, such as manufacturing, to safeguard against diminished product quality or equipment breakdowns
            RFID scanners
            Speed the completion of common tasks such as inventory, shipping and receiving with these devices
            Medical devices
            Sonograms and other medical devices enable doctors to monitor a baby’s health in utero and send images to researchers in real time via a wireless network
            GPS devices
            Help people stay on course to their destination with these navigation devices

            News: Building the intelligent car of the future
            [Microsoft feature story, May 7, 2013]
            Microsoft: Working with automotive industry to design an updateable car that’s easier to use and responds to the driver’s needs.

            In the 1920’s, carmakers started offering an accessory that would revolutionize the driving experience: the radio. While tooling down the road you could tune into the nightly newscast, a live jazz performance or the seventh game in the series. It provided a connected experience that replaced the steady drone of the four liter under the hood with the soaring notes of Duke Ellington’s bugle or the crack of Babe Ruth’s bat as the ball hurtled toward the right-field stands.
            Since then, the notion of the connected car has changed. Features such as streaming music from your smartphone and using voice commands to control the stereo and environment are standard equipment in many models. And Microsoft has a vision for in-car technology that takes us beyond the confines of the cockpit to what they call the intelligent car — a scenario in which telematics data can help improve the driving experience, and the design of the vehicle.
            Led by Group Program Manager Pranish Kumar, the Windows Embedded Automotive team is focused on fulfilling this vision and, in the process, developing an upgradeable technology solution that extends the useful life of the vehicle.
            Says Kumar: “The automotive industry faces a lot of unique challenges, perhaps first of which is that cars must be supportable for much longer than consumer electronics devices — 10 or 20 years, in most cases. I think we’ve developed a solid understanding of some of these challenges and how technology can address them, while providing drivers with a better experience.”
            Microsoft’s Pranish Kumar and his team work to develop reliable in-car experiences, not by sitting at a desk but by working behind the wheel of a fleet of test vehicles.

            A relationship built on experience and trust
            Microsoft’s involvement in the automotive industry stretches back 15 years to 1998 when the company partnered with Clarion to announce the Auto PC, a first-of-its-kind solution that gave drivers access to email, driving directions, paging and traffic alerts, and their entertainment system. And in 2003 Microsoft developed the Microsoft TBox, a telematics device that went on to power infotainment systems for a variety of carmakers.
            When it came to working directly with carmakers, Kumar says it was an uphill battle to gain their trust. Many had tried to design their own infotainment system and were convinced that it couldn’t be done in a shorter time than seven or eight years. Microsoft has since proven itself by reducing development time down to just two to three years.
            Kumar’s team also adopted the same level of rigor and many of the testing methodologies that carmakers use when conducting customer road tests. Making this change gave the team a “greater degree of confidence” that their development and reporting processes met the carmaker’s need and that the finished product would meet or exceed the driver’s expectations.
            From the connected car to the intelligent car
            For carmakers, the Promised Land lies in giving drivers the ability to access information and services anywhere they live, whether an app on their smartphones, a music file on their tablet at home, or customer contact information on their computer at work or in the cloud. Over time, members of the Windows Embedded Automotive team have earned a reputation for providing solid insight to help make these experiences a reality.
            Together with Kumar, Creative Director John Hendricks, Principal Program Manager Jay Loney, Partner Development Manager David Kelley, and Experience Designers David Walker and Melissa Quintanilha are part of a larger team developing and designing the future of Microsoft’s automotive technologies.
            Top Gear U.S.’s Tanner Foust talks with Microsoft engineers and designers about their vision for the future intelligent car.
            In doing so, they are moving away from a focus on creating in-dash technologies, such as the entertainment or navigation systems, to an emphasis on creating a solution that would power these technologies as part of an overall user experience. Taking this approach has given carmakers the ability to provide periodic updates that refresh the driving experience and extend compatibility to the latest consumer devices.
            In the future Microsoft wants to take that experience a step further. Whereas today consumers demand a car that’s more connected — to their phones, their music and their services — Windows Embedded Automotive is focused on designing intelligent cars that respond to the driver’s needs.
            One example that Kumar cites involves the difficulty of pairing new phones, which is one of the most frequent problems facing car owners. According to IDC, 722 million smartphones were shipped globally in 2012, a 46.1 percent increase over the previous year.[1] As demand for smartphones continues, ensuring compatibility between new models and infotainment systems will remain a challenge.
            A Windows Embedded-based system could transmit data about the unsuccessful pairing to Microsoft and overnight a solution could be identified and downloaded to the car. When the owner gets in his car the next morning, his phone would automatically pair. Over time, that same data could be used to design a user experience that’s not only easier to use but that performs tasks on your behalf, such as tuning to your favorite station or rescheduling a meeting due to traffic delays.
            Drivers also stand to gain from the availability of data. Many vehicles contain sensors that monitor factors such as speed, braking, fuel consumption, tire pressure and environmental conditions. Drivers can already use this information to assess their performance and get recommendations on how to improve fuel efficiency or vehicle maintenance.
            Using the same data, carmakers could augment the existing battery of tests that are part of their proving process. So in addition to putting a vehicle through the environmental extremes of Northern Sweden or California’s Death Valley, they could evaluate its performance in day-to-day conditions. Engineers and product planners could get a head start on the next year’s model through insights around where design improvements are needed or where a car has been over-engineered. They could even fine tune an engine over-the-air to improve fuel economy of the current model year.
            Kumar believes that many of the systems are already in place to make this vision a reality. Using technologies such as Windows Update, cars could be automatically updated — in much the same way as smartphones automatically update when you activate them. And the combination of big data and machine learning could lead to cars that develop an understanding of your preferences and driving behavior to become more responsive to your needs.
            “We’ve come a long way in terms of creating a product that works reliably and meets the quality standards of the automotive industry. And we’re continuing our work with carmakers to reach the full potential of in-car technology,” says Kumar. “Through a combination of software, hardware and user-centric design, we believe that car owners will experience driving like never before possible.”
            [1] IDC Worldwide Mobile Phone Tracker, Jan. 24, 2013

            See also: Maximizing Internet Explorer in Windows Embedded Compact 7 [Windows Embedded blog, June 11, 2012]

            Windows Embedded Compact has a customized version of Microsoft’s Internet Explorer named Internet Explorer (IE) for Embedded. This powerful browser can be used in a number of ways in an embedded system to enhance the functionality of the system. This post will discuss the various ways to tune, customize and even embed IE for Embedded inside embedded applications.
IE for Embedded is a customized version of desktop Internet Explorer 7 with performance enhancements added from IE 8. Specifically, the JScript engine brought over from IE 8 provides a 400% performance improvement over the original IE 7 scripting engine. In addition, the browser includes gesture support along with zoom and pan.
Internet Explorer for Embedded is fundamentally an HTML rendering engine. As such, the user interface surrounding the engine (the “chrome”) isn’t really part of IE for Embedded. Windows Embedded Compact comes with two examples of IE for Embedded: one with classic “Windows” controls and the other with the chrome rendered in the XAML-driven Silverlight for Embedded framework. Both examples come with source code that demonstrates how to host the IE control. They also both illustrate that almost all of the functionality of these browsers is contained within the control itself; the chrome only provides input from the user and a platform for returning feedback.
The classic browser example, IESample, supports a favorites list, browser history and URL completion. It incorporates an internet control panel that can tune how the browser connects to the web as well as adjust security settings. The XAML-based browser, IEExr, has a vastly different look and feel. However, it too handles a favorites list, history and control panel. IEExr even supports tabbed browsing, using a thumbnail page to switch between pages. The reason the two examples have similar features is that most of the functionality is incorporated in the IE ActiveX control itself.
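The hosting pattern the samples demonstrate is the standard one for browser ActiveX controls. As a rough, Windows-only illustration (assuming the generic desktop WebBrowser control rather than the actual IESample/IEExr source, which is not reproduced here), the chrome side of such a host boils down to creating the control and feeding it user input such as a URL:

```cpp
// Minimal sketch of hosting a browser ActiveX control from native C++.
// Windows-only, error handling trimmed. The IESample/IEExr samples in
// Windows Embedded Compact 7 follow the same general pattern with the
// IE for Embedded control; this uses the generic CLSID_WebBrowser.
#include <windows.h>
#include <exdisp.h>   // IWebBrowser2, CLSID_WebBrowser

int main()
{
    OleInitialize(NULL);

    IWebBrowser2 *browser = NULL;
    HRESULT hr = CoCreateInstance(CLSID_WebBrowser, NULL,
                                  CLSCTX_INPROC_SERVER, IID_IWebBrowser2,
                                  reinterpret_cast<void **>(&browser));
    if (SUCCEEDED(hr))
    {
        // The control does all rendering, history, favorites, etc.;
        // the hosting "chrome" only supplies input such as this URL.
        BSTR url = SysAllocString(L"http://www.example.com/");
        VARIANT empty;
        VariantInit(&empty);
        browser->Navigate(url, &empty, &empty, &empty, &empty);
        SysFreeString(url);
        browser->Release();
    }

    OleUninitialize();
    return 0;
}
```

In a real host the control must also be sited in a window through the ActiveX container interfaces (IOleClientSite and friends); the shipped sample source shows that part in full.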

            Silverlight for Windows Embedded (Windows Embedded Compact 7) [MSDN Library, Jan 23, 2013]

            Microsoft Silverlight for Windows Embedded is a native (C++) UI development framework for Windows Embedded Compact powered devices that is founded on Microsoft Silverlight 3. You can use Silverlight for Windows Embedded to do the following:

            • Separate programming logic and UI design.
            • Define visual UIs for applications in XAML.
            • Add, modify, and customize the UI at run time.
            • Create interactive multimedia UIs.
            • Collaborate with designers who use Microsoft Expression Blend 3 projects.
            • Simultaneously develop applications for Microsoft Silverlight 3 and Silverlight for Windows Embedded with a common UI defined in XAML files.
            Silverlight for Windows Embedded is compatible with Silverlight 3 XAML and provides a set of equivalent classes for supported XAML elements. For information about Silverlight 3, see http://www.silverlight.net/.
            Silverlight for Windows Embedded is also compatible with existing Windows Embedded Compact window controls, so you can use your existing window controls.
            To add this feature to your OS, see Silverlight for Windows Embedded Catalog Items and Sysgen Variables.
            For reference information, see Silverlight for Windows Embedded Reference.

            For step-by-step guidelines and code examples to help you learn how to create a UI by using Silverlight for Windows Embedded, see Silverlight for Windows Embedded Application Development.
            For recommendations on which hardware to use with Silverlight for Windows Embedded, see Silverlight for Windows Embedded Hardware Recommendations.

            More information:
            Differences Between Microsoft Silverlight 3 and Silverlight for Windows Embedded [MSDN Library, Jan 23, 2013]
            Silverlight for Windows Embedded Application Development [MSDN Library, Jan 23, 2013]

            Microsoft Silverlight for Windows Embedded is a “UI development framework” for “embedded devices” and is based on Microsoft Silverlight for the desktop browser. By using Silverlight for Windows Embedded, you can create an application that supports features such as storyboard animations, transformations, interactive controls, a layout system, and a visual tree.
            Silverlight for Windows Embedded is a native C++ development framework in which you can design a UI for the shell and applications for a Windows Embedded Compact device. You can use Microsoft Expression Blend 3 to quickly design a UI in Extensible Application Markup Language (XAML), which you can then convert, or you can build your application from scratch in Microsoft Visual Studio 2008 by using one of the Smart Device project templates. In the native C++ source files for your application, you can use the rest of the features of Windows Embedded Compact 7, including any existing window controls.
            By using Silverlight for Windows Embedded, you can create a UI that provides advanced visual effects for your Windows Embedded Compact device shell and applications. Silverlight for Windows Embedded makes this possible by supporting a subset of Silverlight XAML elements and by supplying a set of C++ classes that provide access to these elements.
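To make the logic/design separation concrete, here is a minimal hand-written XAML fragment of the kind a designer might produce in Expression Blend 3 and hand to a C++ developer; the element names are illustrative, not taken from any shipped sample:

```xml
<!-- Illustrative Silverlight for Windows Embedded UI. The x:Name values
     are what the native C++ code-behind uses to locate elements at
     run time and attach event delegates to them. -->
<Grid xmlns="http://schemas.microsoft.com/winfx/2006/xaml/presentation"
      xmlns:x="http://schemas.microsoft.com/winfx/2006/xaml"
      Background="White">
    <StackPanel VerticalAlignment="Center">
        <TextBlock x:Name="StatusText" Text="Ready" FontSize="24"/>
        <Button x:Name="StartButton" Content="Start" Width="120" Height="40"/>
    </StackPanel>
</Grid>
```

The C++ application loads this markup, looks the named elements up, and wires handlers to them; the markup itself carries no programming logic, which is what lets designers and developers work on the same UI in parallel.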

            Graphics and Performance in Silverlight for Windows Embedded (Windows Embedded Compact 7) [MSDN Library, Jan 23, 2013]
            Hardware Acceleration in Silverlight for Windows Embedded (Windows Embedded Compact 7) [MSDN Library, Jan 23, 2013]

            Many modern device platforms include on-board graphics processing units (GPUs) with two-dimensional or three-dimensional capabilities or both. Microsoft Silverlight for Windows Embedded provides support for using a GPU to accelerate certain types of animations. Hardware acceleration is accomplished by using the GPU (rather than the CPU) to do some critical composition steps in the rendering process. Silverlight for Windows Embedded supports hardware-based acceleration of graphics for both Microsoft DirectDraw and OpenGL.
            For information on how to implement hardware acceleration, see Implement Hardware Acceleration for Graphics in Silverlight for Windows Embedded [Reference].

            With Visual Studio 2012 Update 2 Now Available [Somasegar’s blog on MSDN, April 4, 2013]

            It includes support in Blend for SketchFlow, WPF 4.5, and Silverlight 5.

            which according to ANNOUNCING VISUAL STUDIO 2012 UPDATE 2 CTP 2 [Blend Insider, Jan 30, 2013]

Blend for Visual Studio [as part of a consolidated designer/developer offering retained only from previous Expression products that were phased out with Visual Studio 2012] now supports WPF, Silverlight and SketchFlow projects in the same version of Blend (support for these was previously available only as a standalone Preview release of Blend). With this CTP release, Blend now supports developing Windows Store, Windows Phone, WPF and Silverlight apps without needing to have multiple versions of Blend on the same machine. The table below highlights the various platforms that are now supported in Blend for Visual Studio 2012:

TARGET PLATFORM              VERSIONS SUPPORTED                    SPECIFIC REQUIREMENTS
Windows Store XAML and HTML  Windows 8                             Windows 8
Windows Phone                Windows Phone 8, Windows Phone 7.5    Windows Phone 8 SDK
WPF                          3.5, 4.0, 4.5                         (none)
Silverlight                  4, 5                                  (none)
SketchFlow                   WPF 4.0 and Silverlight 4             Visual Studio 2012 Premium or higher

            Additional details:
– in Silverlight 5 Beta – available now! [Silverlight Team blog on MSDN, April 22, 2011] and in Silverlight 5 Available for Download Today [Silverlight Team blog on MSDN, Dec 9, 2011]
            – in What’s New in Silverlight for Windows Phone [MSDN Library] (which is part of Silverlight for Windows Phone [MSDN Library])

            Silverlight for Windows Phone OS 7.1 is based on Silverlight 4. That means if you create a new Silverlight for Windows Phone application that targets Windows Phone OS 7.1, you can take advantage of several new features. You can still write applications that target Windows Phone OS 7.0, but to take advantage of the new features, you must target Windows Phone OS 7.1. Applications that target Windows Phone OS 7.0 will run on devices running Windows Phone OS 7.1. This topic introduces some of the new features and improvements in Silverlight for Windows Phone.

            – in What Version is Windows Phone Mango? [Shawn Wildermuth blog, Aug 19, 2011]

            In finishing up my new Windows Phone book, I had to deal with the confusing version problem. There are three version numbers to be aware of:

            • Windows Phone 7.5
            • Windows Phone OS 7.1
            • Windows Phone SDK 7.1

            So what is Mango? It comes down to this:

            • Windows Phone 7.5: The marketing name of the phone. This is the phrase you’ll see in the ads to consumers.
            • Windows Phone OS 7.1: The name of the actual operating system. When you create a new application in Visual Studio (or upgrade an existing one), you’ll see this version.
            • Windows Phone SDK 7.1: The name of the Mango tools.

            So get your nomenclature right and stop being confused.

            Features Differences Between Silverlight and Silverlight for Windows Phone [MSDN Library]
            Implementation Differences Between Silverlight and Silverlight for Windows Phone [MSDN Library]

            Microsoft betting on boosting Windows RT demand with top level ARM SoCs from its SoC partners, Windows 8.1 enhancements, Outlook addition to the Office 2013 RT and very deep tactical discounts to its OEM partners for tablet offerings of more value and capability

            … especially valuable for small businesses, and even enterprises of different, larger sizes thanks to new enhancements in manageability, networking, and security announced at TechEd North America 2013 (see “Cloud first” from Microsoft is ready to change enterprise computing in all of its facets [this same ‘Experiencing the Cloud’ blog, June 4, 2013]).

            Relevant excerpts from Nick Parker, Tami Reller, Antoine Leblond and Steve Guggenheimer: COMPUTEX 2013 Keynote Transcript [Microsoft, June 5, 2013]

The full recording of the keynote is available from Notebookitalia; it contains the excerpts below between [10:49] and [19:50], as indicated.

            Tami Reller, Chief Marketing Officer and Chief Financial Officer, Windows:

            [10:49] Bringing the power of Windows to tablets is a really big part of the vision of Windows 8 and of Windows RT, really a new class of tablets that offers more value and capability than today’s tablets. […]

            [15:00] Windows tablets are an important part of the Windows 8 vision, and Windows tablets do more.

            Completing that promise of do more, I’m pleased to announce that starting with the back-to-school lineup, and in some cases even earlier, Windows x86 tablets will come with Office. That’s Word, that’s Excel, PowerPoint, and OneNote in the box. We’re making that possible through new OEM offerings that were introduced earlier this spring.

            Even with the value of Office built-in to these Windows tablets, these new offerings are going to allow our partners to build opening price point tablets, as well as great premium tablets.

            Additionally, we’ve opened up support for small tablets with Windows 8, and we’ll do more with Windows 8.1. You’ve seen the first of those tablets here at COMPUTEX. Congratulations to Acer on their announcements earlier this week.

            And coming with 8.1, building on our support for small tablets, we’re really committed to completing the scenario, including full portrait support.

            One of the top requests from Windows RT customers has been Outlook. I’m very pleased to announce that with the Windows RT 8.1 update Microsoft Outlook will be in-box.

            With 8.1 we’re again embracing the very latest technology, and the very latest on the silicon roadmap. Specifically this includes Bay Trail-T, Qualcomm 8974 [one of Snapdragon 800 SoCs coming in commercial devices of H2 2013, see more details in Snapdragon 800 Product Brief], and NVIDIA T40 [or Tegra 4 first in the already announced HP SlateBook x2 to be available in August 2013].

            And we’re expanding our ARM program to provide more component flexibility, creating more opportunities for partners to build competitive ARM tablets running Windows. [17:15 …]

            [19:10] Windows 8.1 is easy for our customers to get. It’s free to Windows RT and Windows 8 customers so that whether a customer has Windows 8 today or is buying a PC or a tablet or any other device in the near future, it will be one click away and very easy to get Windows 8.1. We’ll deliver it through the Windows Store, including the preview, which will come at the end of June. And the final product will be available later this calendar year. [19:50 …]

            New ecosystem opportunities, Windows 8.1 updates shared at Computex [Blogging Windows from Microsoft, June 5, 2013]

            Antoine Leblond, corporate vice president of Windows program management joined Tami and other top Microsoft executives on stage to give our very first public demo of the upcoming Windows 8.1 update – touching upon many of the exciting improvements Antoine highlighted in his blog post from last week. You can see some of the highlights of what to expect in Windows 8.1 for yourself in this short demo video featuring Jensen Harris from the Windows User Experience Team:

            Jensen Harris from the Windows Team shows some highlights of what to expect in Windows 8.1 coming later this year as a free update for Windows 8 customers. http://bit.ly/10OM2Th

            Additionally, Tami announced that Outlook 2013 RT will be coming to Windows RT tablets as part of Windows 8.1. Windows running on ARM architectures has enabled an exciting new category of mobile-first, instant-on tablets that are thin and lightweight, with amazing battery life. We know that the addition of Outlook for those using ARM-based Windows devices such as the Surface RT, Dell XPS 10, Lenovo Yoga 11, and ASUS VivoTab RT as well as new tablets to come in the future has been a popular request from consumers and businesses alike. As Tami said in her keynote address, we’ve listened and Outlook will be joining the other Office applications currently available on Windows RT, including Word, Excel, PowerPoint, and OneNote.

            Our commitment to Windows on ARM doesn’t stop with the addition of Outlook 2013 RT. We announced a number of other enhancements with Windows 8.1, earlier this week at TechEd North America, including new manageability, networking, and security capabilities that will make Windows RT an even more compelling option for enterprises.

            Eight questions about Windows 8 for Microsoft manufacturing chief Nick Parker [PCWorld, June 5, 2013] 

            IDG: So you just announced you’ll be including Outlook with the next version of Windows RT, what was the thinking behind that?

            NP: Outlook is one of those apps people love, and when you start thinking about RT in the small business environment, or for heavy email users, Outlook is one of those high value solutions. That was the one we got the most feedback about.

            IDG: The reception for Windows RT has been a bit lukewarm, what are some of the reasons for that and to what extent will adding Outlook will improve the situation?

            NP: If you look at what we did with RT—it’s completely new silicon, a new hardware platform, and Windows 8 is a new OS. So first you just have a natural growth curve when you’re starting at zero. Then you start seeing new apps appear, the killer apps that people want, like Outlook. And the ecosystem gets more familiar with it—they learn how to code to it and how to certify parts for it.

            We get so used to the tremendous success we’ve had on PCs for years, you just think you can flip a switch and the platform’s going to change. I think it’s just the incremental growth of a new platform. And we should be a bit humble about how we go to market and talk about the new capabilities. I think we could maybe have inspired people a bit more with some of the RT devices and some of our marketing.

            IDG: There’s a lot of downward pressure on tablet pricing—Asus showed an Android tablet this week for $129. Do you expect to see Windows 8 tablets getting down to those sort of prices?

            NP: That’s a question to ask our OEMs [original equipment manufacturers, or basically PC makers]. I think people are prepared to pay for value and we see tablets with higher price points having better capabilities and features. I think buyers are getting smart about what’s good quality. But OEMs will choose their own prices.

[Image: The Acer Iconia W3-810 tablet]

            IDG: We saw the first 8-inch Windows tablet launch this week from Acer. What are some of the things you’re doing to provide a better Windows experience on those smaller devices?

            NP: For any device you can hold in one hand, one of the things you need is portrait mode—so, the ability for the apps to work in the same way, to move and to flow nicely. And for our OEMs, we’re giving them the ability to have buttons on the side of the device, because when you’re holding it in one hand you might want to push a button on the side. You have to make the OS extensible. So those are the types of things.

            IDG: Will that all be part of Windows 8.1?

            NP: Yes, we talked about that today.

            IDG: I’ve never thought of Windows as being designed for smaller screens; the netbook experience wasn’t particularly great. What are you doing to improve the software experience?

            NP: In terms of how the display scales up and down, and in terms of the zooming capabilities—as soon as the preview [of Windows 8.1] comes out you should play with it.

            IDG: There’s a tremendous variety of form factors out there right now—all kinds of laptops and tablets and convertibles. When you look ahead a few years, do you expect them to coalesce around a few winning designs or will there always be that much variety?

            NP: In terms of capabilities, I think touch is going to be the new standard. People aren’t going to want to carry around hundreds of devices. You’ll have a phone, and I think the phablet is an interesting space. But for two-in-one detachables—I’m seeing the interest in those ramp. People want the best of both worlds. You can have a tablet and sit there and surf, then you plug it into a keyboard and you’re off working.

            IDG: Is the keyboard here to stay, or will people eventually get used to typing on touchscreens?

NP: I think the keyboard is here to stay; you’ve got that physical feedback. You may see a lot of innovation around keyboards, but I think they’re here to stay.

            Google search on “Computex Windows ARM discount” between June 5 and 6 was yielding the following items:
            One year after debut, Windows RT is a Computex no-show | The Verge | OSNews | I4U News
            New ecosystem opportunities, Windows 8.1 updates shared at Computex | Blogging Windows [from Microsoft]
            Microsoft to include Outlook app with update to Windows 8 RT | ARN [Australia]
            Microsoft Aims to Lure More Users to Windows | WSJ.com
            Microsoft To Give More Tablet Makers Windows 8 Discounts | NASDAQ.com | 4-Traders | Capital.gr
            Microsoft to Offer Discounted Windows and Office for Small Tablets | AllThingsD | CELLIFONE.com
            AMD breaks from Windows exclusivity, adopts Android and Chrome OS | Facepunch.com
            Forget Haswell: Why tablet processors mean more to Intel at Computex | The USA News Online 
            Computex 2013: low-cost tablets, high-res laptops steal the show | Techgoondu
            Microsoft says Outlook is coming to Windows RT this year | ZDNet
            Microsoft demonstrates Windows as a platform for small tablets, touch and mobility at Computex 2013 | Virtualization Journal [replica of Microsoft press release]

            Windows RT is a Computex no-show:

            Three days into Computex Taipei, Asia’s biggest computer show, not a single manufacturer has announced a Windows RT device. … The Computex show floor has been dominated by devices running Windows 8 on Haswell and other chips from Intel, but ARM-powered units have been conspicuous in their absence.

            However, the upcoming Windows 8.1 update and its RT counterpart could provide a shot in the arm to the fledgling OS. Qualcomm has pledged support for RT 8.1 with its new Snapdragon 800 processor, which president and COO Steve Mollenkopf described in a presentation today as offering “about 75 percent better performance than the S4 Pro.”

            The Verge has heard that manufacturers may be holding back RT devices for Qualcomm’s new chip and the 8.1 update, which is also designed to improve the experience on smaller-screened devices.

            include Outlook app with update to Windows 8 RT:

            Outlook will be included with version 8.1 of Windows RT, previously dubbed Windows Blue, Microsoft announced at the Computex trade show in Taipei on Wednesday. The 8.1 update is scheduled for release later this year as a free update to Windows 8.

            “We’re always listening to our customers and one piece of feedback was that people want the power of Outlook on all their Windows PCs and tablets,” Microsoft said. […]

            Support for RT from hardware makers has been limited, however, with several PC makers, such as Acer, Asustek Computer and Hewlett-Packard, not yet supporting the OS.

Microsoft hopes to change that by addressing one of the criticisms of Windows RT — that it doesn’t include a version of its popular Outlook email client. Nvidia CEO Jen-Hsun Huang has been vocal about the importance of adding Outlook to RT.

            “If Outlook were to show up on RT, my life would be complete,” he said recently, lamenting the slow sales of Windows RT tablets. “I am one Outlook away from computing nirvana. Outlook god, please…”

            Lure More Users to Windows:

            Until now, people with Windows RT devices—which use different kinds of computer chips than those common in personal computers—have only been able to use a new type of email app that has been panned by users.

            A Microsoft executive, speaking at the Computex computer trade show in Taiwan, also acknowledged the company is cutting the prices it charges computer makers for Microsoft software.

            The executive, Nick Parker, didn’t detail the size of the software discounts. But people familiar with Microsoft’s pricing strategy have said for Windows RT devices, Microsoft is cutting by two-thirds the cost to license Windows and Office software, or roughly $100 before marketing rebates Microsoft offers to PC makers.

            Microsoft’s discounts apply to tablets smaller than 10.1 inches, Mr. Parker said. The company said it started offering discounts to some tablet makers in April.

            The discounts and addition of Outlook underscore how hard Microsoft is trying to boost the appeal of devices that run Windows RT, a product whose development marked a major break from company tradition. […]

            “This is an exciting development that we believe will deliver a much more robust and full-featured experience to Windows RT users,” wrote Mark Aevermann, an Nvidia product manager, in a blog post.

            Microsoft executives have said they would push harder to bolster sales by explaining more clearly the attributes of Windows RT and ARM chips.

“We are very committed to ARM,” said Tami Reller, the Windows chief financial officer and chief marketing officer, in an interview last month.

            Windows executives also recently suggested Windows RT devices might in the future lose the dual modes that have been a polarizing feature of the new Windows.

            Windows 8 and Windows RT devices operate in both a traditional Windows “desktop” and a new mode that looks and functions more like a smartphone screen. The Windows executives, Jensen Harris and Antoine Leblond, suggested in a May interview that it might be appropriate to junk desktop mode entirely on Windows RT devices.

            Windows 8 Discounts:

            Nick Parker, vice president of Microsoft’s OEM division, said at the Computex trade show in Taipei Wednesday that the Redmond, Wash. company is now expanding its discount program to include tablets that run on Windows RT, a version of Windows 8 running on ARM Holdings PLC (ARMH, ARM.LN) chips. The discount will also apply to an upgraded version of its Windows 8 system dubbed Windows 8.1. The discounts will only apply on tablets that are between 7 and 10.1 inches. The executive declined to comment on the size of the discounts but Mr. Parker said they will come in the form of a cut in licensing fees and free Office software for hardware makers.

            Microsoft said it started offering discounts to some tablet makers in April and there is no specific time frame for when the discounts might end.

            The Wall Street Journal reported in early March, citing people familiar with the situation, that Microsoft had been offering price breaks on its Windows 8 and Office software to help spur the development of small, touch-enabled laptop computers.

            In the latest discount program, tablets with screens bigger than 10.1 inches will not be eligible for the discount, Mr. Parker said. But he didn’t elaborate.

            Analysts said the discounts could help bring down retail prices of smaller Windows tablets and help Microsoft better compete with Apple Inc. (AAPL) and Google Inc. (GOOG).

            Discounted Windows and Office for Small Tablets:

            Second, Microsoft is cutting some sort of deal with computer makers that want to bundle Windows 8 and Office Home and Student onto a seven- or eight-inch tablet. Microsoft isn’t going into detail on what it is charging PC manufacturers, but it is clearly low enough to enable some pretty inexpensive tablets.

            The first of these tablets to be announced, Acer’s Iconia W3, has a $379 sticker price. That’s pretty darn cheap for a machine that includes full-blown Windows and Office.

            Microsoft isn’t saying which other computer makers may also be working on small tablets, but with the PC market struggling, it seems reasonable to think we will see a number of such tablets in short order.

            And while Microsoft’s bundle program appears limited to small tablets, one could conceivably hook up the tiny tablet to a monitor and keyboard and use it as a home PC.

            low-cost tablets, high-res laptops steal the show:

            Since Amazon’s Kindle Fire and Asus’ Nexus 7 came out last year, the idea of a cheap, small tablet has taken hold like few expected. This year, the cheap is going to get cheaper, with Asus’ MeMo Pad HD7 starting from just US$129 for an 8GB version.


            Now, this may not be as cheap as some models you’d find in Shenzhen, but this model from Asus will win over many users looking for an affordable but well-made tablet.

The new MeMo Pad HD7 also seems like an updated version of the successful Nexus 7. There is the 1,280 x 800 screen, now coupled with a quad-core ARM Cortex-A7 CPU, and a microSD card slot to pop in memory cards, which the Nexus 7 did not have. No idea of when this is coming, but expect to save some money for a budget tablet this holiday season. […]

            An interesting idea, which may not turn out to be a major trend, is small Windows tablets. Acer surprised many visitors with its 8-inch Windows 8 tablet, probably the first such mobile option.

            The Iconia W3 runs an Intel Atom chip, has 2GB RAM and either 32GB or 64GB storage. The 1,280 x 800 resolution is not too bad on the small screen.


            What’s a little hard to see is the Windows desktop, when you fire up your traditional Windows programs, like Excel. During a quick hands-on, I can tell that the screen was too small for serious editing. Don’t even think of sharing programs on the screen. It’s just too small.

            Which leaves you in mostly the Metro touch interface on Windows 8. Sadly, there aren’t many apps here yet, compared to either an Apple iPad mini or an Android tablet.

            Not just that, while the US$379 asking price isn’t unreasonable for the hardware, the question is on usage. If you’re using the machine mainly as a small tablet, Android tablets are getting cheaper all the time, as Asus’ MeMo Pad HD7 shows.

            Outlook is coming to Windows RT:

            Owners of existing RT devices will receive the updates for free.

            Despite weak sales of its own ARM-powered Surface and even more tepid support from hardware partners, Microsoft doesn’t appear to be backing away from Windows RT. The addition of Outlook will undoubtedly convince some previously recalcitrant business buyers that Windows RT tablets make sense, as will the announcement at the Tech-Ed conference this week of management tools that allow greater control over Windows RT devices. And Microsoft also announced support for additional types of Virtual Private Networks (VPNs) on Windows RT.

            But there are still dealbreakers that stand in the way of widespread deployments of Windows RT. Office 2013 RT has many of the same features as its x86/x64 counterpart, but it lacks the ability to handle custom macro code. In addition, some features are missing from the RT programs, including the ability to embed audio and video in OneNote notebooks.

            And Office is the only desktop app that Microsoft has officially ported to Windows RT. Third-party developers don’t have that option, which means any business that requires a third-party desktop app or a browser plugin other than Adobe Flash is out of luck. Likewise, Windows RT still doesn’t support some widely used third-party VPN clients.

            There’s also the pesky issue of licensing. The version of Office included with Windows RT is Office Home and Student 2013, which is licensed for noncommercial use only. If you want to stay in the good graces of Microsoft’s licensing agreement, you need to add commercial use rights, through a volume license or by way of a subscription to a business edition of Office 365.

            Today’s announcement is also noticeably silent on the question of when Microsoft plans to release native tablet versions of its Office programs, for both Windows 8.1/RT as well as alternative platforms like the iPad and Android tablets. The fact that the desktop version of Outlook is a key part of this fall’s update suggests that Office for tablets won’t appear until 2014, and one recent rumor says late 2014 is the likely target date for those apps.

            Windows as a platform for small tablets [Microsoft press release replicated]:

            “We want to be the best partner to all hardware manufacturers, from the way we engage and invest on new product designs to the experience we jointly deliver to customers,” Parker said. “This new wave of Windows devices from our partners, combined with our software, apps and services, reflects that commitment.”

            Most notable of the devices Parker showed were the new 7-inch and 8-inch Windows tablets: the Acer Iconia W3 that launched on June 3 in Taipei, and three other small tablets from top original device manufacturer (ODM) and original equipment manufacturer (OEM) partners expected to ship for the holiday season. These small tablets provide a Windows experience with Office Home & Student 2013 delivering even more options to experience all that Windows can offer in a smaller form factor. […]

            Tami Reller, chief financial officer and chief marketing officer of Microsoft’s Windows Division, joined Parker onstage … “Windows 8.1 furthers the bold vision of Windows 8 by responding to customer feedback and adding new features and functionality that advance the touch experience and mobile computing’s potential,” Reller said.

            As part of this commitment, Reller announced that Outlook 2013 RT will be available on Windows-based ARM tablets with the Windows 8.1 update later this year. “Windows on ARM is a core part of our strategy today and moving forward, and the addition of Outlook further enriches this world of new on-the-go opportunities for partners and customers,” Reller said.

            “Cloud first” from Microsoft is ready to change enterprise computing in all of its facets

            … represented by these alternative/partial titles explained later on in this composite post:
            OR Choosing the Cloud Roadmap That’s Right for Your Business [MSCloudOS YouTube channel, June 3, 2013]
            OR Microsoft transformation to a “cloud-first” (as a design principle to) business as described by Satya Nadella’s (*) Leading the Enterprise Cloud Era [The Official Microsoft Blog, June 3, 2013] post
            OR Faster development, global scale, unmatched economics… Windows Azure delivers [Windows Azure MSDN blog, June 3, 2012] which is best summarized by Scott Guthrie (*) as the following enhancements to Windows Azure
            OR as described by Brian Harry (*) in Visual Studio 2013 [Brian Harry’s MSDN blog, June 3, 2013]
            OR as described by Brad Anderson (*) in TechEd 2013: After Today, Cloud Computing is No Longer a Spectator Sport [TechNet Blogs, June 3, 2013]
            OR as described by Quentin Clark (*) in SQL Server 2014: Unlocking Real-Time Insights [TechNet Blogs, June 3, 2013]
            OR as described by Antoine Leblond (*) in Continuing the Windows 8 vision with Windows 8.1 [Blogging Windows, May 30, 2013], and continued by Modern Business in Mind: Windows 8.1 at TechEd 2013 [June 3, 2013] from Erwin Visser (*) describing some of the features that businesses can look forward to in Windows 8.1
            OR putting all this together: Microsoft unveils what’s next for enterprise IT [press release, June 3, 2013]

            First watch how this whole story was presented in the keynote to TechEd North America 2013 on June 3, 2013:

            Brad Anderson was the keynote speaker, so besides the overall topic and his own two particular topics he also took care of all the introductions and recaps for the detailed parts delivered by other executives from the Microsoft Server & Tools Business. His keynote starts at [3:18]. He first invites Iain McDonald to deliver the Windows 8.1 Enterprise presentation, starting at [6:36] in the video. After that, at [28:57], Brad talks about “Empower people-centric IT” based on “Personalized experience”, “Any device, anywhere” and “Secure and protected”, leading to System Center Configuration Manager 2012 R2 + Windows Intune, which are demonstrated from [38:00] to [46:50] by Molly Brown, Principal Development Lead for those products: a consistent experience across PCs, iOS devices and Android devices, supporting the BYOD trend for client device managers. Then he starts talking about “Enable modern business applications” based on “Time to market”, “Revolutionary technology” and “Organizational readiness”, focusing on “Rapid lifecycle”, “Multi-device”, “Any data, any size” and “Secure and available”. Then comes (at [50:00]) a current state-of-the-art overview of the Windows Azure business, and a customer testimonial at [52:10] from the budget airline easyJet about moving to allocated seating, for which they indeed required the peak-load support capability of Windows Azure to meet the sudden rush of customer reservations for things like putting everything on sale at slashed prices. At [56:45] he invites Scott Guthrie to talk about the Windows Azure application platform, leading to announcements like “Windows Azure per-minute pricing” and the “Windows Azure MSDN offer”. Brian Harry replaces Guthrie on the stage (at [1:05:02]) to continue the same topic with the upcoming Visual Studio 2013, which offers a number of new additions for team development.
Brad is back at [1:14:40] to talk about “Unlock insight from any data” based on “Data explosion”, “New types and sources of data” and “Increasing user expectations”, focusing on “Easy access to data”, “Powerful analytics for all” and “Complete data platform”. To shed light on the specifics he invites Quentin Clark at [1:16:44], who talks about the upcoming SQL Server 2014 and is joined by his marketing partner Eron Kelly to demonstrate the new things coming with that product. At [1:36:15] Brad Anderson is back to talk about how to “Transform the datacenter” based on “Cloud options on demand”, “Reduced cost and complexity” and “Rapid response to the business”. First he talks about the cloud platform itself (as an infrastructure), drawing on a customer testimonial from Trek Corporation at [1:39:24]. Then at [1:41:24] he announces Windows Server 2012 R2 and System Center 2012 R2, followed by the Windows Azure Pack announcement encompassing a number of things which are demonstrated at [1:44:44] by Clare Henry, Director of Product Marketing. At [1:49:25] Brad is back to talk about the fabric of this infrastructure, for which he also invites, at [1:51:30], Jeff Woolsey, Principal Program Manager, to look into storage, live migration, Hyper-V Replica, etc. From [2:01:27] Brad delivers the final recap.

            The final recap by Brad Anderson well represented the story shown in the keynote:

            1. [2:03:20] Microsoft’s cloud vision is the Cloud OS in which they have 4 promises:
              image
              which was fully covered in the keynote (actually in that order) and
            2. [2:03:50] with the new announcements demonstrating execution on those promises:
              image

            Then here is the alternative/partial information which became also available:

            OR Choosing the Cloud Roadmap That’s Right for Your Business [MSCloudOS YouTube channel, June 3, 2013]

            Introductory information: Built From the Cloud Up [MSFTws2012 YouTube channel, Nov 20, 2012]

            Experience Microsoft’s vision for the Cloud OS with Satya Nadella (*) and see how it is made real today with Windows Server 2012 and Windows Azure. Learn more at http://microsoft.com/ws2012

            OR Microsoft transformation to a “cloud-first” (as a design principle to) business as described by Satya Nadella’s (*) Leading the Enterprise Cloud Era [The Official Microsoft Blog, June 3, 2013] post:

            Two years ago we bet our future on the cloud and quietly refocused our 19 billion-dollar software business by completely transforming our products, culture and practices to be cloud-first. We knew the journey would be long and challenging with plenty of doubters. But we forged ahead knowing that the cloud transition would change the face of enterprise computing. […]

            To enable this transformation we had to make deep changes to our organizational culture, overhauling how we build and deliver products. Every one of our division’s nearly 10,000 people now think and build for the cloud – first. […]

            We are already seeing this bet deliver substantial returns. Windows Azure is going through hyper-growth. Half the Fortune 500 companies are using Windows Azure. We have over 1,000 new customers signing up every day and over 30,000 organizations have started using our IaaS offering since it became available in April. We are the first multinational company to bring public cloud services to China. Ultimately we support enormous scale, powering some of the largest SaaS offerings on the planet.

            (*) Satya Nadella is President, Server & Tools Business, a US$19 billion division that builds and runs the company’s computing platforms, developer tools and cloud services. The whole above mentioned post contains the email he sent to employees about the progress they’ve made completely transforming Microsoft products, culture and practices to be cloud-first.

            Introductory information: Enable Modern Apps [MSFTws2012 YouTube channel, Nov 20, 2012]

            Scott Guthrie (*) demonstrates how Windows Server 2012 and Windows Azure provide the world’s best platform for modern apps. Learn more at http://microsoft.com/ws2012

            OR Faster development, global scale, unmatched economics… Windows Azure delivers [Windows Azure MSDN blog, June 3, 2013] which is best summarized by Scott Guthrie (*) as the following enhancements to Windows Azure:

            Windows Azure: Announcing New Dev/Test Offering, BizTalk Services, SSL Support with Web Sites, AD Improvements, Per Minute Billing [ScottGu’s blog, June 3, 2013]

            • Dev/Test in the Cloud: MSDN Use Rights, Unbeatable MSDN Discount Rates, MSDN Monetary Credits
            • BizTalk Services: Great new service for Windows Azure that enables EDI and EAI integration in the cloud
            • Per-Minute Billing and No Charge for Stopped VMs: Now only get charged for the exact minutes of compute you use, no compute charges for stopped VMs
            • SSL Support with Web Sites: Support for both IP Address and SNI based SSL bindings on custom web-site domains
            • Active Directory: Updated directory sync utility, ability to manage Office 365 directory tenants from Windows Azure Management Portal; regarding this read also: Making it simple to connect Windows Server AD to Windows Azure AD with password hash sync [Active Directory Team Blog, June 3, 2013]
            • Free Trial: More flexible Free Trial offer
            (*) Scott Guthrie, Corporate Vice President (CVP) of Program Management leading the Windows Azure Application Platform Team in the Server & Tools Business
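            The per-minute billing item above lends itself to a quick illustration. A minimal sketch in Python, with a deliberately made-up rate (actual Windows Azure prices varied by VM size and region):

```python
# Sketch of the per-minute billing model announced above: compute charges
# accrue only for the minutes a VM is actually running, and stopped
# (deallocated) VMs accrue no compute charge at all.
# The rate below is a hypothetical illustration, not a real Azure price.

def compute_charge(running_minutes, stopped_minutes, rate_per_minute=0.002):
    """Bill only the running minutes; stopped time is free."""
    del stopped_minutes  # irrelevant under per-minute billing
    return running_minutes * rate_per_minute

# A VM that ran for 90 minutes and then sat stopped for 5 hours:
print(round(compute_charge(90, 300), 4))  # prints 0.18
```

            Under hourly billing the same VM would have been rounded up to whole hours; per-minute metering is what makes short-lived dev/test VMs cheap.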

            OR as described by Brian Harry (*) in Visual Studio 2013 [Brian Harry’s MSDN blog, June 3, 2013]

            Today at TechEd, I announced Visual Studio 2013 and Team Foundation Server 2013 and many of the Application Lifecycle Management features that they include. … I will not, in this post, be talking about many of the new VS 2013 features that are unrelated to the Application Lifecycle workflows. Stay tuned for more about the rest of the VS 2013 capabilities at the Build conference. […]

            We are continuing to build on the Agile project management features (backlog and sprint management) we introduced in TFS 2012 and the Kanban support we added in the TFS 2012 Updates. With TFS 2013, we are tackling the problem of how to enable larger organizations to manage their projects with teams using a variety of different approaches. … The first problem we are tackling is work breakdown. … We are also enabling multiple Scrum teams to each manage their own backlog of user stories/tasks that then contributes to the same higher-level backlog. […]

            We’ve been hard at work improving our version control solution. … We’ve added a “Connect” page to Team Explorer that makes it easier than ever to manage the different Team Projects/repos you connect to – local, enterprise or cloud. …We’ve also built a new Team Explorer home page. …The #1 TFS request on User Voice. … So, we have introduced “Pop-out Team Explorer pages”. …  Another new feature that I announced today is “lightweight code commenting”. […]

            As always, we’ve also done a bunch of stuff to help people slogging code every day. The biggest thing is a new “heads up display” feature in Visual Studio that provides you key insights into your code as you are working. We’ve got a bunch of “indicators” now and we’ll be adding more over time. It’s a novel way for you to learn more about your code as you read/edit. … Another big new capability is memory diagnostics – particularly with a focus on enabling you to find memory leaks in production. […]

            In addition to the next round of improvements to our web based test case management solution, today I announced a preview of a brand new service – cloud load testing. […]

            At TechEd today, perhaps my biggest announcement was our agreement to acquire the InRelease release management product from InCycle Software. I’m incredibly excited about adding this to our overall lifecycle solution. It fills an important gap that can really slow down teams. InRelease is a great solution that’s been natively built to work well with TFS. […]

            With TFS 2013 we are trying a new tack to facilitate that called “Team Rooms”. A Team Room is a durable collaboration space that records everything happening in your team. You can configure notifications – checkins, builds, code reviews, etc to go into the Team Room and it becomes a living record of the activity in the project. You can also have conversations with the rest of your team in the room. It’s always “on” and “permanently” recorded, allowing people to catch up on what’s happened while they were out, go back and find previous conversations, etc. […]

            (*) Brian Harry, Microsoft Technical Fellow working as the Product Unit Manager for Team Foundation Server (TFS).

            Introductory information: Empower People Centric IT [MSFTws2012 YouTube channel, Nov 20, 2012]

            Brad Anderson (*) shows how Windows Server 2012 helps enable personalized experiences across devices. Learn more at http://microsoft.com/ws2012

            OR as described by Brad Anderson (*) in TechEd 2013: After Today, Cloud Computing is No Longer a Spectator Sport [TechNet Blogs, June 3, 2013]

            We are now delivering on our vision with a wave of enterprise products built with this cloud-first approach: Windows Server & System Center 2012 R2 and the update to Windows Intune bring cloud-inspired innovation to the enterprise, and enable hybrid scenarios that cannot be duplicated anywhere in the industry.

            With this new wave, our partners and customers can do four key things:

            • Build a world-class datacenter without barriers, boundaries, or limitations.
            • Use a Cloud OS to innovate faster and better than ever before.
            • Embrace and control the countless ways users circumvent IT, but still enable productivity.
            • Get serious about the cloud with a partner who takes the cloud seriously.

            These developments shatter the obstacles which once stood in the way of turning traditional datacenters into modern datacenters, and which inhibited the natural progression to hybrid clouds. These hybrid scenarios are especially exciting – and Microsoft’s comprehensive support for them sets us apart from each and every other competitor in the tech industry.

            (*) Brad Anderson, Corporate Vice President (CVP) of Program Management leading the Windows Server and System Center Group (WSSC) in the Server & Tools Business. The rest of his above post will shed more light on the Microsoft achievements delivered in his sphere of activity. See also his In the Cloud blog for more details.

            Follow-up information: Transform the Datacenter [MSFTws2012 YouTube channel, Nov 20, 2012]

            Bill Laing (*) shows how Windows Server 2012 helps increase agility and efficiency in the datacenter. Learn more at http://microsoft.com/ws2012
            (*) Bill Laing, Corporate Vice President (CVP) for Server and Cloud [Development]. Read also his Announcing New Windows Azure Services to Deliver “Hybrid Cloud” [Windows Azure blog, June 6, 2012] post.

            Introductory information: Webcast: From Data to Insights [sqlserver YouTube channel, April 2, 2013]

            To better understand the impact of big data on the future of global business, Microsoft hosted an exclusive webcast briefing, “From data to insights”, produced in association with the Economist. In the webcast, you’ll hear from Tom Standage, digital editor of the Economist, on the social and economic benefits of mining data, followed by a moderated discussion featuring two Microsoft data experts, VP/Technical Fellow for Microsoft SQL Server Product Suite, Dave Campbell, and Technical Fellow, Server and Tools, Raghu Ramakrishnan, for an insider’s view of the trends and technologies driving the business of big data, as well as Microsoft’s big data strategy. To learn more about Microsoft big data solutions, visit http://www.microsoft.com/bigdata

            OR as described by Quentin Clark (*) in SQL Server 2014: Unlocking Real-Time Insights [TechNet Blogs, June 3, 2013]

            The next version of our data platform – SQL Server 2014 – is a key part of the day’s news. Designed and developed with our cloud-first principles in mind, it delivers built-in in-memory capabilities, new hybrid cloud scenarios and enables even faster data insights. […]

            Today, we’re delivering Hekaton’s in-memory OLTP in the box with SQL Server 2014. For our customers, “in the box” means they don’t need to buy specialized hardware or software and can migrate existing applications to benefit from performance gains. … SQL Server 2014 is helping businesses manage their data in nearly real-time. The ability to interact with your data and the system supporting business activities is truly transformative. […]

            Insert: Edgenet Gain Real-Time Access to Retail Product Data with In-Memory Technology [MSCloudOS YouTube channel, June 3, 2013]

            To ensure that its customers received timely, accurate product data, Edgenet decided to enhance its online selling guide with In-Memory OLTP in Microsoft SQL Server 2014.

            End of Insert
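            The appeal of in-memory OLTP described above can be illustrated outside SQL Server too. The following Python sketch uses the stdlib sqlite3 module purely as an analogy (it is not Hekaton): the same commit-per-row workload runs against a disk-backed database and an in-memory one, with the disk version paying journaling I/O on every commit.

```python
# Analogy for in-memory OLTP (not SQL Server's Hekaton itself): the same
# commit-per-row workload against a disk-backed sqlite3 database and an
# in-memory one. The disk database pays journaling/flush cost on every
# commit; the in-memory database does not.
import os
import sqlite3
import tempfile
import time

def run_oltp_workload(conn, rows=500):
    """Insert `rows` rows, committing each one; return (seconds, row count)."""
    conn.execute("CREATE TABLE orders (id INTEGER PRIMARY KEY, qty INTEGER)")
    start = time.perf_counter()
    for i in range(rows):
        conn.execute("INSERT INTO orders VALUES (?, ?)", (i, i % 10))
        conn.commit()  # one transaction per row, as in an OLTP workload
    elapsed = time.perf_counter() - start
    count = conn.execute("SELECT COUNT(*) FROM orders").fetchone()[0]
    return elapsed, count

disk_db = os.path.join(tempfile.mkdtemp(), "oltp.db")
disk_time, disk_rows = run_oltp_workload(sqlite3.connect(disk_db))
mem_time, mem_rows = run_oltp_workload(sqlite3.connect(":memory:"))
print(f"disk: {disk_time:.3f}s, memory: {mem_time:.3f}s ({disk_rows} rows each)")
```

            On a typical machine the in-memory run finishes far faster, which is the intuition behind keeping hot OLTP tables in memory; Hekaton adds lock-free data structures and natively compiled stored procedures on top of that, while keeping the data durable.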

            Delivering mission critical capabilities through new hybrid scenarios

            SQL Server 2014 includes comprehensive, high-availability technologies that now extend seamlessly into Windows Azure to make the highest level of service level agreements possible for every application while also reducing CAPEX and OPEX for mission-critical applications. Simplified cloud backup, cloud disaster recovery and easy migration to Windows Azure Virtual Machines are empowering new, easy to use, out of the box hybrid capabilities.

            We’ve also improved the AlwaysOn features of the RDBMS with support for new scenarios, scale of deployment and ease of adoption. We continue to make major investments in our in-memory columnstore for performance and now compression, and this is deeply married to our business intelligence servers and Excel tools for faster business insights.

            Unlocking real-time insights

            Our big data strategy to unlock real-time insights continues with SQL Server 2014. We are embracing the role of data – it dramatically changes how business happens. Real-time data integration, new and large data sets, data signals from outside LOB systems, evolving analytics techniques and more fluid visualization and collaboration experiences are significant components of that change. Another foundational component of this is embracing cloud computing: nearly infinite scale, dramatically lowered cost for compute and storage and data exchange between businesses. Data changes everything and across the data platform, we continue to democratize technology to bring new business value to our customers.

            (*) Quentin Clark, Corporate Vice President of Program Management leading the Data Platform Group. The rest of his above post emphasizes the great progress of the Microsoft SQL Server for which he also includes the below diagram:
            image

            Introductory information: Selling Windows 8 | Windows 8 business apps as big bet [msPartner YouTube channel, March 1, 2013]

            We recently sat down to talk Windows 8 with partners Scott Gosling from Data#3, Danny Burlage from Wortell and Carl Mazzanti from eMazzanti Technologies. In a conversation led by Erwin Visser (*), Windows Commercial, and our own Jon Roskill and Kat Tillman we discussed the business potential of Windows 8 and why apps are key. In this segment, learn why Windows 8 business apps are a big bet.

            TechEd North America 2013 – Windows 8.1 Enterprise Build 9415 [lyraull [Microsoft Spain] YouTube channel, June 4, 2013]

            During the keynote address, Iain McDonald, partner director of program management for Windows, [starting at [6:36]] detailed key business features in the recently announced Windows 8.1 update — including advances in security, management, mobility and networking — to offer the best business tablets with the most powerful operating system for today’s modern business needs.

            OR as described by Antoine Leblond (*) in Continuing the Windows 8 vision with Windows 8.1 [Blogging Windows, May 30, 2013]

            Windows 8.1 will advance the bold vision set forward with Windows 8 to deliver the next generation of PCs, tablets, and a range of industry devices, and the experiences customers — both consumers and businesses alike — need and will just expect moving forward. It’s Windows 8 even better. Not only will Windows 8.1 respond to customer feedback, but it will add new features and functionality that advance the touch experience and mobile computing’s potential.

            Windows 8.1 will deliver improvements and enhancements in key areas like personalization, search, the built-in apps, Windows Store experience, and cloud connectivity. Windows 8.1 will also include big bets for business in areas such as management and security; we’ll have more to say on these next week at TechEd North America. Today, I am happy to share a “first look” at Windows 8.1 and outline some of the improvements, enhancements and changes customers will see. […]

            (*) Antoine Leblond, Corporate Vice President (CVP) of Windows Program Management. His above post from last Thursday was continued by Modern Business in Mind: Windows 8.1 at TechEd 2013 [June 3, 2013] from Erwin Visser (*) describing some of the features that businesses can look forward to in Windows 8.1 such as

            Networking features optimized for mobile productivity. Windows 8.1 improves mobile productivity for today’s workforce with new networking capabilities that take advantage of NFC-tagged and Wi-Fi [Miracast etc.] connected devices […]

            Security enhancements for device proliferation and mobility. Security continues to be a top priority for companies across the world, so we’re making sure we continue to invest resources to help you protect your corporate data, applications and device […]

            Improved management solutions to make BYOD a reality. As BYOD scenarios continue to grow in popularity among businesses, Windows 8.1 will make managing mobile devices even easier for IT Pros […]

            More control over business devices. Businesses can more effectively deliver an intended experience to their end users – whether that be employees or customers. … Windows Embedded 8.1 Industry: our offering for Industry devices like POS Systems, ATMs, and Digital Signage that provides a broader set of device lockdown capabilities. […]

            On June 26, at the Build developer conference in San Francisco, Microsoft will release a public preview of Windows 8.1 for Windows 8, Windows RT and Windows Embedded 8.1 Industry. Upgrading to Windows 8.1 is simple, as the update does not introduce any new hardware requirements and all existing Windows Store apps are compatible. […]

            (*) Erwin Visser, Senior Director of Windows Commercial Business Group

            OR putting all this together: Microsoft unveils what’s next for enterprise IT [press release, June 3, 2013]

            New wave of 2013 products brings it all together for hybrid cloud, mobile employees and modern application development.

            NEW ORLEANS — June 3, 2013 — At TechEd North America 2013, Microsoft Corp. introduced a portfolio of new solutions to help businesses thrive in the era of cloud computing and connected devices. In today’s keynote address, Server & Tools Corporate Vice President Brad Anderson and fellow executives showcased how new offerings across client, datacenter infrastructure, public cloud and application development help deliver the most comprehensive, connected enterprise platform.

            “The products and services introduced today illustrate how Microsoft is the company that businesses can bet on as they embrace cloud computing, deliver critical applications, and empower employee productivity in new and exciting ways,” Anderson said. “Only Microsoft connects the dots for the enterprise from ‘client to cloud.’”

            Today’s keynote featured several customers, including luxury car manufacturer Aston Martin. The company is an example of the many enterprises that use the full range of Microsoft products and cloud platforms for IT success.

            Driving Strategy and Innovation with the Power of the Microsoft Cloud OS Vision [MSCloudOS YouTube channel, June 3, 2013]

            Behind every luxury sports car produced by Aston Martin is a sophisticated IT Infrastructure. The goal of the Aston Martin IT team is to optimize that infrastructure so it performs as efficiently as the production line it supports. This video describes how Aston Martin has used cloud and hybrid-based solutions to deliver innovation and strategy to the business.

            “Our staff’s sole purpose is to provide advanced technology that enables Aston Martin to build the most beautiful, iconic sports cars in the world,” said Daniel Roach-Rooke, IT infrastructure manager, Aston Martin. “From corporate desktops and software development to private and public cloud, Microsoft is our IT vendor of choice.”

            Fueling hybrid cloud

            At TechEd, Microsoft introduced upcoming releases of its key enterprise IT solutions for hybrid cloud: Windows Server 2012 R2, System Center 2012 R2 and SQL Server 2014. Available in preview later this month, the products break down boundaries between customer datacenters, service provider datacenters and Windows Azure. Using them, enterprises can make IT services and applications available across clouds and scale them up or down according to business needs. Windows Server 2012 R2 and System Center 2012 R2 are slated to release by the end of calendar year 2013, with SQL Server 2014 slated for release shortly thereafter.

            With advances in virtualization, software-defined networking, data storage and recovery, in-memory transaction processing, and more, these solutions were engineered with Microsoft’s “cloud-first” focus, including a faster pace of development and release to market. They incorporate Microsoft’s experience running large-scale cloud services, connect to Windows Azure and work together to provide a consistent platform for powerful hybrid cloud scenarios. More information can be found at blog posts by Anderson about Windows Server and System Center and by Quentin Clark about SQL Server.

            Further showcasing Microsoft’s hybrid cloud advantage, today the company also announced the public preview of Windows Azure BizTalk Services for enterprise integration solutions, both on-premises and in the cloud. In addition, Windows Azure now offers industry-leading, per-minute billing for virtual machines, Web roles and worker roles that improves cloud economics for customers. More information is available at the Windows Azure blog.

            Windows 8.1: Empowering modern business

            During the keynote address, Iain McDonald, partner director of program management for Windows, detailed key business features in the recently announced Windows 8.1 update — including advances in security, management, mobility and networking — to offer the best business tablets with the most powerful operating system for today’s modern business needs.

            New networking features in Windows 8.1 aim to improve mobile productivity for today’s workforce, with system-on-a-chip (SoC)-integrated mobile broadband, native Miracast wireless display and near field communication (NFC)-based pairing with enterprise printers. Security is also enhanced in the new update to address device proliferation and to protect corporate data and applications with fingerprint-based biometrics, multifactor authentication on tablets and remote business data removal to securely wipe company data from a device. And improved management capabilities in Windows 8.1 give customers more flexibility with supported options such as System Center Configuration Manager 2012 R2 and new mobile device management (MDM) solutions with third-party MDM partners, in addition to updated Windows Intune support.

            On June 26, at the Build 2013 developer conference in San Francisco, Microsoft will release a public preview of the Windows 8.1 update for Windows 8 and Windows RT customers. More information on new features found in Windows 8.1 for businesses, including updated Windows deployment guidance for businesses, is available on the Windows for your Business blog.

            Fostering modern application development

            Microsoft today also introduced Visual Studio 2013 and demonstrated new capabilities for improving the application lifecycle, both on-premises and in the cloud. A preview of Visual Studio 2013, with its new enhancements for agile portfolio planning, developer productivity, team collaboration, quality enablement and DevOps, is slated for release in the coming weeks, timed with the Build conference.

            Furthermore, Microsoft today announced an agreement to acquire InCycle Software Inc.’s InRelease Business Unit. InRelease is a leading release management solution for Microsoft .NET and Windows Server applications. This acquisition will extend Microsoft’s offerings in the application lifecycle management and DevOps market. More information is available on S. Somasegar’s blog.

            In addition, the company today announced new benefits that enable Microsoft Developer Network (MSDN) subscribers to more easily develop and test more applications with Windows Azure. New enhancements include up to $150 worth of Windows Azure platform services per month at no additional cost for Visual Studio Professional, Premium or Ultimate MSDN subscribers and new use rights to run select MSDN software in the cloud.

            Founded in 1975, Microsoft (Nasdaq “MSFT”) is the worldwide leader in software, services and solutions that help people and businesses realize their full potential.

            Read More: SQL Server, Brad Anderson, Enterprise, IT Professionals, BUILD, Cloud Computing, Windows Server 2012, S. Somasegar, Windows Azure, Windows Intune, Visual Studio, .NET, TechEd North America 2013, TechEd 2013

            Deep technical evangelism and development team inside the DPE (Developer and Platform Evangelism) unit of Microsoft

            It is a fantastic gig – we’re working with developers, designers, and IT pros from across the industry – from the consumer to enterprise to startups to hobbyists – helping them create amazing next generation apps, build the frameworks that make all this easier, and share our experiences with the community.

            [John Shewchuk, Technical Fellow at Microsoft, Chief Technology Officer for the Microsoft Developer Platform]

            Source: My New Gig [JohnShew‘s MSDN Blog, May 12, 2013] from which the following excerpts will add more information to the above mission statement:

            To do this work I have an incredible team with people like Eric Schmidt, who leads our consumer applications efforts and has done ground-breaking work on projects like [NBC’s] Sunday Night Football (which is up for a Sports Emmy for Outstanding Live Sports Series).

            [In fact, the Sports Emmy was awarded on May 7, already for the fifth time; the last four awards were won with the program using technology that started with Silverlight 3.0 and IIS Smooth Streaming in 2009 for Sunday Night Football live streaming with a highly advanced and customized viewing experience. This led to a continuously evolving and expanding cooperation which culminated on April 9, 2013 in the announcement that Microsoft Corp. and NBC Sports Group are partnering to use Windows Azure Media Services across NBC Sports’ digital platforms, including NBCSports.com, NBCOlympics.com and GolfChannel.com. The new alliance aims to deliver live and on-demand programming of more than 5,000 hours of sporting events plus the Sochi 2014 Olympic Games for NBC Sports’ digital platforms. More details on that follow later on.]

            Patrick Chanezon just joined us from VMware where he was driving their cloud and tools developer relations – he has a ton of expertise in the open source space which will be increasingly important given our new Azure IaaS support for Linux.

            … we also get to play with all the newest and coolest technologies we’re delivering to developers these days – everything from Windows to Xbox to Windows Phone – and we connect it to the latest cloud services from Azure, Office, and Bing.

            James Whittaker [now as Partner Technical Evangelist at Microsoft] – a known industry disruptor and incredible speaker joins us from Bing where he has been leading the development team making Bing knowledge available programmatically – many people may know him from his viral blog post on why he left Google for Microsoft.

            As far as John Shewchuk himself is concerned he is describing his latest achievement in the same post as:

            As many of you know, for the last few years I’ve been plugging away deep in the plumbing of enterprise identity and Reimagining Active Directory for the cloud.  It’s been a great experience and I couldn’t be more proud of all the cool stuff that has gone on across the industry to enable the world of claims-based identity and identity as a service.  Over the years I’ve gotten to know many identity leaders including Kim Cameron, Craig Burton, and Andre Durand and have worked with many other great people at companies like Shell, Sun, IBM, Google, and Facebook.
            Building on all this collaboration, just a few weeks ago here at Microsoft we reached a major milestone with the official release of Windows Azure Active Directory (AAD). Today all of Microsoft’s major organizational cloud services build on AAD – this includes Azure, Office 365, and Dynamics. AAD supports almost 3 million organizations through 14 global data centers with 99.97% availability.  This level of scale and availability is unprecedented for a turnkey identity management service – it’s a huge accomplishment.  Although I love the SaaS and scale aspects of AAD, I’ve spent my career working with developers – so I’m stoked that we have made all this available to developers through new technologies like the AAD Graph API.
            It is always sad to move on from a great project, but with the release of AAD it is an ideal time to transition and start a new role.  So I’m happy to announce that I’m headed to Microsoft’s Developer & Platform Evangelism (DPE) team, working for Steve Guggenheimer.  My role is to lead the team doing the deep technical evangelism and development here in DPE.
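The AAD Graph API Shewchuk mentions exposes the directory over plain REST. The following is a minimal sketch of such a query, assuming a hypothetical tenant name, an already-acquired OAuth 2.0 bearer token, and the early `api-version` value; it is an illustration, not the official SDK:

```python
# Sketch of a directory query against the AAD Graph API (graph.windows.net).
# Tenant name and token acquisition are assumptions for illustration.
import json
import urllib.request

GRAPH_BASE = "https://graph.windows.net"

def build_users_url(tenant: str, api_version: str = "2013-04-05") -> str:
    """Compose the REST URL for listing a tenant's users."""
    return f"{GRAPH_BASE}/{tenant}/users?api-version={api_version}"

def list_users(tenant: str, access_token: str):
    """Issue the GET request with an OAuth 2.0 bearer token."""
    req = urllib.request.Request(
        build_users_url(tenant),
        headers={"Authorization": f"Bearer {access_token}",
                 "Accept": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        # Graph responses wrap the result set in a "value" array
        return json.load(resp)["value"]

print(build_users_url("contoso.onmicrosoft.com"))
```

The same URL-plus-bearer-token pattern works for the other directory entities (groups, roles, tenant details), which is what makes the directory consumable by any language, not just .NET.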

If one adds to that all of John Shewchuk’s contributions from his Experience profile on LinkedIn:

            Technical Fellow
            Microsoft
March 2008 – Present (5 years 2 months)
            Current responsibilities include delivering Windows Azure Identity, Access, and Directory Services and defining platform strategy for Microsoft’s Business Productivity Online Services (BPOS).
            Recent deliverables include Windows Azure Access Control and Application Messaging / Service Bus Services, SQL Azure, and Active Directory.
            Member of the Server and Tools Business (STB) Technical Leadership Team. Key participant in the definition of overall technical and business strategy for several divisions across STB.
            Distinguished Engineer
            Microsoft
2005 – 2008 (3 years)
            Delivered Windows Communication Foundation (WCF).
            Responsible for Active Directory technical strategy. Worked to unify Active Directory product suite. Released Active Directory Federation Services (ADFS).
            Software Engineer
            Microsoft
1996 – 2005 (9 years)
            Member of architecture team that drove the first and subsequent releases of .NET.
            Drove transformation of Visual Studio to enable web development.
            Authored and drove technical strategy for Web standards. Responsible for key cross-industry collaborations with IBM, Sun, and many others. Key participant in defining strategy for enterprise development
            Group Program Manager
            Microsoft
1993 – 1996 (3 years)
            Drove the first release of Visual Studio.
            Delivered web development tools including Visual InterDev. Later these became the basis for Visual Studio web tools and web execution platform.
            Delivered advanced browser features including 2D layout and progressive rendering. Broad range of patents covering many core web technologies.
            Vice President and Founder
            Daily Planet Software
1990 – 1993 (3 years)
            Microsoft acquired Daily Planet Software in Q4CY93 [and morphed it into “Blackbird,” the online-content authoring system for MSN].

so only after adding up all those contributions, not only to Microsoft but to software engineering in general, can one really understand what a larger-than-life figure John Shewchuk is. Also note that Microsoft’s DPE unit never had such an outstanding contributor on its staff, not even in the units organisationally preceding it (the DRG (Developer Relations Group) formed in 1984 and the ADCU (Application Developer Customer Unit) introduced in 1997, which evolved into DPE in October 2011). It is also the first time that Microsoft DPE has had a developer-related CTO organization properly staffed with excellent contributors. Unofficially, this team, central to DPE, is said to number over 100 people and growing. At the moment we know only the leadership figures of the CTO organization:
• James Whittaker for the partner activities (as indicated by his new LinkedIn title given above)
• Patrick Chanezon, “initially focused on the enterprise market” (as described by Chanezon in the details below)
• Eric Schmidt, leading the consumer applications efforts (as explicitly stated by Shewchuk above)

So at this point we can understand this extremely important, we might even say strategic, addition to the DPE unit only via the professional stance of its leadership figures, including Shewchuk himself, the leader of the team. This is why, instead of the usual details sections, I am providing the following one:

More light on the leaders of the new deep technical evangelism and development team:

            James Whittaker’s Quality Software Crusade from Academia to Microsoft, then Google and now back to Microsoft [this same ‘Experiencing the Cloud’ blog, March 14 – April 12, 2012]
            James Whittaker‏ @docjamesw 8:19 AM – 8 Apr 13

            I gave a blunt, incendiary talk at MS. My punishment: they made it my day job. Watch out world, Microsoft just gave me a speaking role.

            James Whittaker‏ @docjamesw 3:54 PM – 8 May 13

            I finally “met” the famous @maryjofoley …nice talking to you today.

after which Mary Jo Foley published the following in her Microsoft builds a deep-tech team to attract next-gen developers [ZDNet, May 13, 2013]:

            Whittaker’s most recent gig at Microsoft was development manager for the Microsoft knowledge platform as part of the Bing team. 

“When Microsoft talks about devices and services, that’s a two-legged stool,” said Whittaker. “The third leg is knowledge. We’re embedding knowledge into everything from Xbox, to Office, to third-party products.”

            Whittaker said “dev platform” is no longer simply the operating system and related application programming interfaces (APIs). It’s the whole ecosystem, he said, including information that Bing extracts from the Web, like catalogs, weather, and maps. The goal is to make this available inside applications built by both Microsoft and third-party developers. 

            “Actions can be performed on these entities. We have hundreds of millions of things we can provide that go beyond the blue links (in search engines),” Whittaker said.

            A New Era of Computing [Channel 9 video of the ALM Summit 3 plenary session by James Whittaker, Jan 30, 2013], click on the image to watch (highly recommended)

            History will look back and identify September 2012 as the dawn of a new computing paradigm and the official end of the “Search-and-Browse” era [of the 2000s] that Google dominated. James Whittaker talks about this momentous event, shares some history about prior eras, and looks ahead to what this new era brings.

            image

            Explanation from the video:

[19:58] September 2012 is “when total search volume went down for the first time. We don’t need to search anymore. It turns out that if you search long enough you find a bunch of stuff, and you don’t have to search for it anymore.”

            [21:00] “Apps are ingesting the web too. Apps are better at searching than browsers and search engines.”

[22:08] “Apps are fundamentally a better way to search because they’re only looking at the part of the web you’ve been interested in. How do we know what you are interested in? Because you are using the app.

            So our habits are changing and this era has ended.”

Past the middle of the talk [38:26 – 40:00] he emphasizes the 3 “Experiences” among Google’s current Top 10 revenue earners, rather than “Apps”, in the era “when the web goes away”, leading to “Data is currency” for the new era:

            image

At the very end of his presentation (from [46:09] to [52:20]), as a forward-looking “Know & Do” experience, he describes, with a kind of screenshot demonstration, the “I need a vacation” experience, which should naturally start in one’s calendar and end there as well.

            Hello Microsoft! [Patrick Chanezon’s blog, May 13, 2013]

On April 29th, 2013, I joined Microsoft’s legendary Developer and Platform Evangelism team, where I will initially focus on the Enterprise market. I will report to Technical Fellow John Shewchuk, joining his new team of top-notch technical evangelists, like Xoogler James Whittaker and Microsoft veteran Eric Schmidt. Mary Jo Foley wrote a nice piece about our team on ZDNet today. I will be based in the Microsoft San Francisco office.

            How did it happen?
            I spent most of my career competing with Microsoft, at Netscape, Sun, Google and VMware. Competition builds respect, competitors force you to question your assumptions and to constantly evolve. For many of my friends, this move came as a total shock. What made me open to the idea of joining Microsoft is a presentation from Scott Guthrie about Windows Azure at NodeConf 2012 last summer. He presented from a Mac laptop, launched Google Chrome, went to the Cloud9 IDE, edited a Node app pulled from Github, and pushed it to Azure from the cloud IDE: to me this indicated a real change of mentality at Microsoft, and a new openness. Clearly they had listened to what developers ask from a cloud platform. Later on, when my friend Srikanth Satyanarayana pinged me to start conversations with Microsoft, I was open to it. I met with Satya Nadella, and realized that our visions for where the cloud was going were very aligned. Further conversations with Scott Guthrie about Azure, John Shewchuk and Steve Guggenheimer about developer evangelism convinced me this was an adventure I had to take!
            Why Microsoft?
            Joining Microsoft boils down to 4 reasons: People, Learning, Technology, Impact.
People: in my late 30′s I realized that the people you work with, for and around are as important as what you’re working on. Microsoft has many people I have admired from the outside, like Dare Obasanjo, Erik Meijer, Scott Guthrie, Jon Udell, Scott Hanselman, Jeff Sandquist, Andrew Shuman or Anders Hejlsberg. The team I join has a fantastic roster of A-players with whom I’ll have fun and from whom I will learn.

            image

            Learning: I’m a learner at heart. I am curious, I read a lot, and I like to learn from people I work with. I also love to share what I learned with others. My kids loved this book called My Friends, by Taro Gomi, which goes like this: “I learned to walk from my friend the cat, I learned to jump from my friend the dog…”.
In my career it worked the same way: I learned algorithms from my teacher Christian Vial, I learned internet protocols from my friend Nicolas Pioch, I learned open source from my friend Alejandro Abdelnur, I learned social media from my friend Loic Le Meur, I learned developer relations from my friend Vic Gundotra, I learned platform strategy and storytelling from my friend Charles Fitzgerald… I love doing developer relations, and my two mentors in this area over the past 8 years, Vic and Charles, both came from the Microsoft DPE team. I’m coming to the source for more learning. This team is more than 1,000 people worldwide, and over the past 10 years they defined what tech evangelism is about: they operate at a larger scale and cover a wider scope than any of the teams I worked with. I am very excited to join them.
Technology: Windows Azure is Enterprise ready, more open than people think, and is a complete platform, from infrastructure to services, mobile and Big Data. Azure has matured a lot in the past few years: it covers IaaS, PaaS and SaaS; its PaaS layer is multi-framework and multi-service, with a marketplace of add-ons; it has a mobile backend as a service for Windows Phone, iOS, Android and HTML5; and it includes Hadoop and Big Data services. It is in production today, has been battle tested for years as the base for many Microsoft first-party apps and services, and is ready for the Enterprise, with a true public/private/hybrid solution: with Windows Server 2012, System Center and Azure you can start building your hybrid cloud today. The team ships important new features regularly, my favorite being the point-to-site and software VPN features announced a few weeks ago, which will drastically lower the barrier to creating hybrid clouds. Azure is not a Windows/.NET-only platform; it is more open than people give it credit for: you can provision Linux VMs, and the PaaS supports .NET, Java, PHP, Node, Python and Ruby, with open source (Apache 2 license) SDKs on GitHub and an Eclipse plugin, built by the Microsoft Open Technologies team. Scott Guthrie gives a very good overview of Windows Azure in this video from the Windows Azure Conf 3 weeks ago.
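The openness Chanezon describes rests on the fact that those SDKs and the cross-platform CLI all wrap one plain REST surface, the (2013-era) Azure Service Management API. The sketch below shows that surface's basic shape; the subscription ID is a placeholder, and calling it directly like this (rather than through an SDK) is purely illustrative:

```python
# Hedged sketch of the Azure Service Management REST surface that the
# open-source SDKs and cross-platform CLI wrap. Authentication is via a
# management certificate attached to the HTTPS connection, omitted here.
MGMT_BASE = "https://management.core.windows.net"

def hosted_services_url(subscription_id: str) -> str:
    """URL for listing a subscription's cloud (hosted) services."""
    return f"{MGMT_BASE}/{subscription_id}/services/hostedservices"

def mgmt_headers(api_version: str = "2012-03-01") -> dict:
    """Every Service Management call must carry an x-ms-version header
    naming the API version the caller was written against."""
    return {"x-ms-version": api_version, "Content-Type": "application/xml"}

print(hosted_services_url("11111111-2222-3333-4444-555555555555"))
```

Because the protocol is just versioned HTTPS plus XML, any language that can speak TLS can manage Azure resources, which is exactly what the Java, PHP, Node, Python and Ruby SDKs do under the hood.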

            image

            Impact: as a kid, I was reading a lot of science fiction, and got my first computer (a TRS-80) when I was 10 years old. As I explain in many of my presentations (like Portrait of the developer as The Artist), my childhood dreams were to change the world through technology, and more specifically computers. My dreams are far from being fulfilled today: it is true that we have more powerful machines and software tools, and technology changed the world in many aspects, but machines are still hard to program, and software engineering needs to evolve to let us work at a higher level of abstraction.
            The move to a devices and services world is an important architecture change like we see every 20 years in the software industry. Cloud platforms have the potential to help developers build smarter applications faster, and change entire areas of the human experience. It has started to happen in the consumer applications space, but the next big wave of change is the consumerization of Enterprise IT, where developers and IT professionals can completely transform the way enterprises work, driving business value faster, enabling new capabilities and business models. My goal is to help them in this transformation, and Microsoft is the place where I can have the most impact.
            Here’s a quick video to summarize it all: developers, developers, developers, think big and look up at the sky, its color is Azure!
Developers, Developers, Developers: a homage to you, developers I interacted with around the world in the past 8 years doing developer relations at Google and VMware. http://wordpress.chanezon.com/2013/05/10/goodbye-vmware/
            If you have never tried Azure, or have tried it a year ago, sign up for a free trial and give it a go! I hope to see many of you at the Build conference in June in San Francisco.

            – Mary Jo Foley published the following about Chanezon in her Microsoft builds a deep-tech team to attract next-gen developers [ZDNet, May 13, 2013]:

“We’re at a deep architectural inflection point right now in the enterprise,” said Chanezon. “Devs need new ways of working, new apps and new frameworks. There’s the whole dev-ops movement, plus the move to become more agile.”

            Chanezon said he joined Microsoft because he felt the company’s new devices plus services strategy really embraces these changes. He said while Google had devices and services, too, it didn’t have the private/hybrid cloud component which Microsoft also brings to the enterprise-dev table. As a big believer in the power and potential contribution of open source, he said he was encouraged to see that Azure has become a very open-source-friendly platform.

            – Mary Jo Foley published the following about Schmidt in her Microsoft builds a deep-tech team to attract next-gen developers [ZDNet, May 13, 2013]:

            Schmidt joined DPE six years ago [as director of DPE’s Media and Advertising Initiatives team], bringing his media specialization to the media and entertainment, social and gaming verticals. These are “where people are thinking about attaching devices to a lifestyle,” he said. 

            A big target for Schmidt is mobile developers, specifically those writing for iOS and Android who may not know how their skills can be transferred to Windows 8 and Windows Phone 8. “We’re showing them how what they already know is correlated,” he said, while playing up the message that the iOS and Android gold mines are drying up.

            Silverlight delivers online viewing experience for Sunday Night Football [Silverlight and Windows Phone SDK blog, Sept 10, 2009]

The NFL and NBC will be delivering the entire Sunday Night Football season by using Silverlight 3.0 and IIS Smooth Streaming. The first game of the season will be broadcast tonight, with the Tennessee Titans vs. the Pittsburgh Steelers. Game starts at 5:00pm PST and you can watch online for free: http://snfextra.nbcsports.com/.

            image

            Here are a few of the benefits Silverlight delivers:

            • A full screen video player that is capable of delivering 720p HD video. TV quality on the web.
            • A main HD video feed, plus 4 user-selectable alternate synchronized camera feeds that allows users to switch camera angles themselves. Your TV can’t do that.
            • Adaptive smooth streaming of live HD video, which enables the video player to automatically switch bitrates on the fly depending on networking/CPU conditions. No buffering/stuttering experience.
            • DVR support of the live video, including Pause, Instant Replay, Slow Motion, Skip Forward/Back. You can pause and rewind on live video.
            • Play-by-play data (touchdowns, fumbles, etc) inserted as tooltip chapter markers on the scrubber at the bottom allowing you to quickly seek to key moments. A smarter, contextual DVR.
            • Highlights of major plays created within minutes of the play. NBC is cutting on-demand highlights and publishing them on-the-fly with Smooth Streaming.
            • Sideline interviews with the players. No more channel surfing, you are one click away from additional content.
            • Game statistics. These are live stats coming directly in real-time from the NFL.
            • Game commentary and Q&A with the SNF hosts. Chat with the live TV broadcasters.

            Enjoy! http://snfextra.nbcsports.com/
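The adaptive smooth streaming behavior in the list above, where the player switches bitrates on the fly depending on networking/CPU conditions, can be sketched with a simple heuristic. This is a minimal illustration, not Microsoft's actual Smooth Streaming client logic; the bitrate ladder, safety margin and buffer threshold are invented for the example:

```python
# Toy model of per-fragment bitrate selection in an adaptive streaming
# player: pick the highest rung of the encoding ladder that measured
# throughput can sustain, and back off when the playout buffer runs low.
BITRATES_KBPS = [350, 700, 1500, 2500, 3450]  # example encoding ladder

def pick_bitrate(throughput_kbps: float, buffer_seconds: float,
                 safety: float = 0.8) -> int:
    """Return the bitrate (kbps) to request for the next video fragment."""
    budget = throughput_kbps * safety      # leave headroom for jitter
    if buffer_seconds < 4:                 # buffer nearly empty: halve budget
        budget *= 0.5
    eligible = [b for b in BITRATES_KBPS if b <= budget]
    return eligible[-1] if eligible else BITRATES_KBPS[0]

print(pick_bitrate(3000, buffer_seconds=10))  # healthy buffer -> 1500
print(pick_bitrate(3000, buffer_seconds=2))   # low buffer forces a drop -> 700
```

Because each fragment is a separate HTTP request, the decision is re-run every couple of seconds, which is what produces the "no buffering/stuttering" experience the post describes.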

            Microsoft Silverlight and NBC Bring Winter Games to the Web in High Definition [Microsoft feature story, Feb 12, 2010]

            Microsoft Silverlight is the player of choice for NBC’s online viewing experience of the 2010 Winter Games in Vancouver.

            REDMOND, Wash. —Feb. 12, 2010 — NBC and Silverlight have once again teamed up to bring Winter Games coverage to the Web – this time in high definition.
            For the next 16 days, people all over the world will watch the Winter Games on television. Increasingly, they’ll be tuning in online as the world’s top athletes compete for gold and glory.
            NBC will once again use Silverlight, Microsoft’s fast-growing, smooth-streaming video and animation plug-in for browsers, to bring full coverage and highlights to NBCOlympics.com. In 2008 for Beijing, the NBC-Silverlight partnership yielded not only revolutionary Web coverage of a sporting event, but a record number of viewers: 52.1 million people logged on to watch 9.9 million hours of video.
            At that time the Silverlight platform was so new that NBC also offered Windows Media Player alongside it. After the success of Beijing and with nearly 50 percent of Internet-connected devices running Silverlight, NBC decided to consolidate on Silverlight for the Vancouver Games.

image

Microsoft employees Jason Suess (left) and Eric Schmidt take a break in an NBC production studio.

            In addition, NBC and Silverlight teams are working together on other major sporting events such as Wimbledon and NFL Sunday Night Football.

            “It’s really been amazing to see that partnership and friendship with NBC grow over the last year and a half,” says Jason Suess, principal technical evangelist for Silverlight. “I expect many more events as our partnership gets tighter and tighter.”
            With Silverlight, viewers can rewind and fast forward the action, or use pause and slow-motion. The player also scales the quality of the video to whatever a user’s machine can handle, delivering up to 720p – the highest resolution possible under current digital television standards.
            “After Beijing, what we heard loud and clear was if you can provide a higher quality experience, users will definitely spend more time in that experience,” Suess says.
            The Silverlight team also worked with NBC to provide special behind-the-scenes tools for the network, including the ability to insert mid-stream advertising, and a rough cut editor that allows NBC personnel to quickly edit and post highlights on the Web.
            “With Michael Phelps going for eight gold medals in Beijing, every time he’d win there would be a massive rush to the site to see him winning the latest gold,” Suess says. “The challenge there was for NBC to have the content on the site in time to meet the demand. Now editors can go in literally while a (video) stream is happening and cut a highlight.”
            Suess said the Winter Games are at a different scale from the massive Summer Games, with far fewer events and more niche sports. Still, Microsoft has worked hard to provide the most engaging photo and video experience possible, he says.

            Silverlight Powered Emmy Nominated Sunday Night Football [Silverlight Team on Silverlight Blog, April 19, 2010]

This NFL season, NBC thrilled football fans by broadcasting Sunday Night Football on 2 screens – television and online. And now, as a result of this great work, Sunday Night Football Extra and NBC Sports have been nominated for a 2010 Sports Emmy® Nomination! NBC Sports teamed with Microsoft Silverlight and Vertigo to design and develop a visually stunning, interactive online video experience. The Sunday Night Football Extra Player featured Microsoft Smooth Streaming technology, providing a customized viewing experience that smoothly and automatically adjusted to each individual user’s bandwidth and computer performance in real time. The SNF Extra Player also touted an interactive user experience featuring an unprecedented five synchronized camera angles, all in true 720p HD, slow-motion replay, full DVR controls, real-time key plays integration, real-time statistics, and live interaction with commentators.

            The Sports Emmy® Awards will be held in New York City on Monday, April 26, 2010, and will recognize outstanding achievement in sports television coverage. This nomination is really the culmination of the innovative thinking, hard work and dedication demonstrated by the team that NBC Sports, Vertigo and a select team of key partners brought together for Sunday Night Football Extra — and Silverlight is the engine that made it possible. If you want to learn more about the nomination, you can also visit Vertigo’s site at http://bit.ly/vertigo-snf.
            The Result?
            • Number of Games: 17 football games streamed via Silverlight
            • Average time tuned in: 29 minutes (about 24 minutes longer than average time spent tuning in on broadcast TV)
            • Number of Viewers: Over 2.2 million football fans tuned in on NBCSports.com to watch the Season live and in full HD
            • Hours of Video: Approximately 1 million hours of video streamed
            • Peak users: 38,500 total peak concurrent users
            • What technology made this possible? IIS 7, IIS Media Services and Silverlight Rough Cut Editor
            Tons of great information about how SNF came together online can be found in the case study and whitepaper live on Microsoft.com.

            Interactive Media Player to Bring PDC to Developers Worldwide [Microsoft feature story, Oct 27, 2010]

            A new interactive media player will enable developers worldwide to virtually attend this week’s Professional Developers Conference at microsoftpdc.com. Using Silverlight and Windows Azure, Microsoft is providing many of the features NBC used when broadcasting the Olympics online.

            With the player, Microsoft is introducing a new way of bringing a live, in-person event to a much broader audience, said Eric Schmidt, Microsoft’s senior director of Developer Platform Evangelism. “The goal is to narrow the gap between audience and speaker,” he said.

            Schmidt heads up the team that has helped stream a number of major events recently, including the 2010 U.S. Open Golf Championship, the 2010 Wimbledon Championship, and NBC’s Sunday Night Football. The team’s objective has been to reach large online audiences with immersive and interactive experiences. Along the way, they developed new ways of delivering multi-camera video and built new interactive models inside what has traditionally been just a video player. The team also built out frameworks so that customers and partners can create similar experiences leveraging Microsoft’s platform technologies in a turnkey manner.

            With the PDC10 virtual player, Microsoft is doing things it couldn’t have done just a few years ago, said Schmidt. All session content will be available live and on-demand in HD quality, and viewers will have the ability to pause and rewind the video at any point. They also can toggle back and forth between different camera feeds, allowing a viewer to cut between a presenter and the presentation material.

            The PDC player has a number of built-in interactive features. Real-time polling will enable speakers to query both the online and in-person audience for live feedback. Live Q&A will help the audience interact with the presenters while they’re delivering a session. And an inline Twitter feed will extend the conversation beyond the online player and into the Twitter domain.

            NBC SPORTS GROUP COLLECTS 11 SPORTS EMMY AWARDS, MOST OF ANY SPORTS MEDIA COMPANY [press release, May 7, 2013]

            London Olympics Garners Five Awards, Including Outstanding Live Event Turnaround

            Sunday Night Football Wins Fifth Consecutive Emmy for Outstanding Live Sports Series; Super Bowl XLVI Wins for Outstanding Live Sports Special

            Bob Costas, Al Michaels, Cris Collinsworth and Pierre McGuire Honored

NEW YORK – May 7, 2013 – NBC Sports Group won 11 Sports Emmy Awards, the most of any sports media company for the third straight year; the London Olympics received five Emmys, including Outstanding Live Event Turnaround; Super Bowl XLVI won for Outstanding Live Sports Special; Sunday Night Football won its fifth consecutive award for Outstanding Live Sports Series; and Bob Costas, Al Michaels, Cris Collinsworth and Pierre McGuire were all honored in their respective categories at the 34th Annual Sports Emmy Awards, presented tonight by the National Academy of Television Arts and Sciences at Frederick P. Rose Hall, Home of Jazz at Lincoln Center.
            MARK LAZARUS, NBC SPORTS GROUP CHAIRMAN: “We could not be more proud of our dedicated team. Tonight is particularly special because we were recognized for our coverage of the London Olympics and the NFL, two properties that touch virtually everyone in the NBC Sports Group – and our on-air commentators. It’s rewarding to know that our talent continues to be recognized year in and year out by our peers.”
            Formed in January, 2011, the NBC Sports Group consists of NBC Sports, NBC Sports Network, Golf Channel, NBC Olympics, 11 NBC Sports Regional Networks, two regional news networks, NBC Sports Radio and NBCSports.com.
            NBCUniversal’s coverage of the London Olympics was honored with a total of five Emmy Awards in the following categories:
            • Outstanding Live Event Turnaround;
            • The George Wensel Technical Achievement Award – NBC, NBC Sports Network, NBCOlympics.com, Bravo, CNBC, MSNBC, Telemundo;
            • Outstanding Technical Team Studio;
            • The Dick Schaap Outstanding Writing Award;
            • Outstanding New Approaches, Sports Programming – NBCOlympics.com.

            For the fifth consecutive year, NBC Sports won Outstanding Live Sports Series for Sunday Night Football. NBC Sports has now won the award in six of the last seven years, also winning in 2007 for its NASCAR coverage.

            NBC Sports was also honored with the Emmy for Outstanding Live Sports Special for its coverage of Super Bowl XLVI. NBC Sports also received the Emmy in this category for its coverage of Super Bowl XLIII.
Bob Costas was awarded his 25th career Emmy and fifth consecutive for Outstanding Sports Personality-Studio Host. Costas hosted the London Olympics, is the host of Football Night in America, NBC Sports’ acclaimed NFL studio show, and Costas Tonight, which airs on NBC Sports Network. He won the Emmy in the same category last year for his work on Football Night.

Al Michaels was awarded the Emmy Award for Outstanding Sports Personality – Play-by-Play, for his work on Sunday Night Football. For Michaels, who received the Lifetime Achievement Award at the 32nd Annual Sports Emmy Awards in 2011, this marks his seventh career Emmy Award.

            Cris Collinsworth was awarded his fifth consecutive Emmy for Outstanding Sports Personality-Sports Event Analyst. This marks Collinsworth’s 14th career Emmy, which includes wins in 2007 and 2008 in the Studio Analyst category for work on Football Night in America.
            Pierre McGuire, NBC Sports Group’s “Inside the Glass” analyst for its NHL coverage, was awarded his first career Emmy for Outstanding Sports Personality – Sports Reporter.

            Microsoft Teams Up With NBC Sports Group to Deliver Compelling Sports Programming Across Digital Platforms Using Windows Azure [press release, April 9, 2013]

            New alliance aims to deliver live and on-demand programming of more than 5,000 hours of sporting events plus Sochi 2014 Olympic Games for NBC Sports’ digital platforms.

            LAS VEGAS — April 9, 2013 — Today at the National Association of Broadcasters Show, Microsoft Corp. and NBC Sports Group announced they are partnering to use Windows Azure Media Services across NBC Sports’ digital platforms, including NBCSports.com, NBCOlympics.com and GolfChannel.com.

            Through the agreement, which rolls out this summer, Microsoft will provide both live-streaming and on-demand viewing services for more than 5,000 hours of games and events on devices, such as smartphones, tablets and PCs. These services will allow sports fans to be able to relive or catch up on their favorite events and highlights that aired on NBC Sports Group platforms.
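            The “relive or catch up” capability described above can be pictured as a live channel that keeps a short rewindable window for live viewers while archiving every segment for on-demand playback. The following is a minimal conceptual sketch of that idea; it is purely illustrative and is not the actual Azure Media Services or iStreamPlanet pipeline, and all class and method names are hypothetical:

```python
from collections import deque

class LiveToVodChannel:
    """Conceptual sketch: a live channel keeps a rolling DVR window of
    segments while archiving everything for later on-demand playback.
    (Illustrative only -- not the actual Azure Media Services pipeline.)"""

    def __init__(self, dvr_window=3):
        self.dvr = deque(maxlen=dvr_window)  # what a live viewer can rewind
        self.archive = []                    # the full event, for VOD catch-up

    def ingest(self, segment):
        # Each incoming live segment is served live and archived for VOD.
        self.dvr.append(segment)
        self.archive.append(segment)

    def live_playlist(self):
        return list(self.dvr)

    def vod_playlist(self):
        return list(self.archive)

channel = LiveToVodChannel(dvr_window=3)
for s in ["seg1", "seg2", "seg3", "seg4", "seg5"]:
    channel.ingest(s)

print(channel.live_playlist())  # only the most recent segments
print(channel.vod_playlist())   # the whole event, available after it aired
```

The point of the sketch is simply that live and on-demand viewing can be fed from the same ingest path, which is what makes “catch up on events that aired” cheap to offer.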
            Rick Cordella, senior vice president and general manager of digital media at NBC Sports Group discusses how they use Windows Azure across their digital platforms.
            “NBC Sports Group is thrilled to be working with Microsoft,” said Rick Cordella, senior vice president and general manager of digital media at NBC Sports Group. “More and more of our audience is viewing our programming on Internet-enabled devices, so quality of service is important. Also, our programming reaches a national audience and needs to be available under challenging network conditions. We chose Microsoft because of its reputation for delivering an end-to-end experience that allows for seamless, high-quality video for both live and video-on-demand streaming.”
            NBC Sports Group’s unique portfolio of properties includes the Sochi 2014 Winter Olympic Games, “Sunday Night Football,” Notre Dame Football, Premier League soccer, Major League Soccer, Formula One and IndyCar racing, PGA TOUR, U.S. Open golf, French Open tennis, Triple Crown horse racing, and more.
            “Microsoft is constantly looking for innovative ways to utilize the power of the cloud, and we see Windows Azure Media Services as a high-demand offering,” said Scott Guthrie, corporate vice president at Microsoft. “As consumer demand for viewing media online on any available device grows, our partnership with NBC Sports Group gives us the opportunity to provide the best of cloud technology and bring world-class sporting events to audiences when and where they want them.”
            Microsoft has a broad partner ecosystem, which extends to the cloud. To bring the NBC Sports Group viewing experience to life, Microsoft is working with iStreamPlanet Co. and its live video workflow management product Aventus. Aventus will integrate with Windows Azure Media Services to provide a scalable, reliable, live video workflow solution to help bring NBC Sports Group programming to the cloud.
            NBC Sports Group and iStreamPlanet join a growing list of companies, including European Tour, deltatre, Dolby Laboratories Inc. and Digital Rapids Corp., which are working with Windows Azure to bring their broadcasting audiences or technologies to the cloud.
            In addition to Media Services, Windows Azure core services include Mobile Services, Cloud Services, Virtual Machines, Websites and Big Data. Customers can go to http://www.windowsazure.com for more information and to start their free trial.

            – Mary Jo Foley published the following about Shewchuk, the head of the team, in her Microsoft builds a deep-tech team to attract next-gen developers [ZDNet, May 13, 2013]:

            “‘The platform’ is now a collection of capabilities across all of our products,” said John Shewchuk, the head of the recently formed technical evangelism and dev team. Our job is “helping devs stitch together solutions with these technologies.”

            “Devs” also is a much broader target audience for Microsoft than it once was. Back in the early DPE days, devs meant professional, full-time programmers. The target audience for Microsoft’s new deep-tech team includes anyone who writes a consumer, business or hybrid application. That means startups, enterprise customers and top consumer and business independent software vendors (ISVs).

            The Microsoft toolbox from which devs can choose to mix and match includes many technologies that didn’t exist a decade, or even just a few years, ago. They include everything from Windows Azure technologies, to Bing programming interfaces and datasets, to the WinRT framework underlying Windows 8 and Windows Server 2012. Microsoft’s next Xbox, Kinect, Windows Phones, Surfaces, Perceptive Pixel multitouch displays are among the targets for these technologies.

            “This is a playground. We get to work with stuff from all the different Microsoft business groups,” said Shewchuk. “It’s like geek heaven.”

            The idea of creating this kind of deep-tech team has been percolating since October 2012, when Microsoft veteran Steve Guggenheimer returned to Microsoft to head up DPE, according to Microsoft execs. Guggenheimer, in conjunction with Server and Tools Business chief Satya Nadella and with the blessing of CEO Steve Ballmer, set out to recruit some deeply technical evangelists with far-flung specializations.

            Shewchuk, a 20-year Microsoft veteran and one of the company’s Technical Fellows, agreed to spearhead the team. (Microsoft isn’t saying how large the new team is, but I’ve heard it could be over 100 people in size and growing.) Shewchuk, who is now the Chief Technology Officer for the Microsoft Developer Platform, was working for the last several years on Windows Azure, where he helped the company build Windows Azure Active Directory, Service Bus and SQL services. Shewchuk also was a key contributor to a number of other Microsoft dev technologies, including .Net, Visual Studio, Windows Communication Foundation and the Windows Identity Foundation.

            “The idea is to bridge our inside developers to outside developers,” Shewchuk said. “We want to get the top developers to adopt our platform.”

            Shewchuk described the new deep-tech team as a place where Microsoft pulls together its own “world-class” developers to exchange ideas among themselves and with the outside world. Because Microsoft’s new stack of technologies are all at different places, in terms of their maturity cycle, the Microsoft tech team will do everything from build new frameworks; develop code to tie together disparate products; and make available code and templates for external use using services like GitHub or CodePlex. In some cases, the “developers” who take advantage of these pieces may be Microsoft’s own product teams who may want to incorporate code (and even the developers who wrote it) directly into their units.

            More information:
            John Shewchuk’s Profile [MSDN, May 2013]

            John Shewchuk is a Technical Fellow and the CTO for the Microsoft Developer Platform. John leads the team responsible for technical evangelism and development in DPE; his team partners with developers, designers, and IT pros to build next-gen applications using Microsoft’s devices and services, and they share those experiences with the developer community. John has been with Microsoft for almost 20 years. Most recently John focused on Azure, developing key platform services including Windows Azure Active Directory, Service Bus, and SQL services. He has been a key contributor on a wide range of technologies, including Visual Studio, .NET, WCF, WIF, IE, and AD. John is an advocate and contributor to open source and Web standards – most recently he drove many of the contributions Microsoft made to OAuth 2. John has a BS in Electrical Engineering from Union College and an MS in Computer Science from Brown University. He lives in Redmond with his wife and four children.

            Microsoft Big Brains: John Shewchuk [Mary Jo Foley for All About Microsoft blog of ZDNet, Nov 20, 2008]

            Claim to Fame: One of the masterminds behind “Zurich,” a key component of Microsoft’s Azure cloud infrastructure, and a key player in Microsoft’s Federated Identity work [see also: Ozzie foreshadows ‘Zurich,’ Microsoft’s elastic cloud [same author, same place, July 24, 2008]

            Bytes by MSDN: John Shewchuk and Rob Bagby discuss “Project Dallas” [on YouTube MrAbdoul9 channel, Jan 29, 2010; on Channel 9, Aug 29, 2010] this is where OAuth is first mentioned

            John Shewchuk, a Microsoft Technical Fellow, leads the Project Zurich architecture and strategy teams, which are focused on extending Microsoft’s .NET application development technologies to the Internet “cloud.” Shewchuk works in Microsoft’s Connected Systems Division (CSD) where he leads the technical strategy team. Over the last several years Shewchuk and his team have developed a wide range of Internet-based application messaging and identity federation technologies. Additionally, he was a co-founder of the Windows Communication Foundation (WCF) team and has been a key contributor to cross-industry interoperability initiatives. Working in conjunction with others on his team, Shewchuk developed Web services specifications and managed technical collaborations with IBM, Sun, SAP, and many others. He also has been a key leader and contributor to Microsoft’s efforts in federated identity, access control, and privacy. Previously, Shewchuk worked on Microsoft’s development tools and runtimes and played a key role in the development of Visual Studio and .NET. Earlier in his career, he played a role in the development of many Internet technologies including stylesheets, browser behaviors, and Web server controls.

            Microsoft unveils AD Azure strategy, ID management reset [John Fontana for Identity Matters blog of ZDNet, May 25, 2012]

            After two years of work, Microsoft has unveiled details and its strategy around Active Directory for the cloud, anointing it the centerpiece of a comprehensive online identity management services strategy it thinks will profoundly alter the ID landscape.
            The company said changes to the current concepts around identity management need a “reset” to handle the “social enterprise.” Microsoft says it is “reimagining” how its Windows Azure Active Directory (WAAD) service helps developers create apps that connect the directory to SaaS apps and cloud platforms, corporate customers and social networks.
            “The term ‘identity management’ will be redefined to include everything needed to provide and consume identity in our increasingly networked and federated world,” Kim Cameron, an icon in the identity field and now a distinguished engineer working on identity at Microsoft, said on his blog. “This is so profound that it constitutes a ‘reset’.”
            At the center is WAAD, which is in use today mostly with Office 365 and Windows Intune customers. WAAD is a multitenant service designed for high availability and Internet scale.
            In a companion blog post to Cameron’s, John Shewchuk [see also Part 2 of that], a Microsoft Technical Fellow and key cog in the company’s cloud identity engineering, provided some details on WAAD, including new Internet-focused connectivity, mobility and collaboration features to support applications that run in the cloud.
            Shewchuk said the aim is to support technologies such as Java, and apps running on mobile devices including the iPhone or other cloud platforms such as Amazon’s AWS.
            Shewchuk said WAAD will be the cloud extension to on-premises Active Directory deployments enterprises have already made. The two are married using identity federation and directory synchronization.
            He said Microsoft made “significant changes to the internal architecture of Active Directory” in order to create WAAD.
            As an example, he said, “Instead of having an individual server operate as the Active Directory store and issue credentials, we split these capabilities into independent roles. We made issuing tokens a scale-out role in Windows Azure, and we partitioned the Active Directory store to operate across many servers and between data centers.”
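            Shewchuk’s description — token issuance as a stateless, scale-out role in front of a directory store partitioned across many servers — is a classic sharding pattern. A hypothetical sketch of how directory objects might be assigned to partitions by hashing their identifier (this is my illustration of the general technique, not Microsoft’s actual scheme; the partition names are made up):

```python
import hashlib

# Hypothetical partition set; in a real deployment these would be
# directory store replicas spread across servers and data centers.
PARTITIONS = ["store-0", "store-1", "store-2", "store-3"]

def partition_for(object_id: str) -> str:
    """Deterministically map a directory object to the partition that owns it."""
    digest = hashlib.sha256(object_id.encode()).hexdigest()
    return PARTITIONS[int(digest, 16) % len(PARTITIONS)]

# Any stateless token-issuing front end can serve any request, because it
# only needs to read the user's record from whichever partition owns it.
users = ["alice@contoso.com", "bob@contoso.com", "carol@contoso.com"]
placement = {u: partition_for(u) for u in users}
print(placement)
```

Because the mapping is deterministic, the token-issuing role needs no state of its own and can be scaled out independently of the store, which is the split Shewchuk describes.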
            Some analysts are already noting the challenges Microsoft will have with its cloud directory.
            Mark Diodati, a research vice president at Gartner focusing on identity issues, told me in a conversation about changes the cloud is forcing on enterprise ID management that, “the addition of tablets and smartphones into the enterprise device mix exceeds Active Directory’s management capabilities and there is an impedance mismatch using Kerberos across the cloud.”
            While Shewchuk laid out the set-up for a Part 2 [see here: Part 2 where OAuth 2 is first mentioned as: “we currently support WS-Federation to enable SSO between the application and the directory. We also see the SAML/P, OAuth 2, and OpenID Connect protocols as a strategic focus and will be increasing support for these protocols”] of his blog that will focus on enhancements to WAAD, Kim Cameron painted the bigger picture on cloud identity going forward.
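            Of the protocols named as a strategic focus, OAuth 2 is the one Shewchuk personally drove contributions to. As a hedged refresher on what the authorization-code flow’s first leg looks like, here is a sketch that builds the authorization request URL; the endpoint, client ID, and redirect URI are placeholder values, not real WAAD endpoints:

```python
from urllib.parse import urlencode

# Placeholder authorization endpoint -- not a real directory service URL.
AUTHORIZE_ENDPOINT = "https://login.example.com/oauth2/authorize"

def build_authorization_url(client_id, redirect_uri, state):
    """Build the first-leg request of the OAuth 2 authorization-code grant."""
    params = {
        "response_type": "code",   # ask the server for an authorization code
        "client_id": client_id,
        "redirect_uri": redirect_uri,  # where the code is delivered
        "state": state,            # CSRF protection, echoed back unchanged
    }
    return AUTHORIZE_ENDPOINT + "?" + urlencode(params)

url = build_authorization_url("my-app", "https://app.example.com/callback", "xyz123")
print(url)
```

The app later exchanges the returned code for a token at a separate token endpoint; that second leg is where a directory service like WAAD actually issues the credential.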
            He said companies adopting cloud technology will see dramatic changes over the next decade in the way identity management is delivered. “We all need to understand this change,” he stressed.
            Cameron said identity management as a service “will use the cloud to master the cloud”, and will provide the most reliable and cost-effective options.
            “Enterprises will use these services to manage authentication and authorization of internal employees, the supply chain, and customers (including individuals), leads and prospects. Governments will use them when interacting with other government agencies, enterprises and citizens.”
            And he added that enterprises will have to move beyond concepts that have guided their thinking to date.

            Identity & Access [MSFTws2012 YouTube channel, Nov 20, 2012]

            John Shewchuk talks about how to overcome identity and access challenges brought about by continuous services and connected devices. Learn more about Active Directory, Direct Access, and Dynamic Access Control at http://aka.ms/Ytidentity
            Current state-of-the-art:
            Welcome to the Active Directory Team Blog [MSDN blogs, April 15, 2013]
            Announcing some new capabilities in Azure Active Directory Graph Service [Windows Azure Active Directory Graph Team blog on MSDN, May 15, 2013]

            BUILD 2013, Windows 8.1, and Microsoft’s Deep-Tech Team: Hopeful News for Devs [Tim Huckaby on DevPro, May 16, 2013]

            It’s hard to change a culture. Having worked for or with Microsoft for over 20 years, I can tell you that I have a myriad of colleagues that are Microsoft employees, most of whom I call my friends and respect very much. Over the last several months, I’ve had several discouraging private conversations about where the developer goals, mission, and strategy were headed for Microsoft. I could see the problems and mistakes. Microsoft employees could see them, too. You probably saw them, too. It’s been frustrating. When the head guy in charge of Microsoft development ignores feedback that includes internal feedback from Microsoft and external feedback from folks such as me and you, then that builds a culture of secrecy and fear. Although that head guy is gone now [obvious reference to Steven Sinofsky, ex Microsoft: The victim of an extremely complex web of the “western world” high-tech interests [‘Experiencing the Cloud’, Nov 13-20, 2012], it’s still taken a long time to change that culture back to where it should be.
            In all honesty, I can tell you that I haven’t been encouraged about the developer platform at Microsoft in a while. However, today I’m encouraged for the first time in a long time. I see the culture changing. I hear people at Microsoft saying that the culture is changing. And there are several encouraging announcements emerging. Suddenly, I’m now excited about Microsoft’s BUILD 2013 developer conference that’s being held in San Francisco from June 26 to June 28, and I’m not the old guy saying, “Get off my lawn!” However, I’d first like to present you all with some background that made me discouraged in the first place.
            Microsoft’s Development Woes
            I painfully read a recent blog post about Microsoft’s developer issues. I don’t even know who wrote it. This guy or gal didn’t put his or her name on the blog post. It’s painful because this person makes a ton of good points. Within this blog post, the author goes far enough back to put Win16 into perspective. It’s a very interesting read if you want to talk about the context of Microsoft’s developer problems through time and the speculation surrounding those problems. One of the main points in this article is that Microsoft has hung onto an obsolete Win32 API even though, a decade ago, Intel took a completely different tack with the GPU and multi-core processors; Microsoft could have picked any of several versions of Windows over time as an opportunity to start over. However, Microsoft didn’t choose to do this, which has caused developers a lot of pain.
            Related: Windows 8 Start Button Shenanigans
            Most recently, that developer pain has manifested with the introduction of the modern API in Windows 8, which has left many developers confused and angry. Much of the anger stems from the fact that the most successfully adopted and beloved developer technology in Microsoft history, Silverlight, was seemingly killed by this new modern API. Also seemingly killed was XNA. Several developers are also confused because Microsoft seems to be pushing them to build enterprise applications in HTML5 and deliver them through the Windows Store.
            But, alas, there is hope! Recent announcements and speculations have me really encouraged.
            Encouraging Announcements from Microsoft
            On May 14, Microsoft officially announced the long rumored Windows Blue, which is officially called Windows 8.1. It will be a free update to Windows 8. Windows 8.1 promises to fix several different problems that folks have been complaining about. It’s important to note that Windows 8.1 isn’t a service pack. It’s a full blown upgrade to the OS. Microsoft promises several exciting things for the developer to be announced at BUILD, which includes the public release of Windows 8.1.
            This month a minor Internet hysteria phenomenon occurred with the revelation of the Microsoft deep-tech team. Mary Jo Foley described it best as Microsoft’s new plan for reaching out to top-tier developers of all sizes to get them to take a look at the new and expanded Microsoft toolbox. There are several “big guns” who will be leading the effort.
            John Shewchuk is one of those “big guns.” I know John from a prior life at Microsoft. He’s a 20-year Microsoft veteran and one of the company’s Technical Fellows. He’s leading the team and serving as the Chief Technology Officer for the Microsoft Developer Platform. This is good news.
            My guess is that the deep-tech team was the brainchild of Microsoft veteran Steve Guggenheimer, who took the reins of heading the Developer and Platform Evangelism (DPE) team in October 2012. Affectionately known as “Guggs,” Steve Guggenheimer has a long and storied career at Microsoft.
            Patrick Chanezon is a new hire to Microsoft who will lead the enterprise evangelism efforts in Microsoft’s DPE unit from San Francisco. He joined Microsoft from VMware just weeks ago. This is a key hire that also seems to be really good news.
            More about those Microsoft people I respect: the people who get it, the people who effect change. Scott Guthrie is one of them, but everyone who knows the Microsoft platform knows who Scott Guthrie is. Another one of them is Gabor Fari. You probably don’t know his name. But Gabor is one of the many Microsoft folks who “gets it.” Internally, he’s willing to criticize the company he works for and loves when it deserves it. He’s also the first to garner praise where Microsoft deserves it. Gabor’s title is Director of Life Sciences Solutions, and the developer platform at Microsoft is his passion. When discussing the problems of the past and the excitement of the future, Gabor left me with this, and I believe it’s the perfect way to end this article:
            “I am very excited about the latest developments and news that has been released, and I am eagerly anticipating additional news from the BUILD conference. The slumbering lion still has spectacular fangs and teeth; and now he has woken up and is ready to roar.”

            Regarding Gabor Fari I will include here the following link:
            Sanofi: Global Healthcare Leader Deploys Intelligent Content Framework, Speeds Time-to-Market [Microsoft Case Study, April 16, 2013] from which the following excerpts describe Fari’s involvement and role in strategic developments the best:

            In January 2011, Sanofi launched a program called CRUISE—Content Re-Use Information System for Electronic Health. Through CRUISE, the company set out to develop a content management solution that transverses the company’s research and development efforts. The program charter of CRUISE is to implement processes and tools that enable stakeholders to author, assemble, review, approve, reuse, publish, and deliver high-quality, consistent, and compliant content and documentation throughout the product development life cycle—aiding the submission to regulatory agencies and other industry audiences. “The idea is to find ways to intelligently and seamlessly manage content authoring and production,” says Bhanu Bahl, Senior Manager of Clinical Sciences and Operation Platform at Sanofi. “The key business objective is to reduce the effort required to prepare documents through a synergy of optimized processes and enabling technologies.”
            CRUISE has three pillars. One pillar involves simplifying the documentation process in a way that makes it possible to reuse content in various materials. Another pillar revolves around services that involve the many different documentation deliverables. The third pillar focuses on the technology solution, which is designed as a content library that tags and classifies information so that it can be easily assembled and searched. “With CRUISE, we are not doing a process redesign,” says Bahl. “We’re building something more tangible, more simplified, and more standardized.”
            To address the CRUISE mandate, Sanofi worked closely with Microsoft as well as two members of the Microsoft Partner Network, DITA Exchange and the ArborSys Group. Microsoft provided the Intelligent Content Framework (ICF) and underlying technologies based on Microsoft SharePoint Server and Microsoft Office. DITA Exchange delivered a solution that enables organizations to establish and maintain a “single source of truth” for their strategic content, and to deliver that content consistently across outputs. The ArborSys Group consulted on the tool and process redesign and helped achieve an end-to-end business and technology implementation for regulated industries.

            Gabor Fari, Director of Life Sciences Solutions at Microsoft, served as an evangelist in helping to put together the CRUISE team. DITA Exchange had been working closely with Microsoft since 2008 to develop the ICF for regulated industries. It completed the first version of the XML-based solution in February 2009.
            As the technology pillar of CRUISE and the engine of EnCORE, DITA Exchange software elevates SharePoint to an XML-based component content management and single-source publishing solution. It enables its customers to comply with regulatory requirements with tools for reusing content in a consistent and accurate way throughout the product development life cycle in the life sciences space. “Microsoft promoted our work to several pharmaceutical companies,” says Andersen. “It led the way in terms of bringing innovative ideas around SCM solutions.”
            DITA Exchange began working on the CRUISE implementation in April 2011. The partner participated in planning and supplied the solution used to manage the document output maps, topics, and linking of topics to the maps. “DITA Exchange helped us with content design and the governance structures of information design,” says Allred. “The people at DITA Exchange are masters of their technological domain. They have experience in regulated industries and the knowledge required to get our vision into an operational model.”
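            The single-sourcing idea behind CRUISE — author a topic once, then let maps assemble it into multiple deliverables — can be sketched in a few lines. This is a toy illustration of the general DITA pattern of maps, topics, and topic references; the element names are simplified and the content is invented, so it reflects neither the DITA Exchange product nor the full DITA specification:

```python
import xml.etree.ElementTree as ET

# A topic library: each topic is authored once and tagged with an id.
# (Invented sample content for illustration only.)
topics = {
    "dosage": "<topic id='dosage'><p>Take one tablet daily.</p></topic>",
    "storage": "<topic id='storage'><p>Store below 25 C.</p></topic>",
}

def assemble(map_refs):
    """Assemble a deliverable from a map: an ordered list of topic references."""
    doc = ET.Element("document")
    for ref in map_refs:
        doc.append(ET.fromstring(topics[ref]))
    return ET.tostring(doc, encoding="unicode")

# The same topics are reused across different deliverables, so a change
# to one topic propagates consistently everywhere it is referenced.
label = assemble(["dosage", "storage"])
leaflet = assemble(["dosage"])
print(label)
```

The compliance payoff is exactly what the case study describes: because every deliverable pulls from the same “single source of truth,” the content stays consistent across the outputs submitted to regulators.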
            The ArborSys Group joined the effort in April 2011. This partner provides business consultancy and technical implementation and helped Sanofi achieve measurable and sustainable results through the implementation of flexible IT solutions that can be adapted for change in a dynamic business climate.
            The two partners collaborated on developing the EnCORE platform. The ArborSys Group scoped processes, integrated service management roles and extensions, and trained internal resources.
            “Microsoft, DITA Exchange, and the ArborSys Group all provided expertise and leadership in terms of how we define processes and address the three pillars of CRUISE,” says Bahl. “The various disciplines they provided really helped us strategize our best opportunity in terms of development. We share a common vision that has resulted in a very rich, cutting-edge offering that other pharmaceutical companies will probably adopt three to five years from now.”
            While many other regulated industries have embraced SCM in recent years, life science organizations have lagged. “It’s no secret that the pharmaceutical industry is conservative,” says Andersen. “People think very carefully before they start anything. Sanofi is absolutely the leader in innovating in the pharmaceutical content management space.”

            Saving Intel: next-gen Intel ultrabooks for enterprise and professional markets from $500; next-gen Intel notebooks, other value devices and tablets for entry level computing and consumer markets from $300

            OR “2 for 1” (or “two-for-one”) touch and voice enabled ultrabooks of convertible and detachable form factors with Haswell / 4th generation Intel Core processor family (shipping now and on track for Q2’13 launch) starting as low as $500.

            Touch-enabled notebooks [other value devices and tablets] with Bay Trail down to the $300 to $400 range in Q4’13, and as low as $200 later.

            OR after Intel’s biggest flop: at least 3-month delay in delivering the power management solution for its first tablet SoC [‘Experiencing the Cloud’, Dec 20, 2012] AND Urgent search for an Intel savior [‘Experiencing the Cloud’, Nov 21 – Dec 11, 2012] Intel is finally ready to drop entry level prices to competitive levels in both enterprise/professional and entry level computing /consumer markets

            Updates: a young Seeking Alpha investment research contributor reflected on it as Intel Just Made A Huge Decision [April 14, 2013] with the following reasoning to close his article, with which I wholeheartedly agree:

            Intel’s “Atom” chips command margins roughly in line with the corporate average. This makes sense given that Nvidia recently disclosed that its “Tegra” mobile SoC business carried roughly 50% gross margins. Given that Intel owns its own fabs (and doesn’t pay royalties to ARM), gross margins in the 60%+ range are completely plausible. The problem is that raw ASPs for the chips are much lower than that of the traditional notebook and desktop chips.
            Selling a $25 – $30 processor isn’t going to give you the raw margin dollars that a $100 processor will, even if the gross margin percentage is the same. If we start seeing a trend where people are simply going with the Atom based solutions rather than the Core solutions, then this will of course be a problem for Intel at the top and bottom lines. But if we see the “Core” solutions staying mostly flat with the rejuvenated Atom helping to gain back market share from the ARM vendors, then this is pure upside for Intel.
            My guess is that the “truth” is going to be somewhere in the middle. The people who need performance, will always need performance, and the people who generally bought low cost, would have bought the cheaper “Celeron” and “Pentium” products (these aren’t too much more expensive than an Atom/ARM SoC) anyway. I expect that the difference is that while today’s “Celeron” and “Pentium” products generally end up in crappy systems with bad screens, slow hard disks, and lousy battery life, the “Atom” products will end up in much more compelling systems, as the PC OEMs/Intel can’t really afford to keep the good stuff confined to expensive systems that people may not be buying anyway.
            Conclusion
            Intel made the right move to unleash Atom and to grin and bear the potential blended ASP erosion that is sure to happen. The key, then, is to focus not on blended ASPs, but to keep an eye on total revenue and gross margin dollars. If these grow as a result of Atom, then great – Intel gets rewarded with a higher multiple as it will have proven its viability going forward, and increased revenues/earnings will only further serve to amplify the share price. If revenues stagnate, then Intel still made the right decision (because it is likely that without Atom being competitive, ARM based chips would have caused continued negative growth), but will need to really focus on increasing the total # of devices that it serves.
            In no way is making Atom more competitive a “mistake”, and Intel would rather cannibalize itself than let the other chip vendors do it. The big question mark is how total sales are going to be, and whether a competitive Atom at the low end PC + tablet spaces is going to be enough. My bet is “yes”, but nothing is ever sure when it comes to business.
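            The margin arithmetic the article leans on is worth working through once. Using the quoted text’s own illustrative numbers (an Atom ASP of roughly $25–30 versus a $100 Core processor, at a comparable 60% gross margin), the dollars-per-chip gap looks like this:

```python
# Worked version of the quoted article's margin arithmetic.
# The numbers are the article's illustrative figures, not Intel disclosures.
def gross_margin_dollars(asp, margin_pct):
    """Gross margin dollars per chip = average selling price * margin %."""
    return asp * margin_pct

atom = gross_margin_dollars(27.5, 0.60)   # midpoint of the $25-30 range
core = gross_margin_dollars(100.0, 0.60)  # the $100 Core-class processor

# Same margin percentage, very different dollars: Atom must ship several
# units to match one Core chip's gross margin contribution.
units_needed = core / atom
print(round(units_needed, 1))
```

In other words, at equal margin percentages Intel needs to sell several Atom chips to replace the gross margin dollars of one Core chip, which is why the author argues total revenue and gross margin dollars, not blended ASPs, are the figures to watch.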

            Don’t forget meanwhile that Intel promotes Android convertible notebooks, say vendors [DIGITIMES, April 19, 2013]

            Viewing that Windows 8 has been unable to stimulate global demand for notebooks, and global sales of Android tablets have been increasing, Intel has begun to promote Android tablet-convertible notebooks, and China-based vendor Lenovo has taken the initiative to launch initial models in May while Hewlett-Packard (HP), Toshiba, Acer and Asustek Computer will launch models in the third quarter, according to sources from notebook vendors.
            Lenovo’s Android-based Yoga notebook [see: Lenovo Yoga 11S ultrabook tablet-convertible [Notebookitalia YouTube channel, Jan 7, 2013] and the IDF Beijing slide inserted on the left in a smaller format, both embedded much below in this post], set for release in May, is expected to feature an 11-inch display, the sources noted.
            Intel has estimated that the price sweet spot of Android-based notebooks is around US$500, and the machines will also need to feature detachable keyboard designs to allow transformation into a tablet, the sources said.
            Since most consumers are familiar with Android, with the addition of document processing applications, the sources believe Android-based notebooks should be able to attract strong demand.

At the same time, Some China-based white-box vendors plan to develop Windows 8 tablets [DIGITIMES, April 17, 2013]

            Viewing that Android tablets, especially 7-inch models, have been under intense price competition and therefore profitability is thinning, some China-based white-box vendors are considering developing Windows 8 tablets equipped with Intel processors for market segmentation, according to industry sources at the 2013 China Sourcing Fair: Electronics and Components taking place in Hong Kong during April 12-15.
            The products are expected to show up at the beginning of the third quarter, at the soonest.
            The sources believe that since the volume of tablets using a Windows operating system is still low, if they are able to enter the market ahead of others, there may be a chance of gaining profits.
            Although related costs are expected to increase by using Intel’s platform and Microsoft’s operating system, the sources pointed out that the advantage as an early mover will allow them to achieve better gross margins than for Android-based models. The fees from the operating system are not really a huge concern, the sources added.

            End of updates

            Sections of this post:

            1. Touch-enabled notebooks [other value devices and tablets] with Bay Trail down to the $300 to $400 range in Q4’13, and as low as $200 later.
            2. “2 for 1” (or “two-for-one”) touch and voice enabled ultrabooks of convertible and detachable form factors with Haswell/4th generation Intel Core processor family (shipping now and on track for Q2’13 launch) starting as low as $500.
            3. Intel’s CEO Discusses Q1 2013 Results – Earnings Call Transcript [Seeking Alpha, April 16, 2013]
            4. Earlier information from Intel


            1. Touch-enabled notebooks [other value devices and tablets] with Bay Trail down to the $300 to $400 range in Q4’13, and as low as $200 later.

[image]
Note that on this slide, demoed on the screen of a Bay Trail prototype (see the video embedded below), the targeted launch is set for “HR’13”, meaning “Holiday Revenue 2013”. Note as well that the Bay Trail SoC is intended both as an entry desktop (i.e. Celeron) and an entry notebook (i.e. current Atom) replacement. This is why both an entry desktop motherboard prototype (from Gigabyte) and an entry notebook (from ASUS) are demoed in the video below. The range of devices with the Bay Trail SoC is, however, going to be much wider than that, as already communicated by Intel in the excerpts below. More exact information will be available later.

            From: Intel’s CEO Discusses Q1 2013 Results – Earnings Call Transcript [Seeking Alpha, April 16, 2013]

            … as we get into the Christmas selling season … we’ll see, because of Bay Trail coming into the marketplace, you’ll see touch-enabled thin notebooks with really good performance that are hitting kind of $300 price points. And then with our Android tablets, you’ll see things that are significantly …

            … If you look at touch-enabled Intel based notebooks that are ultrathin and light using non-Core processors, those prices are going to be down to as low as $200 probably. …

            Intel Bay Trail Prototype Hands On & HD Video Demo [minipcpro YouTube channel, April 9, 2013]

Intel Bay Trail http://www.mobilegeeks.com. At IDF Beijing Intel took the opportunity to quietly announce Bay Trail; this new processor lineup will be aimed at entry-level computing. The new product line will feature Bay Trail-M for mobile and Bay Trail-D for desktop. The 22nm chipset will be aimed at smartphones and tablets and, on the desktop, at all-in-one systems. Bay Trail will be the most powerful Atom processor to date, as it will be offering a quad-core SoC; it should double the computing performance of Intel’s current generation of tablet processors.

            From: Intel Developer Forum: Transforming Computing Experiences from the Device to the Cloud [press release, April 10, 2013] Images are inserted from:
            Reinventing the Computing Experience presentation at IDF 2013 by Kirk Skaugen, Intel senior vice president and general manager of the PC Client Group
            Mobile Inside at IDF 2013 by Tan Weng Kuan, vice president and general manager of the Mobile Communications Group, Intel China
            Developing on Innovative Intel® Atom™ Processor Based Tablet Platforms [April 11, 2013 presentation by Intel at the IDF Beijing]

Augmenting the company’s offerings for computing at a variety of price points, Skaugen announced plans for new market variants of its “Bay Trail” 22nm SoC with PC feature sets specifically designed for value convertibles, clamshell laptops, desktops and value all-in-one computers to ship later this year.

[image]
Taking full advantage of the broad spectrum of capabilities enabled by Intel® architecture, processor technology leadership, manufacturing and multi-OS support across Windows* 8 and Android*, Tan discussed the company’s forthcoming smartphone and tablet products based on Intel’s leading-edge 22nm process and an entirely new Atom microarchitecture. Intel’s quad-core Atom SoC (“Bay Trail”) will be the most powerful Atom processor to date, doubling the computing performance of Intel’s current-generation tablet offering1. Scheduled for holiday 2013 tablets [in market Q4’13], “Bay Trail” will help enable new experiences and designs as thin as 8mm that have all-day battery life and weeks of standby.

[image]

            What’s New in Tablets? Intel Powers Android & Windows 8 [channelintel YouTube channel, Feb 27, 2013]

            Intel continues its tablet expansion, now powering both Android and Windows 8 devices.


            2. “2 for 1” (or “two-for-one”) touch and voice enabled ultrabooks of convertible and detachable form factors with Haswell / 4th generation Intel Core processor family (shipping now and on track for Q2’13 launch) starting as low as $500.

            From: Intel’s CEO Discusses Q1 2013 Results – Earnings Call Transcript [Seeking Alpha, April 16, 2013]

            … as we get into the Christmas selling season, your expectation is you will see touch-enabled ultrabooks that are $499 and $599 pretty commonly out there. $599 commonly, and $499 as kind of special SKUs.

            From: Intel Developer Forum: Transforming Computing Experiences from the Device to the Cloud [press release, April 10, 2013]
            Images are inserted from Reinventing the Computing Experience presentation at IDF2013 by Kirk Skaugen, Intel senior vice president and general manager of the PC Client Group

            Reinventing the Computing Experience

            During his keynote, Kirk Skaugen, Intel senior vice president and general manager of the PC Client Group, provided a deeper look at the forthcoming 4th generation Intel Core processor family, which he said is now shipping to OEM customers and will launch later this quarter.

“Ultrabooks based on the 4th generation Intel Core processor family will enable exciting, new computing experiences and all-day battery life delivering the most significant battery life capability improvement in Intel’s history,” said Skaugen. “It will also bring to consumers a new wave of ‘two-for-one’ convertible and detachable systems that combine the best of a full PC experience with the best of a tablet in amazing new form factors.”

            NEW Architecture on 22nm Tri Gate

            NEW Intel Power Optimizer: 20x Power Reduction
            vs. 2nd gen Intel® Core™ Processors

            NEW Integrated on package PCH [Platform Controller Hub]
            for amazing form factors

            NEW Integrated Audio DSP: more battery life, higher quality

            Shipping Now and On Track for Q2 2013 Launch

            The new Intel Core microarchitecture will allow the company to deliver up to double the graphics performance over the previous generation. In addition, the new graphics solution will have high levels of integration to enable new form factors and designs with excellent visual quality built in. Skaugen demonstrated these graphics improvements on the 4th generation Intel Core processor-based Ultrabook reference design called “Harris Beach.” The demo featured Dirt 3*, a popular gaming title, showing the same visual experience and game play as a discrete graphics card that users would otherwise have to add separately. He also showed the 4th generation Intel Core processor-based concept, codenamed “Niagara,” a premium notebook with the ability to play the unreleased enthusiast title Grid 2* from CodeMasters* without the aid of a discrete graphics card.

            Along with touch capability, Intel® Wireless Display (Intel WiDi) will be enabled on all 4th generation Intel Core processor-based Ultrabook devices to allow people to quickly and securely stream content and apps from devices to the big screen, free from the burden of cables. Skaugen said the China ecosystem is taking the lead on integrating Intel WiDi into systems, and announced that the leading television manufacturer in China, TCL*, has a new model with the Intel WiDi technology built in. He also announced new receivers certified for Intel WiDi from QVOD* and Lenovo* and a set-top box from Gehua*.

The idea of “2 for 1” (or “two-for-one”) was already demonstrated in
            Convertible Ultrabook™ Features [channelintel YouTube channel, Feb 7, 2013]

A complementary piece to the “Best of Both Worlds” live-action video. This animation is intended to educate the viewer on the specific features and details surrounding the convertible Ultrabook™. Many different form factors are shown, as well as several usage models, to give the user an idea of the many different ways a user can take advantage of a convertible Ultrabook™.

and here is The Best of Both Worlds, a Convertible Ultrabook™ Story (Long Version) [channelintel YouTube channel, Feb 27, 2013], the live-action video for that

            A day in the life of our favorite PC user, Alysha Nett. Watch as she uses her Intel-based Convertible Ultrabook™ for both work and play — follow the two sides of her story as she uses the Ultrabook™ by day in her interior design job and by night out with friends watching her favorite band, “We Will Be Lions”. A shorter version of this video is also available.

            Shown first at CES 2013 for May’13 delivery: Lenovo Yoga 11S ultrabook tablet-convertible [Notebookitalia YouTube channel, Jan 7, 2013]

            Lenovo unveiled the IdeaPad Yoga 11S at CES 2013, highlighting the ability of this 11.6-inch notebook to turn into an 11.6-inch tablet.

            As well as a detachable form factor ultrabook reference design: IDF Beijing 2013 Keynote Demo – North Cape [channelintel YouTube channel, April 17, 2013]

            Kirk Skaugen showcases the North Cape reference design at the Intel Developer Forum in Beijing.

            Kirk Skaugen:

            [0:17] This is a full 17 mm clamshell ultrabook. In this configuration it actually has 13 hours of battery life, and it is a full Core i5 computer. But what I can do here, as I can just very simply push in an electronic eject button, and lift it out very simply with one hand. About 3 hours of battery life comes from a battery that sits under the keyboard. But here then I have an amazing notebook that gives me a less than 3 pound tablet with 10 hours of battery life. [0:51]
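The battery split Skaugen quotes works out as simple arithmetic; the sketch below uses only the hours from the quote (an illustration of the split, not official capacity figures):

```python
# Back-of-envelope check of the North Cape battery split quoted above.
# All figures come from the keynote quote; no official Wh capacities are given.
tablet_hours = 10   # tablet battery life once detached (quoted)
base_hours = 3      # extra hours from the battery under the keyboard (quoted)

clamshell_hours = tablet_hours + base_hours
print(clamshell_hours)  # -> 13, matching the quoted clamshell battery life
```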

            At CeBIT 2013 in Hannover, Germany (March 5-9) North Cape was demonstrated as:
            Haswell Ultrabook – North Cape Reference Design Hands-On [Steve Chippy Paine YouTube channel, March 5, 2013]

            and in the companion article it was reported:

In a chat with one of the marketing managers I confirmed that there will be Connected Standby and non-Connected Standby Ultrabooks on Haswell. The CS Ultrabooks are likely to be the cream of the crop and will be more expensive but will have lower power profiles. Clearly the hybrid designs are the perfect fit for CS-capable Ultrabooks but I wouldn’t be surprised to see Samsung have a CS-capable Series 9. Remember, CS is not just about being up-to-date with emails, it means apps can run when the Ultrabook is in your bag, without a fan, on an SSD, for days. It’s the mark of extreme battery life, it’s very exciting technology and likely to be exclusive to Ultrabooks.

            then immediately before IDF the same source delivered the news that Haswell Ultrabooks could achieve Tablet-like 100mW Connected-Idle [April 9, 2013]

[image]
In a presentation due to go out at the Intel Developer Forum over the next two days Intel will outline best practices for low-power idle on Ultrabooks. Today you’ll be lucky to see an Ultrabook idle to less than 3000mW (3 Watts), which is a background drain that’s always there. On Haswell, Intel says that you could get to a screen-off idle state of 100mW.
            By effectively removing nearly 3W of background drain, all operations are going to benefit, not just idle. Where Internet browsing was a 9W operation, expect to see that go down to around 6W for a big increase in battery life.
            The 100mW target requires both system designers and software engineers to build to the best standards but when it comes to laptops, it’s the Ultrabooks that have the best chance of getting the best engineers working on them. Low-power DDR3 memory, SSD storage, high-quality power components and tight board design mean the best systems won’t be cheap systems but all the ingredients and skills are now available to make laptops that idle like tablets.
Intel also wants to see engineers using configurable TDP and other features to create systems in the 10W (fanless) range. High Density Interconnects on motherboards could also bring advantages. By reducing the mainboard size, space is created for more battery. Intel says there’s a chance to fit 20-45% more battery inside when motherboard sizes are reduced using HDI techniques.
[image]
While the ingredients and techniques might be on the shelf, it’s up to the OEMs to decide how they use them. Pricing pressures often lead to compromises so don’t expect all of the new engineering techniques to appear on anything but the high-end Ultrabooks.
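The battery-life arithmetic in the excerpt above can be sketched quickly; the 9W and 6W browsing figures come from the article, while the battery capacity below is an assumed value for illustration only:

```python
# Estimate the browsing battery-life gain from removing ~3 W of background
# drain, per the article's figures. The 45 Wh capacity is an assumption.
battery_wh = 45.0        # assumed Ultrabook battery capacity (Wh)
browse_before_w = 9.0    # browsing draw with the ~3 W always-on drain (article)
browse_after_w = 6.0     # browsing draw once the drain is removed (article)

hours_before = battery_wh / browse_before_w
hours_after = battery_wh / browse_after_w
gain_pct = 100 * (hours_after / hours_before - 1)
print(f"{hours_before:.1f} h -> {hours_after:.1f} h (+{gain_pct:.0f}%)")
# -> 5.0 h -> 7.5 h (+50%)
```

The relative gain is independent of the assumed capacity: cutting a fixed workload's draw from 9W to 6W always yields 50% more runtime.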


            More information: Form Factor and Average Power Innovations for Ultrabooks™
            [April 10, 2013 presentation by Intel at the IDF Beijing] with the following abstract:

            Intended Audience: OEMs and ODMs – Motherboard Layout Designers, Power Delivery, and Power Management Architects
In this session we propose methods to improve form factor, battery capacity, and power consumption for Ultrabook™ devices. We show how High Density Interconnect (HDI) printed circuit boards could free up considerable space for more battery and other features, especially in thinner Ultrabooks. We show current practices with HDI and propose better ways to achieve higher motherboard area reduction to close the cost gap between type 3 and type 4 (HDI) designs. For power consumption, we also show design methods to reduce average power, especially by reducing platform idle power.

            and agenda:

              • What is HDI?
              • Benefits of HDI in Form Factor Constrained Systems
              • Reducing the Cost of HDI
              • Reducing Platform Power
  • Thermal Management and Power Configurability

North Cape was first shown at CES 2013, so OEMs have had plenty of time to work on Haswell-based offerings to be unveiled in Q2’13:
            Intel Delivers Broad Range of New Mobile Experiences [press release, Jan 7, 2013]

                • 4th generation Intel® Core™ processor family (formerly codenamed “Haswell“) will enable a broad new range of Ultrabook convertibles, detachables and tablets with all-day battery life; the biggest battery life gain over a previous generation in company’s history3.
            3 4th Generation Intel Core processors provide 3-5 hours of additional battery life when compared to 3rd Generation Intel Core processors, based on measurement of 1080p HD video playback.

            Low Power Fuels Ultrabook Innovation

            Since mid-2011, Intel has led the industry in enabling Ultrabook devices aimed at providing new, richer mobile computing experiences in thin, elegant and increasingly convertible and detachable designs. To enable these innovative designs, Intel announced last September that it added a new line of processors to its forthcoming 4th generation Intel Core processor family targeted at about 10 watt design power, while still delivering the excellent performance people want and need.

            Skaugen announced today that the company is bringing the low-power line of processors into its existing 3rd generation Intel Core processor family. Available now, these chips will operate as low as 7 watts, allowing manufacturers greater flexibility in thinner, lighter convertible designs. Currently there are more than a dozen designs in development based on this new low-power offering and they are expected to enable a full PC experience in innovative mobile form factors including tablets and Ultrabook convertibles. The Lenovo IdeaPad Yoga* 11S Ultrabook and a future Ultrabook detachable from Acer will be among the first to market this spring based on the new Intel processors and were demonstrated by Skaugen on stage.

            The 4th generation Intel Core processor family enables true all-day battery life — representing the most significant battery life capability improvement in Intel history. Skaugen disclosed that new systems are expected to deliver up to 9 hours of continuous battery life, freeing people from some of the wires and bulky power bricks typically toted around.
            “The 4th generation Core processors are the first Intel chips built from the ground up with the Ultrabook in mind,” Skaugen said. “We expect the tremendous advancements in lower-power Core processors, and the significant ramp of touch-based systems will lead to a significant new wave of convertible Ultrabooks and tablets that are thinner, lighter and, at the same time, have the performance required for more human-like interaction such as touch, voice and gesture controls.”

            To demonstrate the impact of the 4th generation Intel Core processor family, Skaugen showed a new form factor Ultrabook detachable reference design (codenamed “North Cape“) that converts into a 10mm tablet and can run on battery for up to 13 hours while docked.

            Advancements made in the way consumers will interact with their computing devices were also demonstrated, including natural and more immersive interaction experiences using a 3-D depth camera. Intel showed applications running on an Ultrabook in which objects can be manipulated naturally with free movements of the hands, fingers, face and voice. One application that was demonstrated can be used for enabling new and immersive video collaboration and blogging experiences. These were all enabled using the Intel® Perceptual Computing SDK Beta. This year, Intel expects more Ultrabooks and all-in-one (AIO) systems to offer applications for voice control (Dragon Assistant*) and facial recognition (Fast Access*) for convenience and freedom from passwords.

            So this was first shown at CES 2013 as well: IDF Beijing 2013 Keynote Demo — Perceptual Computing SDK [channelintel YouTube channel, April 17, 2013]

            New gesture and voice capabilities shown during Doug Fisher’s keynote at the Intel Developer Forum in Beijing

            which was used in the IDF Beijing 2013 Keynote Demo – Personified Chat [channelintel YouTube channel, April 17, 2013]

            The latest in perceptual computing demonstrated using an example of personified chat at the Intel Developer Forum Beijing.

            IDF 2013 Beijing Highlights Day One [channelintel YouTube channel, April 16, 2013]

Intel® Ultrabook™ Convertible SBA v1 [channelintel YouTube channel, April 2, 2013]

Get the flexibility to move your business forward with the ultra versatile, ultra sleek Ultrabook™. Inspired by Intel®.

Intel® Ultrabook™ Performance SBA v2 [channelintel YouTube channel, April 2, 2013]

            You have big business goals. Reaching them requires the right tools. The ultra responsive, ultra sleek Ultrabook™. Inspired by Intel®.

            3. Intel’s CEO Discusses Q1 2013 Results – Earnings Call Transcript [Seeking Alpha, April 16, 2013]

            Paul Otellini for the second half of the year for sales:

            … as the OEMs start looking at new form factors that they can design around our new chips, Haswell in particular, and maybe Bay Trail, and Windows 8, enabling touch, the explosion in form factors and the competitiveness of that platform is going to be substantially different, at price points down into the $300 to $400 range enabling touch. We didn’t have that last year. So you go into the prime selling season with new products, new technologies, new form factors, and new capabilities that, up to now, were unapproachable price points.

            Paul Otellini regarding his current view on Haswell’s potential to revitalize the PC market with Windows 8:

With Haswell, there’s a number of things. First of all, the overall performance goes up, graphics performance goes up, as well as the integer performance. So it’s a better punch in the package than we’ve had with Ivy Bridge. Point one. Point two, the power envelope, or the battery life for that level of performance, is exceptionally better than Ivy Bridge.

            Third, it gets into the form factor innovation and the integration with touch as I spoke about earlier, which I think is really part of the recipe required for Win 8 adoption. I’ve recently converted personally to Windows 8 with touch, and it is a better Windows than Windows 7 in the desktop mode, when you implement the touch and the touch-based applications and operating environment. It’s just a lot easier to use.

            There is an adoption curve, and once you get over that adoption curve, I don’t think you go back. And we didn’t quite have that same kind of adoption curve in Windows 7 versus XP before it. This requires a little bit of training. And I think people are attracted to touch, and the touch price points today are still fairly high, and they’re coming down very rapidly over the next couple of quarters.

            Paul Otellini about technology transitions:

We’ve also got the technology transition to the 14 nanometers. [unintelligible] a first order, all of our spending is focused on 14 nanometers, which gives us a fairly significant ramp capability. If demand for older products exceeds what we could build on 14, we could still build 22 for quite some time. So I really think it depends on whatever demand scenario you see out there. In any event, the most important thing for us is to make that transition to 14 and continue to have the leading edge.

            Lenovo Talks Tablets [detachable ultrabooks] at HIMSS 13 [channelintel YouTube channel, April 16, 2013]

            At HIMSS 13, Lenovo Ambassador Ashley Rodrigue showed off the company’s new health IT convertible devices that feature the best of both worlds for clinicians—a detachable tablet and an Ultrabook laptop for more robust activity. The benefit for health IT professionals? Just one device to manage. Find out more information and read the latest blog posts on health IT in the Intel Healthcare Community: http://communities.intel.com/community/healthcare

Paul Otellini on what Intel can do to help the PC ecosystem become healthy again:

            I think continue to give them the tools to innovate. And I wouldn’t paint the entire customer base with the same brush that you just did. Certainly if you looked at the last quarter, even inside the PC space, Lenovo outperformed everybody else, and actually had a very good year on year set of numbers, in a down year. Apple continues to do well.

Subsets of customers in different segments are also doing very well, say, those providing products into the internet data centers. What I see when we look out is a tremendous amount of innovation, particularly at the ODM and Taiwanese OEM side, where the ability to miniaturize and bring things into extremely thin form factors is as revolutionary as the amount of changes I’ve seen in my time in this industry.

            And so I think what we can do is give them the products, like Haswell and Bay Trail to innovate around. We can help them with other feature sets like voice and speech that go around them, and just help them build better products.

Paul Otellini on Intel’s efforts to invest in things outside of what could be called the core PC, such as the set-top box, the foundry efforts, or other areas of revenue that the company is seeking:

            I don’t look at things with quite that level of granularity. The foundry thing, the investment is really going to be taking advantage, at least near term, with the current customer base, of capacity that we’re already putting in place. That doesn’t mean that at some point we won’t have to actually build extra capacity for a foundry customer or a foundry business, but today, up to this point, it’s certainly within our ability to absorb.

            The set top box spending, or the stuff we’re doing in Intel Media, in the grand scheme of things, is not a lot of spending. So the real issue is inside of our core microprocessor and platform development, and we’re at the point now where roughly half of our spending is focused on System on Chip, inside the microprocessor world.

            And the System on Chip environment is really a lot of the ultramobile products. It’s the phones, it’s the tablets. It’s embedded systems. It’s automotive, etc. Where we have fairly strong growth opportunities. So it’s not the same monolithic Tick-Tock model that we put in place eight years ago.

            Paul Otellini on the proper interpretation of the new price points mentioned earlier in the earnings call:

            We have a certain spec for ultrabooks, and that is the product that Stacy said is going to be centered at as low as $599 with some [diverse] SKUs to $499. If you look at touch-enabled Intel based notebooks that are ultrathin and light using non-Core processors, those prices are going to be down to as low as $200 probably.

Stacy Smith (Chief Financial Officer, Executive Vice President) on the sources of Intel’s increased confidence now versus three months ago, speaking earlier in the earnings call:

            First of all, just to make sure I’m not oversoaking things here, you really just need seasonal from where we are in order to achieve the low single digit revenue growth. So I don’t think we have a hugely high bar out there, and I went through a dissection of where I think the revenue comes from.

            In terms of the things that give me confidence, or at least I personally believe it could be better than seasonal, it’s the things we talked about, improving macroeconomic environment, the fact that we now are participating across a range of compute devices, and so the mix between those don’t impact us nearly as much.

            And then third, as Paul said, you have innovative form factors coming out in ultrabooks, in convertibles, and in detachables, that are hitting these really compelling mainstream price points that are touch enabled. And as we get into the Christmas selling season, your expectation is you will see touch-enabled ultrabooks that are $499 and $599 pretty commonly out there. $599 commonly, and $499 as kind of special SKUs.

            And then we’ll see, because of Bay Trail coming into the marketplace, you’ll see touch-enabled thin notebooks with really good performance that are hitting kind of $300 price points. And then with our Android tablets, you’ll see things that are significantly, [hey, I have that]. So we’ll be participating across a broad range of compute devices as we get into the back half of this year.


            4. Earlier information from Intel:

Intel Accelerates Mobile Computing Push [press release, Feb 24, 2013]

            NEWS HIGHLIGHTS

            • Launches dual-core Intel® Atom™ Processor-based platform (formerly “Clover Trail+”) aimed at performance and mainstream smartphone market segments, and providing double the compute performance and 3x graphics capabilities1 with competitive battery life. Product to also debut in Android* tablets.
            • Reveals one of the world’s smallest2 and lowest-power multimode-multiband LTE solutions for global roaming in one SKU with envelope tracking and antenna tuning. Shipping single mode now with multimode shipments beginning first half of 2013.
            • Demonstrates continued momentum in emerging markets with Intel® Atom™ Z2420 processor, including new smartphone engagement with Etisalat* in Egypt. ASUS* to also debut a new Android* tablet based on the Atom Z2420 processor.
            • Announces support from leading ODMs for next-generation quad-core Atom SoC (“Bay Trail”), scheduled to be available for holiday 2013.
            • Extends mobile device enabling efforts to tablets, followed by phones.
            MOBILE WORLD CONGRESS, Barcelona, Spain, Feb. 25, 2013 – Intel Corporation today announced a range of new products, ecosystem and enabling efforts that will further accelerate the company’s presence in mobile and help usher in new devices and richer experiences with Intel Inside®.
            The announcements include a new dual-core Atom™ SoC (“Clover Trail+“) platform for smartphones and Android* tablets, and the company’s first global, multimode-multiband LTE solution that will ship in the first half of this year. Other disclosures included “Bay Trail” momentum, mobile device enabling efforts, and continued smartphone momentum in emerging markets with the Intel® Atom™ Z2420 processor-based platform.
            “Today’s announcements build on Intel’s growing device portfolio across a range of mobile market segments,” said Hermann Eul, Intel vice president and co-general manager of the Mobile and Communications Group. “In less than a year’s time we have worked closely with our customers to bring Intel-based smartphones to market in more than 20 countries around the world, and have also delivered an industry-leading low-power Atom™ SoC tablet solution running Windows* 8, and shipping with leading OEM customers today. Looking forward, we will build upon this foundation and work closely with our ecosystem partners, across operating systems, to deliver the best mobile products and experiences for consumers with Intel Inside.”
            New, Efficient Atom™ SoC Platform
Intel’s new Atom™ processor platform (“Clover Trail+“) and smartphone reference design delivers industry-leading performance with low power and long battery life that rivals today’s most popular Android* phones. The product brings Intel’s classic product strengths, including high performance that lets you enjoy smooth Web browsing, vibrant, glitch-free, full HD movies, and an Android* applications experience that launches fast and runs great.
            The platform’s 32nm dual core Intel® Atom™ Processors — Z2580, Z2560, Z2520 — are available in speeds up to 2.0 GHz, 1.6 GHz and 1.2GHz, respectively. The processor also features support for Intel® Hyper-Threading Technology, supporting four simultaneous application threads and further enhancing the overall efficiency of the Atom cores.
            The integrated platform also includes an Intel® Graphics Media Accelerator engine with a graphics core supporting up to 533MHz with boost mode, and delivering up to three times the graphics performance1 for rich 3-D visuals, lifelike gaming and smooth, full 1080P hardware-accelerated video encode and decode at 30fps.
            “Our second-generation product delivers double the compute performance and up to three times the graphics capabilities1, all while maintaining competitive low power,” Eul said. “As we transition to 22nm Atom SoCs later this year, we will take full advantage of the broad spectrum of capabilities enabled by our design, architecture, 22nm tri-gate transistor technology, and leading-edge manufacturing to further accelerate our position.”
            The new Atom platform also brings advanced imaging capabilities, including support for two cameras, with a primary camera sensor up to 16 megapixels. The imaging system also enables panorama capture, a 15 frame-per-second burst mode for 8 megapixel photos, real-time facial detection and recognition, and mobile HDR image capture with de-ghosting for clearer pictures in flight.
            The platform is also equipped with Intel® Identity Protection Technology (Intel IPT), helping to enable strong, two-factor authentication for protecting cloud services such as remote banking, e-commerce, online gaming and social networking from unauthorized access. Since Intel IPT is embedded at chip-level, unlike hardware or phone-based tokens, it can enable more secure, yet user-friendly cloud access protection. Intel is working with partners including Feitian*, Garanti Bank*, MasterCard*, McAfee*, SecureKey* Technologies Inc., Symantec*, Vasco Data Security International* Inc. and Visa* Inc. to incorporate this technology into their services.
            With WUXGA display support of 1920×12003, the platform will also enable larger-screen Android* tablet designs. It also includes support for Android* 4.2 (Jelly Bean), Intel Wireless Display Technology, HSPA+ at 42Mbps with the Intel® XMM 6360 slim modem solution, and the new industry-standard UltraViolet™ Common File Format.
            Customers announcing support for “Clover Trail+” platform for phones and tablets include ASUS*, Lenovo*, and ZTE*.
            Debuting at CES last month, the Lenovo* IdeaPhone K900* is based on the Intel® Atom™ processor Z2580 and delivers rich video, graphics and Web content at fantastic speeds. The IdeaPhone is 6.9mm thin and also features the world’s first 5.5-inch full high-definition 400+ PPI screen for increased clarity of text and images. The K900 will be the first product to market based on the Atom processor Z2580. Lenovo plans to introduce the smartphone in the second quarter of 2013 in China, followed soon by select international markets.
            Building on the Atom processor platform (“Clover Trail+”), Intel also highlighted its forthcoming 22nm smartphone Atom™ SoC (“Merrifield“). The product is based on Intel’s leading-edge 22nm process and an entirely new Atom microarchitecture that will help enable increased smartphone performance, power efficiency and battery life.
            Long-Term Evolution (4G LTE)
            Intel’s strategy is to deliver a leading low-power, global modem solution that works across multiple bands, modes, regions and devices.
            The Intel XMM 7160 is one of the world’s smallest2 and lowest-power multimode-multiband LTE solutions (LTE / DC-HSPA+ / EDGE), supporting multiple devices including smartphones, tablets and Ultrabooks™.  The 7160 global modem supports 15 LTE bands simultaneously, more than any other in-market solution. It also includes a highly configurable RF architecture running real time algorithms for envelope tracking and antenna tuning that enables cost-efficient multiband configurations, extended battery life, and global roaming in a single SKU.
            “The 7160 is a well-timed and highly competitive 4G LTE solution that we expect will meet the growing needs of the emerging global 4G market,” Eul said. “Independent analysts have shown our solution to be world class and I’m confident that our offerings will lead Intel into new multi-comm solutions. With LTE connections projected to double over the next 12 months to more than 120 million connections, we believe our solution will give developers and service providers a single competitive offering while delivering to consumers the best global 4G experience. Building on this, Intel will also accelerate the delivery of new advanced features to be timed with future advanced 4G network deployments.”
            Intel is currently shipping its single mode 4G LTE data solution and will begin multimode shipments later in the first half of this year. The company is also optimizing its LTE solutions concurrently with its SoC roadmap to ensure the delivery of leading-edge low-power combined solutions to the marketplace.
            Intel® Atom™ Platform Z2420
            As Intel expands its geographic presence, the company sees tremendous opportunity in delivering rich Intel-based mobile experiences to consumers across emerging markets.
            As part of its strategy to take advantage of the fast growing market for value smartphones in emerging markets, which some analysts expect to reach 500 million units by 2015, Intel highlighted continuing momentum with the Intel Atom Processor Z2420 platform (formerly “Lexington“). Since it was first announced at CES, Acer* (Thailand, Malaysia), Lava* (India) and Safaricom* (Kenya) have all announced new handsets.
            Etisalat Misr*, a leading telco operator based in Egypt and a subsidiary of Etisalat group UAE, in collaboration with Intel today announced plans for the Etisalat E-20 Smartphone with Intel Inside®. Set to debut in Egypt in April, the Intel-based handset will be the first in the Middle East and North Africa region, and the second introduction in Africa to-date, building on the recent launch of Safaricom* in Kenya.
            Demonstrating the flexibility of the Atom SoC platform to accommodate a range of device and market segment needs, ASUS* later today will announce a new Android* tablet based on the Intel® Atom™ Processor Z2420.

            Tablets with Intel Inside®

            Building on the device momentum and industry-leading power-efficiency of the award-winning Atom processor Z2760, Intel’s first quad-core Atom SoC (“Bay Trail”), will be the most powerful Atom processor to-date — doubling the computing performance of Intel’s current-generation tablet offering and providing the ecosystem with a strong technology foundation and feature set from which to innovate. The “Bay Trail” platform, scheduled to be available for holiday 2013, is already up and running on Windows* and Android* and will help enable new experiences in designs as thin as 8mm that have all-day battery life and weeks of standby.

            Intel is currently working with Compal*, ECS*, Pegatron*, Quanta* and Wistron* to accelerate “Bay Trail” tablets to the market. Intel is also extending its work with leading OEM partners globally, building on the strong foundation of  Intel Atom processor Z2760-based tablet designs in market from Acer*, ASUS*, Dell*, Fujitsu*, HP*, Lenovo*, LG Electronics and Samsung*.

            Enabling Mobile Devices with Intel Inside®

            Intel today announced an expansion of its ecosystem enabling efforts to deliver new device and market innovations across a range of Windows*- and Android*-based mobile devices.

            Intel platform and enabling programs have been the foundation of OEM and ODM innovation for decades. The new program will focus on accelerating time to market for leading-edge mobile devices based on Intel® architecture with top OEMs and ODMs. The program will focus first on tablets, followed by phones, providing pre-qualified solutions with simplified building blocks to scale designs quickly for mature and emerging markets. The Atom Processor Z2760 and the company’s forthcoming 22nm Atom SoC, codenamed “Bay Trail,” will be the starting foundation for the effort.

            1 Compared to the Intel Atom Processor Z2460 platform; Graphics clock will vary based on SKU: Z2580, Z2560, Z2520.
            2 Compared with competitive solutions shipping in market today.
            3 Corrected from misprinted ‘1900×1200’ to ‘1920×1200’ – Feb. 27, 2013

            Intel Developer Forum: Transforming Computing Experiences  from the Device to the Cloud [press release, April 10, 2013]
            Images are inserted from Reinventing the Computing Experience presentation at IDF2013 by Kirk Skaugen, Intel senior vice president and general manager of the PC Client Group

            Company Accelerates Expansion of 22nm Data Center Processor Families; Graphics Innovations, Intel® Wireless Display Coming to Next-Generation Ultrabooks

            NEWS HIGHLIGHTS

            • Accelerates expansion of offerings across the data center processor product lines based on Intel’s innovative 22nm manufacturing technology.
            • Aims to revolutionize the server rack design by delivering an Intel rack scale architecture for increased flexibility, density and utilization of servers leading to lower total cost of ownership.
            • Next-generation, 64-bit Intel® Atom™ processor for microservers, codenamed “Avoton,” is being sampled to customers with broad availability expected in the second half of this year.
            • 4th generation Intel® Core™ processors are now shipping to customers and will launch later this quarter.
            INTEL DEVELOPER FORUM, Beijing, April 10, 2013 – During Intel Corporation’s annual developer forum this week, company executives announced new technologies and partnerships aimed at transforming how people experience technology from the device to the cloud. The announcements included details on new data center product lines based on the 22-nanometer (nm) process technology and the new Intel rack scale architecture, along with details on the forthcoming 4th generation Intel® Core™ processor family.
            During her keynote, Diane Bryant, Intel senior vice president and general manager of the Datacenter and Connected Systems Group, underscored the importance of the data center in enabling amazing personal computing experiences to deliver real-time information and services. She also outlined the steps Intel is taking to provide the hardware and software needed for data analytics to improve the capabilities of intelligent devices and data center infrastructure.
            “People are increasingly demanding more from their devices through applications and services whether at home, at work or wherever they may be,” Bryant said. “Intel is delivering a powerful portfolio of hardware and software computing technologies from the device to the data center that can improve experiences and enable new services.”
            Bryant outlined plans to accelerate the expansion of Intel’s offerings across the data center processor product lines based on its innovative 22nm manufacturing technology before the end of the year, thereby enabling a more cost-effective and efficient data center infrastructure. Intel’s broad portfolio of data center intellectual property enables Intel to quickly integrate features into new products and bring them to market. For example, Intel is launching the new Intel® Atom™ S12x9 processor family customized for storage today, just four months after the debut of the Intel Atom S1200 processor for microservers.
            Intel plans to deliver two more Intel Atom processor-based products this year that promise to deliver new architectures, improved performance-per-watt and an expanded feature set. Bryant demonstrated for the first time the next-generation Intel Atom processor family for microservers, codenamed “Avoton,” and confirmed it is currently shipping samples to customers for evaluation. Avoton will feature an integrated Ethernet controller and is expected to deliver industry-leading energy efficiency and performance-per-watt for microservers and scale out workloads.
            Re-Architecting the Data Center
            Bryant also revealed details on Intel’s plans to develop a reference design for rack scale architecture that uses a suite of Intel technologies optimized for deployment as a full rack. Hyper-scale data centers run by companies that maintain thousands of servers and store vast amounts of data require continued advancements in rack designs that make it easier and more cost effective to deal with major growth in users, data and devices. Traditional rack systems are designed to handle a wide variety of application workloads and may not always achieve the highest efficiency under all hyper-scale usages. The reference design will help re-architect a rack level solution that is modular at the subsystem level (storage, CPU, memory, network) while providing the ability to provision and refresh or logically allocate resources based on application specific workload requirements. Benefits include increased flexibility, higher density and higher utilization leading to a lower total cost of ownership.
            Additional information on these announcements as well as the new Intel Atom processor S12x9 product family for storage servers, Intel® Xeon® processor E3v3 product family, Intel Xeon processor E7v2 product family and Intel Atom processor for communication and networking devices codenamed “Rangeley” is available in the news fact sheet.

            Reinventing the Computing Experience

            During his keynote, Kirk Skaugen, Intel senior vice president and general manager of the PC Client Group, provided a deeper look at the forthcoming 4th generation Intel Core processor family, which he said is now shipping to OEM customers and will launch later this quarter.

            “Ultrabooks based on the 4th generation Intel Core processor family will enable exciting, new computing experiences and all-day battery life, delivering the most significant battery life capability improvement in Intel’s history,” said Skaugen. “It will also bring to consumers a new wave of ‘two-for-one’ convertible and detachable systems that combine the best of a full PC experience with the best of a tablet in amazing new form factors.”

            image

            image

            image

            image

            image

            NEW Architecture on 22nm Tri Gate

            NEW Intel Power Optimizer: 20x Power Reduction
            vs. 2nd gen Intel® Core™ Processors

            NEW Integrated on package PCH [Platform Controller Hub]
            for amazing form factors

            NEW Integrated Audio DSP: more battery life, higher quality

            Shipping Now and On Track for Q2 2013 Launch

            The new Intel Core microarchitecture will allow the company to deliver up to double the graphics performance over the previous generation. In addition, the new graphics solution will have high levels of integration to enable new form factors and designs with excellent visual quality built in. Skaugen demonstrated these graphics improvements on the 4th generation Intel Core processor-based Ultrabook reference design called “Harris Beach.” The demo featured Dirt 3*, a popular gaming title, showing the same visual experience and game play as a discrete graphics card that users would otherwise have to add separately. He also showed the 4th generation Intel Core processor-based concept, codenamed “Niagara,” a premium notebook with the ability to play the unreleased enthusiast title Grid 2* from CodeMasters* without the aid of a discrete graphics card.

            Along with touch capability, Intel® Wireless Display (Intel WiDi) will be enabled on all 4th generation Intel Core processor-based Ultrabook devices to allow people to quickly and securely stream content and apps from devices to the big screen, free from the burden of cables. Skaugen said the China ecosystem is taking the lead on integrating Intel WiDi into systems, and announced that the leading television manufacturer in China, TCL*, has a new model with the Intel WiDi technology built in. He also announced new receivers certified for Intel WiDi from QVOD* and Lenovo* and a set-top box from Gehua*.

            Illustrating the low-power advances in Ultrabook devices, Skaugen showed off the new Toshiba Portege* Ultrabook detachable, based on the new low-power line of the 3rd generation Intel® Core™ processors.

            image

            image

            Furthermore, Skaugen revealed that voice interaction in Mandarin is now available on Ultrabook devices from Intel through Nuance*.

            Augmenting the company’s offerings for computing at a variety of price points, Skaugen announced plans for new market variants of its “Bay Trail” 22nm SoC with PC feature sets specifically designed for value convertibles, clamshell laptops, desktops and value all-in-one computers to ship later this year.

            image

            Mobile Inside
            Tan Weng Kuan, vice president and general manager of the Mobile Communications Group, Intel China, highlighted how the company is working with ecosystem partners to deliver the best smartphone and tablet experiences with Intel inside. Tan discussed the company’s progress with the new Intel® Atom™ processor Z2580 (“Clover Trail+“) for smartphones and the Intel Atom Processor Z2760 (“Clover Trail“) for tablets, both of which are helping to usher in a range of new devices and user experiences.

            image

            Taking full advantage of the broad spectrum of capabilities enabled by Intel® architecture, processor technology leadership, manufacturing and multi OS support across Windows* 8 and Android*, Tan discussed the company’s forthcoming smartphone and tablet products based on Intel’s leading-edge 22nm process and an entirely new Atom microarchitecture. Intel’s quad-core Atom SoC (“Bay Trail“) will be the most powerful Atom processor to-date, doubling the computing performance of Intel’s current-generation tablet offering1. Scheduled for holiday 2013 tablets [in market Q4’13], “Bay Trail” will help enable new experiences and designs as thin as 8mm that have all-day battery life and weeks of standby.

            Tan also highlighted Intel’s Atom SoC, codenamed “Merrifield,” which is scheduled to ship to customers by the end of this year [in market Q1’14]. The product will deliver increased smartphone performance, power efficiency and battery life over the current-generation offering.
            Tan closed his remarks by calling upon China developers for collective innovation in helping to accelerate and grow the mobile market together. He announced the creation of a China-specific expansion of the company’s platform and ecosystem enabling efforts, focused initially on Atom processor-based tablets running Android*, and designed to speed time-to-market of leading-edge mobile devices based on Intel technology. He added that China developers are instrumental to this effort and will bring speed, scale and ingenuity that will drive new innovation globally.
            Day 2 IDF Preview
            Doug Fisher, vice president and general manager of Intel’s System Software Division, will open the second day of IDF, addressing several myths surrounding the industry and providing a vision on the vast opportunities that await developers. Specifically, he will showcase Intel’s transformation of the PC experience and advances in device segments, support of multiple operating environments and efforts to help developers scale and modernize computing with new hardware features and software advancements for more compelling user experiences. He will discuss how developers can utilize HTML5 to help lower total costs and improve time-to-market for cross-platform applications development and deployment, incorporate touch and sensor interfaces to modernize applications, and use perceptual compute technologies to enable consumers to interact with PCs via voice control, gesture recognition and more.
            Intel Chief Technology Officer Justin Rattner will also take the stage to discuss how Intel Labs is drawing up plans for a bright future. He will reveal a vision for connected and sustainable cities where information technology helps to address challenges of clean air, clean water, better health and improved safety. He will also explain how today’s mobile, urban lifestyle is demanding faster and cheaper wireless broadband communications. Forecasting a move beyond the information age, Rattner will describe a new era coined “the data society” and show how information in the cloud will work on everyone’s behalf, collaboratively and safely, by analyzing and relating different data to deliver new value to individuals, enterprises and society as a whole. Rattner plans to surprise the audience with an exclusive first look at Intel® Silicon Photonics Technology.

            This was summarized by Intel in a New Ultrabook™ experiences unveiled at IDF Beijing 2013 [Intel Developer Zone blog, April 16, 2013] post as follows:

            Last week at the Intel® Developer Forum held April 10-11, 2013 in Beijing, China, Ultrabooks™ were in the spotlight as new experiences based on the 4th generation Intel® Core™ processor family were announced:

            “Ultrabooks based on the 4th generation Intel Core processor family will enable exciting, new computing experiences and all-day battery life delivering the most significant battery life capability improvement in Intel’s history,” said Skaugen. “It will also bring to consumers a new wave of ‘two-for-one’ convertible and detachable systems that combine the best of a full PC experience with the best of a tablet in amazing new form factors.” – Kirk Skaugen, Intel senior vice president and general manager of the PC Client Group

            There are three major factors in this new announcement: amazing graphics, even more Ultrabook form factor designs, and low-power advances creating longer battery life. Touch capability will also be part of this new generation of devices, along with Intel® Wireless Display (Intel WiDi) enabled on all 4th generation Intel Core processor-based Ultrabook devices to allow people to quickly and securely stream content and apps from devices to the big screen.

            4th generation Intel® Core™ processors

            The Ultrabook computing category was first introduced in 2011 with a 2nd generation Intel® Core™ processor. This was ramped up greatly in 2012 with the addition of touch and mainstream price points, along with the 3rd generation Intel Core processor. In 2013, we get to experience a 4th generation Intel Core processor and the concept of “2 for 1” computing; basically, we get a tablet experience and a PC experience in one machine:

            “The new Intel Core microarchitecture will allow the company to deliver up to double the graphics performance over the previous generation. In addition, the new graphics solution will have high levels of integration to enable new form factors and designs with excellent visual quality built in. Skaugen demonstrated these graphics improvements on the 4th generation Intel Core processor-based Ultrabook reference design called “Harris Beach.” The demo featured Dirt 3*, a popular gaming title, showing the same visual experience and game play as a discrete graphics card that users would otherwise have to add separately. He also showed the 4th generation Intel Core processor-based concept, codenamed “Niagara,” a premium notebook with the ability to play the unreleased enthusiast title Grid 2* from CodeMasters* without the aid of a discrete graphics card.” –Intel Newsroom

            These new processors will include:

            • new architecture on 22nm Tri Gate
            • Intel Power Optimizer: 20x power reduction vs. 2nd gen Intel Core Processors
            • integrated on package PCH for amazing form factors
            • integrated audio DSP which means more battery life and higher quality

            Graphics

            With this new generation of processors comes increasingly higher level graphics support, including:

            • 3D graphics with up to 2x performance
            • integrated on-package EDRAM memory
            • API support
            • Display with new 3-screen collage display
            • enhanced 4k x 2k support
            • 2x bandwidth with display port 1.2
            • Media with new faster Intel Quick Sync Video
            • faster JPEG and MPEG decode
            • new OpenCL 1.2 support

            (Source: IDF Keynote)

            Touch

            Touch is becoming more mainstream, and more consumers than ever before are expecting touch as a standard addition to their devices. In an Intel study of touch carried out in December of 2011, users chose touch nearly 80% of the time when given the choice between touch, keyboard, mouse, and track pad. These findings were echoed in another touch study by UX Innovation Manager Daria Loi:

            “With touch capability becoming available in more and more Ultrabook devices, Intel undertook a research program to better understand if and how people might use touch capabilities in more traditional, notebook form-factor devices… To spoil the ending, the results were positive; very positive, in fact. Users who were presented with a way to interact with their computers via touch, keyboard, and mouse found it an extremely natural and fluid way of working. One user described it using the Italian word simpatico: literally, that her computer was in tune with her and sympathetic to her demands.” – “The Human Touch: Building Ultrabook™ Applications in a Post-PC Age” [Intel Developer Zone blog, July 11, 2012]

            Touch designs in Ultrabook form factors continue to ramp up, especially with the October 2012 launch of Windows* 8, and this trend is expected to continue.

            Power

            One of the most intriguing announcements to come out of Beijing was the sharply reduced power consumption for the Ultrabook. Chips for notebooks, phones, and tablets are going to be greatly enhanced, improving both runtime and standby battery life:

            “By effectively removing nearly 3W of background drain, all operations are going to benefit, not just idle. Where Internet browsing was a 9W operation, expect to see that go down to around 6W for a big increase in battery life… By reducing the mainboard size, space is created for more battery. Intel says there’s a chance to fit 20-45% more battery inside when motherboard sizes are reduced using HDI techniques.” – Ultrabooknews.com
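The battery-life arithmetic behind that quote can be checked directly. A minimal sketch, using the browsing figures quoted above (9W dropping to roughly 6W); the 50 Wh battery capacity is a hypothetical example value, not from the announcement:

```python
# Battery-life impact of removing ~3 W of background drain,
# per the figures quoted above (9 W browsing -> ~6 W).
# The 50 Wh battery capacity is a hypothetical example value.
battery_wh = 50.0
old_draw_w = 9.0
new_draw_w = 9.0 - 3.0  # ~3 W background drain removed

old_hours = battery_wh / old_draw_w
new_hours = battery_wh / new_draw_w
gain = (new_hours - old_hours) / old_hours

print(f"{old_hours:.1f} h -> {new_hours:.1f} h ({gain:.0%} longer)")
# → 5.6 h -> 8.3 h (50% longer)
```

Cutting a third of the draw buys roughly 50% more runtime before the battery-size gains from smaller mainboards are even counted.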

            These power-efficiency expectations tie in with the announcement of 4th generation Intel Core processor Ultrabook systems, which are coming out as early as June 2013 and are on track for a Q2 2013 launch.

            Ultrabooks: just getting started

            The experience you can expect from an Ultrabook with the new 4th generation Core processor is, in a word, superior. These are extremely responsive machines that offer amazing performance, a natural UI with touch and voice, and AOAC (always on, always connected) as a given. You also get to take advantage of Intel Identity Protection, anti-virus, facial log-in, vPro, and Small Business Advantage, so your data is always safe. The machine itself is meant to be mobile, with all-day battery life, thinner and lighter designs, and Intel Wireless Display. And let’s not forget that it just looks cool: great visuals, 2-in-1 convertible and detachable form factors, not to mention a high-res display.

            Ultrabook as a PC category is continuing to drive market innovation; we’re seeing thinner form factors, intriguing designs (convertibles, detachable, etc.), and more natural human/computer interaction, such as voice control integration. Ultrabooks are able to deliver what is essentially a mobile computing experience; we’re looking at consumption usages similar to that of a smartphone or a tablet, with the productivity potential and sheer computing power of that of a full-blown PC. Is it a notebook or is it a tablet? The beauty of an Ultrabook is that it’s both.

            Software defined server without Microsoft: HP Moonshot

            Updates as of Dec 6, 2013 (8 months after the original post):

            image

            Martin Fink, CTO and Director of HP Labs, Hewlett-Packard Company [Oct 29, 2013]:

            This Cloud, Social, Big Data and Mobile we are referring to as this “New Style of IT” [when talking about the slide shown above]

            Through the Telescope: 3 Minutes on HP Moonshot [HewlettPackardVideos YouTube channel, July 24, 2013]

            Steven Hagler (Senior Director, HP Americas Moonshot) provides insight on Moonshot, why it’s right for the market, and what it means for your business. http://hp.com/go/moonshot

            HIGHLY RECOMMENDED READING:
            HP Offers Exclusive Peek Inside Impending Moonshot Servers [Enterprise Tech, Nov 26, 2013]: “The company is getting ready to launch a bunch of new server nodes for Moonshot in a few weeks”.
            – So far, the simplest and clearest information is provided in the Visual Configuration Moonshot diagram set: http://www.goldeneggs.fi/documents/GE-HP-MOONSHOT-A.pdf  The site also includes full visualizations of all x86 rack, desktop and blade servers.

            From HP Launches Investment Solutions to Ease Organizations’ Transitions to “New Style of IT” [press release, Dec 6, 2013]

            The HP accelerated migration program for cloud—helps …

            The HP Pre-Provisioning Solution—lets …

            New investment solutions for HP Moonshot servers and HP Converged Systems—provide customers and channel partners with quick access to the latest HP products through a simple, scalable and predictable monthly payment that aligns technology and financial requirements to business needs.   

            Access the world’s first software defined server [HP offering, Nov 27, 2013]
            With predictable and scalable monthly payments

            HP Moonshot Financing
            Cloud, Mobility, Security and Big Data require a different level of technology efficiency and scalability. Traditional systems may no longer be able to handle the increasing internet workloads with optimal performance. Having an investment strategy that gives you access to newer technology such as HP Moonshot allows you to meet the requirements for the New Style of IT.
            A simple and flexible payment structure can help you access the latest technology on your terms.
            Why leverage a predictable monthly payment?
            • Provides financial flexibility to scale up your business
            • May help mitigate the financial risk of your IT transformation
            • Enables IT refresh cycles to keep up with the latest technology
            • May help improve your cash flow
            • Offers predictable monthly payments which can help you stay within budget
            How does it work?
            • Talk to your HP Sales Rep about acquiring HP Moonshot using a predictable monthly payment
            • Expand your capacity easily with a simple add-on payment
            • Add spare capacity needed for even greater agility
            • Set your payment terms based on your business needs
            • After an agreed term, you’ll be able to refresh your technology

            From The HP Moonshot team provides answers to your questions about the datacenter of the future [The HP Blog Hub, as of Aug 29, 2013]

            Q: WHAT IS THE FUNDAMENTAL IDEA BEHIND THE HP MOONSHOT SYSTEM?

            A: The idea is simple—use energy-efficient CPUs attuned to a particular application to achieve radical power, space and cost savings. Stated another way: creating software defined servers for specific applications that run at scale.

            Q: WHAT IS INNOVATIVE ABOUT THE HP MOONSHOT ARCHITECTURE?

            A: The most innovative characteristic of HP Moonshot is the architecture. Everything that is a common resource in a traditional server has been converged into the chassis. The power, cooling, management, fabric, switches and uplinks are all shared across 45 hot-pluggable cartridges in a 4.3U chassis.

            Q: EXPLAIN WHAT IS MEANT BY “SOFTWARE DEFINED” SERVER

            A: Software defined servers achieve optimal useful work per watt by specializing for a given workload: matching a software application with available technology that can provide the best performance. For example, the first Moonshot server is tuned for the web front end LAMP (Linux/Apache/MySQL/PHP) stack. In the most extreme case of a future FPGA (Field Programmable Gate Array) cartridge, the hardware truly reflects the exact algorithm required.

            Q: DESCRIBE THE FABRIC THAT HAS BEEN INTEGRATED INTO THE CHASSIS

            A: The HP Moonshot 1500 Chassis has been built for future SoC designs that will require a range of network capabilities, including cartridge-to-cartridge interconnect. Additionally, different workloads will have a range of storage needs.

            There are four separate and independent fabrics that support a range of current and future capabilities: 8 lanes of Ethernet; a storage fabric (6Gb SATA) that enables shared storage amongst cartridges or storage expansion to a single cartridge; a dedicated iLO management network to manage all the servers as one; and a cluster fabric with point-to-point connectivity and low-latency interconnect between servers.

            image

            Martin Fink, CTO and Director of HP Labs, Hewlett-Packard Company [Oct 29, 2013]:

            We’ve actually announced three ARM-based cartridges. These are available in our Discovery Labs now, and they’ll be shipping next year with new processor technology. [When talking about the slide shown above.]

            Calxeda Midway in HP Moonshot [Janet Bartleson YouTube channel, Oct 28, 2013]

            HP’s Paul Santeler encourages you to test Calxeda’s Midway-based Moonshot server cartridges in the HP Discovery Labs. http://www.hp.com/go/moonshot http://www.calxeda.com

            See details about the latest and future Calxeda SoCs in the closing part of this Dec 7 update

            @SC13: HP Moonshot ProLiant m800 Server Cartridge with Texas Instruments [Janet Bartleson YouTube channel, Nov 26, 2013]

            @SC13, Texas Instruments’ Arnon Friedmann shows the HP ProLiant m800 Server Cartridge with 4 66K2H12 KeyStone II SoCs, each with 4 ARM Cortex-A15 cores and 8 C66x DSP cores, altogether providing 500 gigaflops of DSP performance and 8 gigabytes of data on the server cartridge. It’s lower power and lower cost than traditional servers.

            See details about the latest Texas Instruments DSP+ARM SoCs after the Calxeda section in the closing part of this Dec 7 update

            The New Style of IT & HP Moonshot: Keynote by HP’s Martin Fink at ARM TechCon ’13 [ARMflix YouTube channel, recorded on Oct 29, published on Nov 11, 2013]

            Keynote Presentation: The New Style of IT
            Speaker: Martin Fink, CTO and Director of HP Labs, Hewlett-Packard Company
            It’s an exciting time to be in technology. The IT industry is at a major inflection point driven by four generation-defining trends: the cloud, social, Big Data, and mobile. These trends are forever changing how consumers and businesses communicate, collaborate, and access information. And to accommodate these changes, enterprises, governments and fast growing companies desperately need a “New Style of IT.” Shaping the future of IT starts with a radically different approach to how we think about compute — for example, in servers, HP has a game-changing new category that requires 80% less space, uses 89% less energy, costs 77% less–and is 97% less complex. There’s never been a better time to be part of the ecosystem and usher in the next-generation of innovation.

            From Big Data and the future of computing – A conversation with John Sontag [HP Enterprise 20/20 Blog, October 28, 2013]

            20/20 Team: Where is HP today in terms of helping everyone become a data scientist?
            John Sontag: For that to happen we need a set of tools that allow us to be data scientists in more than the ad hoc way I just described. These tools should let us operate productively and repeatably, using vocabulary that we can share – so that each of us doesn’t have to learn the same lessons over and over again. Currently at HP, we’re building a software tool set that is helping people find value in the data they’re already surrounded by. We have HAVEn for data management, which includes the Vertica data store, and Autonomy for analysis. For enterprise security we have ArcSight and ThreatCentral. We have our work around StoreOnce to compress things, and Express Query to allow us to consume data in huge volumes. Then we have hardware initiatives like Moonshot, which is bringing different kinds of accelerators to bear so we can actually change how fast – and how effectively – we can chew on data.
            20/20 Team: And how is HP Labs helping shape where we are going?
            John Sontag: One thing we’re doing on the software front is creating new ways to interrogate data in real time through an interface that doesn’t require you to be a computer scientist.  We’re also looking at how we present the answers you get in a way that brings attention to the things you most need to be aware of. And then we’re thinking about how to let people who don’t have massive compute resources at their disposal also become data scientists.
            20/20 Team: What’s the answer to that?
            John Sontag: For that, we need to rethink the nature of the computer itself. If Moonshot is helping us make computers smaller and less energy-hungry, then our work on memristors will allow us to collapse the old processor/memory/storage hierarchy, and put processing right next to the data. Next, our work on photonics will help collapse the communication fabric and bring these very large scales into closer proximity. That lets us combine systems in new and interesting ways. And then we’re thinking about how to package these re-imagined computers into boxes of different sizes that match the needs of everyone from the individual to the massive, multinational entity. On top of all that, we need to reduce costs – if we tried to process all the data that we’re predicting we’ll want to at today’s prices, we’d collapse the world economy – and we need to think about how we secure and manage that data, and how we deliver algorithms that let us transform it fast enough so that you, your colleagues, and partners across the world can conduct experiments on this data literally as fast as we can think them up.
            About John Sontag:
            John Sontag is vice president and director of systems research at HP Labs. The systems research organization is responsible for research in memristor, photonics, physical and system architectures, storing data at high volume, velocity and variety, and operating systems. Together with HP business units and partners, the team reaches from basic research to advanced development of key technologies.
            With more than 30 years of experience at HP in systems and operating system design and research, Sontag has had a variety of leadership roles in the development of HP-UX on PA-RISC and IPF, including 64-bit systems, support for multiple input/output systems, multi-system availability and Symmetric Multi-Processing scaling for OLTP and web servers.
            Sontag received a bachelor of science degree in electrical engineering from Carnegie Mellon University.

            Meet the Innovators [HewlettPackardVideos YouTube channel, May 23, 2013]

            Meet those behind the innovative technology that is HP Project Moonshot http://www.hp.com/go/moonshot

            From Meet the innovators behind the design and development of Project Moonshot [The HP Blog Hub, June 6, 2013]

            This video introduces you to key HP team members who were part of the team that brings you the innovative technology that fundamentally changes how hyperscale servers are built and operated such as:
            • Chandrakant Patel – HP Senior Fellow and HP Labs Chief Engineer
            • Paul Santeler  – Senior Vice President and General Manager of the HyperScale Business Unit
            • Kelly Pracht – Moonshot Hardware Platform Manager, HyperScale Business Unit
            • Dwight Barron – HP Fellow, Chief Technologist, HyperScale Business Unit

            From Six IT technologies to watch [HP Enterprise 20/20 Blog, Sept 5, 2013]

            1. Software-defined everything
            Over the last couple of years we have heard a lot about software defined networks (SDN) and more recently, software defined data center (SDDC). There are fundamentally two ways to implement a cloud. Either you take the approach of the major public cloud providers, combining low-cost skinless servers with commodity storage, linked through cheap networking. You establish racks and racks of them. It’s probably the cheapest solution, but you have to implement all the management and optimization yourself. You can use software tools to do so, but you will have to develop the policies, the workflows and the automation.
            Alternatively you can use what is becoming known as “converged infrastructure,” a term originally coined by HP, but now used by all our competitors. Servers, storage and networking are integrated in a single rack, or a series of interconnected ones, and the management and orchestration software included in the offering, provides an optimal use of the environment. You get increased flexibility and are able to respond faster to requests and opportunities.
            We all know that different workloads require different characteristics. Infrastructures are typically implemented using general purpose configurations that have been optimized to address a very large variety of workloads. So, they do an average job for each. What if we could change the configuration automatically whenever the workload changes to ensure optimal usage of the infrastructure for each workload? This is precisely the concept of software defined environments. Configurations are no longer stored in the hardware, but adapted as and when required. Obviously this requires more advanced software that is capable of reconfiguring the resources.
            A software-defined data center is described as a data center where the infrastructure is virtualized and also delivered as a service. Control of the data center is automated by software – meaning hardware configuration is maintained through intelligent software systems. Three core components comprise the SDDC: server virtualization, network virtualization and storage virtualization. It remains to be said that some workloads still require physical systems (often referred to as bare metal), hence the importance of projects such as OpenStack’s Ironic, which could be defined as a hypervisor for physical environments.
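The reconfiguration idea described above can be sketched in a few lines of Python. This is purely illustrative — the profile names and numbers below are invented for the sketch and do not come from HP:

```python
# Toy sketch of a "software defined environment": pick a hardware profile
# matched to the workload instead of one general-purpose configuration.
# All profile names and numbers here are hypothetical.

WORKLOAD_PROFILES = {
    "web-frontend":    {"cores": 4, "mem_gb": 8,  "accel": None},
    "video-transcode": {"cores": 4, "mem_gb": 8,  "accel": "dsp"},
    "analytics":       {"cores": 8, "mem_gb": 16, "accel": None},
}

def configure_node(workload: str) -> dict:
    """Return the (hypothetical) node configuration matched to a workload."""
    if workload not in WORKLOAD_PROFILES:
        raise ValueError(f"no profile for workload {workload!r}")
    return WORKLOAD_PROFILES[workload]

print(configure_node("video-transcode"))
```

The point of the sketch is only that the configuration lives in software and is selected per workload, rather than being fixed in the hardware.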

            2. Specialized servers

            As I mentioned, not all workloads are equal, yet they run on the same general-purpose servers (typically x86). What if we created servers that are optimized for specific workloads? In particular, when developing cloud environments delivering multi-tenant SaaS services, one could well envisage the use of servers specialized for a specific task, for example video manipulation or dynamic web service management. Developing efficient, low-energy specialized servers that can be configured through software is what HP’s Project Moonshot is all about. Today it is still in its infancy, but there is much more to come. Imagine about 45 server/storage cartridges linked through three fabrics (for networking, storage and high-speed cartridge-to-cartridge interconnections), sharing common elements such as network controllers, management functions and power management. If you then build the cartridges using low-energy servers, you reduce energy consumption by nearly 90%. If you build SaaS-type environments using multi-tenant application modules, do you still need virtualization? This simplifies the environment, reduces the cost of running it and optimizes the use of server technology for every workload.

            Particularly for environments that constantly run certain types of workloads, such as analyzing social or sensor data, the use of specialized servers can make the difference. This is definitely an evolution to watch.

            3. Photonics

            Let’s now complement those specialized servers with photonic based connections enabling flat, hyper-efficient networks boosting bandwidth, and we have an environment that is optimized to deliver the complex tasks of analyzing and acting upon signals provided by the environment in its largest sense.

            But technology is going even further. I talked about the three fabrics; over time, why not use photonics to improve the speed of the fabrics themselves, increasing the overall compute speed? We are not there yet, but early experiments with photonic backplanes for blade systems have shown overall compute speed increased by up to a factor of seven. That should be the second step.

            The third step takes things further. The specialized servers I talked about are typically system on a chip (SoC) servers, in other words, complete computers on a single chip. Why not use photonics to link those chips with their outside world? On-chip lasers have been developed in prototypes, so we are not that far out. We could even bring things one step further and use photonics within the chip itself, but that is still a little further out. I can’t tell you the increase in compute power that such evolutions will provide you, but I would expect it to be huge.

            4. Storage
            Storage is at a crossroads. On the one hand, hard disk drives (HDDs) have improved drastically over the last 20 years, both in read speed and in density. I still remember the 20MB hard disk drive, weighing 125 kg, of the early ’80s. When I compare that with the 3TB drive I bought a couple of months ago for my home PC, I can easily depict this evolution. But then the SSD (solid state disk) appeared. Where an HDD read will take you 4 ms, an SSD read is down at 0.05 ms.
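Those two latency figures imply an 80x difference, which a one-line calculation (mine, not from the original post) confirms:

```python
# Checking the latency figures quoted above: a 4 ms HDD read vs a
# 0.05 ms SSD read is an 80x difference.
hdd_read_ms = 4.0
ssd_read_ms = 0.05
speedup = hdd_read_ms / ssd_read_ms
print(f"SSD read is {speedup:.0f}x faster than HDD")  # prints "SSD read is 80x faster than HDD"
```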

            Using nanotechnologies, HP Labs has developed prototypes of the memristor, a new approach to data storage that is faster than Flash memory and consumes far less energy. Such a device could store up to 1 petabit of information per square centimeter and could replace both memory and storage, speeding up access to data and allowing an order-of-magnitude increase in the amount of data stored. Since then, HP has been busy preparing production of these devices. The first production units should be available towards the end of 2013 or early in 2014. It will transform our storage approaches completely.


            Details about the latest and future Calxeda SoCs:

            Calxeda EnergyCore ECX-2000 family – ARM TechCon ’13 [ARMflix YouTube channel, recorded on Oct 30, 2013]

            Calxeda tells us about their new EnergyCore ECX-2000 product line based on ARM Cortex-A15. http://www.calxeda.com/ecx-2000-family/

            From ECX-2000 Product Brief [October, 2013]

            The Calxeda EnergyCore ECX-2000 Series is a family of SoC (Server-on-Chip) products that delivers the power efficiency of ARM® processors, and the OpenStack, Linux, and virtualization software needed for modern cloud infrastructures. Using the ARM Cortex A15 quad-core processor, the ECX-2000 delivers roughly twice the performance, three times the memory bandwidth, and four times the memory capacity of the ground-breaking ECX-1000. It is extremely scalable due to the integrated Fleet Fabric Switch, while the embedded Fleet Engine simultaneously provides out-of-band control and intelligence for autonomic operation.

            In addition to enhanced performance, the ECX-2000 provides hardware virtualization support via KVM and Xen hypervisors. Coupled with certified support for Ubuntu 13.10 and the Havana OpenStack release, this marks the first time an ARM SoC is ready for cloud computing. The Fleet Fabric enables the highest network and interconnect bandwidth in the MicroServer space, making this an ideal platform for streaming media and network-intensive applications.

            The net result of the EnergyCore SoC architecture is a dramatic reduction in power and space requirements, allowing rapidly growing data centers to quickly realize operating and capital cost savings.

            image

            Scalability you can grow into. An integrated EnergyCore Fabric Switch within every SoC provides up to five 10 Gigabit lanes for connecting thousands of ECX-2000 server nodes into clusters capable of handling distributed applications at extreme scale. Completely topology agnostic, each SoC can be deployed to work in a variety of mesh, grid, or tree network structures, providing opportunities to find the right balance of network throughput and fault resiliency for any given workload.

            Fleet Fabric Switch
            • Integrated 80Gb (8×8) crossbar switch with through-traffic support
            • Five (5) 10Gb external channels, three (3) 10Gb internal channels
            • Configurable topology capable of connecting up to 4096 nodes
            • Dynamic Link Speed Control from 1Gb to 10Gb to minimize power and maximize performance
            • Network Proxy Support maintains network presence even with node powered off
            • In-order flow delivery
            • MAC learning provides support for virtualization
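The bullet figures above are internally consistent, as a tiny arithmetic check (mine, not Calxeda's) shows: eight 10Gb lanes account for both the 80Gb crossbar and the 5+3 split of external and internal channels.

```python
# Arithmetic check on the Fleet Fabric Switch figures quoted above:
# an 8x8 crossbar of 10 Gb lanes yields the stated 80 Gb aggregate,
# split into five external and three internal 10 Gb channels.
lanes, gb_per_lane = 8, 10
aggregate_gb = lanes * gb_per_lane
external_channels, internal_channels = 5, 3
assert external_channels + internal_channels == lanes
print(aggregate_gb, "Gb aggregate")  # prints "80 Gb aggregate"
```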

            ARM Servers and Xen — Hypervisor Support at Hyperscale – Larry Wikelius, [Co-Founder of] Calxeda [TheLinuxFoundation YouTube channel, Oct 1, 2013]

            [Xen User Summit 2013] The emergence of power optimized hyperscale servers is leading to a revolution in Data Center design. The intersection of this revolution with the growth of Cloud Computing, Big Data and Scale Out Storage solutions is resulting in innovation at a rate and pace in the Server Industry that has not been seen for years. One particular example of this innovation is the deployment of ARM based servers in the Data Center and the impact these servers have on Power, Density and Scale. In this presentation we will look at the role that Xen is playing in the revolution of ARM based server design and deployment and the impact on applications, systems management and provisioning.

            Calxeda Launches Midway ARM Server Chips, Extends Roadmap [EnterpriseTech, Oct 28, 2013]

            ARM server chip supplier Calxeda is just about to ship its second generation of EnergyCore processors for hyperscale systems and most of its competitors are still working on their first products. Calxeda is also tweaking its roadmap to add a new chip to its lineup, which will bridge between the current 32-bit ARM chips and its future 64-bit processors.
            There is going to be a lot of talk about server-class ARM processors this week, particularly with ARM Holdings hosting its TechCon conference in Santa Clara.
            A month ago, EnterpriseTech told you about the “Midway” chip that Calxeda had in the works, as well as its roadmap to get beefier 64-bit cores and extend its Fleet Services fabric to allow for more than 100,000 nodes to be linked together.
            The details were a little thin on the Midway chip, but we now know that it will be commercialized as the ECX-2000, and that Calxeda is sending out samples to server makers right now. The plan is to have the ECX-2000 generally available by the end of the year, and that is why the company is ready to talk about some feeds and speeds. Karl Freund, vice president of marketing at Calxeda, walked EnterpriseTech through the details.

            image

            The Midway chip is fabricated in the same 40 nanometer process as the existing “High Bank” ECX-1000 chip that Calxeda first put into the field in November 2011 in the experimental “Redstone” hyperscale servers from Hewlett-Packard. That 32-bit chip, based on the ARM Cortex-A9 core, was subsequently adopted in systems from Penguin Computing, Boston, and a number of other hyperscale datacenter operators who did proofs of concept with the chips. The ECX-1000 has four cores and was somewhat limited in performance and definitely limited in main memory, which topped out at 4 GB across the four-core processor. But the ECX-2000 addresses these issues.
            The ECX-2000 is based on ARM Holding’s Cortex-A15 core and has the 40-bit physical memory extensions, which allows for up to 16 GB of memory to be physically attached to each socket. With the 40-bit physical addressing added with the Cortex-A15, the memory controller can, in theory, address up to 1 TB of main memory; this is called Large Physical Address Extension (LPAE) in the ARM lingo, and it maps the core’s 32-bit virtual addresses onto a 40-bit physical address space. Each core on the ECX-2000 has 32 KB of L1 instruction cache and 32 KB of L1 data cache, and ARM licensees are allowed to scale the L2 cache as they see fit. The ECX-2000 has 4 MB of L2 cache shared across the four cores on the die. These are exactly the same L1 and L2 cache sizes as used in the prior ECX-1000 chips.
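The addressing arithmetic is easy to verify; this small check (mine, not from the article) confirms that 32-bit addressing tops out at 4 GB while a 40-bit physical space reaches 1 TB:

```python
# Verifying the LPAE addressing arithmetic quoted above: plain 32-bit
# addressing tops out at 4 GB, while the 40-bit space reaches 1 TB.
plain_32bit_bytes = 2 ** 32
lpae_40bit_bytes = 2 ** 40
print(plain_32bit_bytes // 2 ** 30, "GB")  # prints "4 GB"
print(lpae_40bit_bytes // 2 ** 40, "TB")   # prints "1 TB"
```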
            The Cortex-A15 design was created to scale to 2.5 GHz, but as you crank up the clocks on any chip, the amount of energy consumed and heat radiated grows progressively larger. At a certain point, it just doesn’t make sense to push clock speeds. Moreover, every drop in clock speed gives a proportionately larger increase in thermal efficiency, and this is why, says Freund, Calxeda is making its implementation of the Cortex-A15 top out at 1.8 GHz. The company will offer lower-speed parts running at 1.1 GHz and 1.4 GHz for customers that need an even better thermal profile or a cheaper part where low cost is more important than raw performance or thermals.
            What Calxeda and its server and storage array customers are focused on is the fact that the Midway chip running at 1.8 GHz has twice the integer, floating point, and Java performance of a 1.1 GHz High Bank chip. That is possible, in part, because the new chip has four times the main memory and three times the memory bandwidth as the old chip in addition to a 64 percent boost in clock speed. Calxeda is not yet done benchmarking systems using the chips to get a measure of their thermal efficiency, but is saying that there is as much as a 33 percent boost in performance per watt comparing old to new ECX chips.
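A quick sanity check (mine, not the article's) on the quoted clock speeds confirms the 64 percent figure:

```python
# Sanity-checking the quoted clock speeds: 1.8 GHz over the ECX-1000's
# 1.1 GHz is a ~64 percent increase, matching the article's figure.
old_ghz, new_ghz = 1.1, 1.8
boost_pct = (new_ghz / old_ghz - 1) * 100
print(f"{boost_pct:.0f}% clock-speed boost")  # prints "64% clock-speed boost"
```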
            The new ECX-2000 chip has a dual-core Cortex-A7 chip on the die that is used as a controller for the system BIOS as well as a baseboard management controller and a power management controller for the servers that use them. These Fleet Engines, as Calxeda calls them, eliminate yet another set of components, and therefore their cost, in the system. These engines also control the topology of the Fleet Services fabric, which can be set up in 2D torus, mesh, butterfly tree, and fat tree network configurations.
            The Fleet Services fabric has 80 Gb/sec of aggregate bandwidth and offers multiple 10 Gb/sec Ethernet links coming off the die to interconnect server nodes on a single card, multiple cards in an enclosure, multiple enclosures in a rack, and multiple racks in a data center. The Ethernet links are also used to allow users to get to applications running on the machines.
            Freund says that the ECX-2000 chip is aimed at distributed, stateless server workloads, such as web server front ends, caching servers, and content distribution. It is also suitable for analytics workloads like Hadoop and distributed NoSQL data stores like Cassandra, all of which tend to run on Linux. Both Red Hat and Canonical are cooking up commercial-grade Linuxes for the Calxeda chips, and SUSE Linux is probably not going to be far behind. The new chips are also expected to see action in scale-out storage systems such as OpenStack Swift object storage or the more elaborate Gluster and Ceph clustered file systems. The OpenStack cloud controller embedded in the just-announced Ubuntu Server 13.10 is also certified to run on the Midway chip.
            Hewlett-Packard has confirmed that it is creating a quad-node server cartridge for its “Moonshot” hyperscale servers, which should ship to customers sometime in the first or second quarter of 2014. (It all depends on how long HP takes to certify the system board.) Penguin Computing, Foxconn, Aaeon, and Boston are expected to get beta systems out the door this year using the Midway chip and will have them in production in the first half of next year. Yes, that’s pretty vague, but that is the server business, and vagueness is to be expected in such a young market as the ARM server market is.
            Looking ahead, Calxeda is adding a new processor to its roadmap, code-named “Sarita.” Here’s what the latest system-on-chip roadmap looks like now:

            image

            The future “Lago” chip is the first 64-bit chip that will come out of Calxeda, and it is based on the Cortex-A57 design from ARM Holdings – one of several ARMv8 designs, in fact. (The existing Calxeda chips are based on the ARMv7 architecture.)
            Both Sarita and Lago will be implemented in TSMC’s 28 nanometer processes, and that shrink from the current 40 nanometer to 28 nanometer processes is going to allow for a lot more cores and other features to be added to the die and also likely a decent jump in clock speed, too. Freund is not saying at the moment which way it will go.
            But what Freund will confirm is that Sarita will be pin-compatible with the existing Midway chip, meaning that server makers who adopt Midway will have a processor bump they can offer in a relatively easy fashion. It will also be based on the Cortex-A57 cores from ARM Holdings, and will sport four cores on a die that deliver about a 50 percent performance increase compared to the Midway chips.
            The Lago chips, we now know, will scale to eight cores on a die and deliver about twice the performance of the Midway chips. Both Lago and Sarita are on the same schedule, in fact, and they are expected to tape out this quarter. Calxeda expects to start sampling them to customers in the second quarter of 2014, with production quantities being available at the end of 2014.
            Not Just Compute, But Networking, Too
            As important as the processing is to a system, the Fleet Services fabric interconnect is perhaps the key differentiator in its design. The current iteration of that interconnect, which is a distributed Layer 2 switch fabric that is spread across each chip in a cluster, can scale across 4,096 nodes without requiring top-of-rack and aggregation switches.

            image

            Both the Lago and Sarita chips will be using the Fleet Services 2.0 interconnect that is now being launched with Midway. This iteration of the interconnect has all kinds of tweaks, nips and tucks, but no scalability enhancements beyond the 4,096 nodes in the original fabric.
            Freund says that the Fleet Services 3.0 fabric, which allows the distributed switch architecture to scale above 100,000 nodes in a flat network, will probably now come with the “Ratamosa” chips in 2015. It was originally – and loosely – scheduled for Lago next year. The circuits that implement the fabric interconnect are not substantially different, says Freund, but the scalability is enabled through software. It could be that customers are not going to need such scalability as rapidly as Calxeda originally thought.
            The “Navarro” kicker to the Ratamosa chip is presumably based on the ARMv9 architecture, and Calxeda is not saying anything about when we might see that and what properties it might have. All that it has said thus far is that it is aimed at the “enterprise server era.”


            Details about the latest Texas Instruments DSP+ARM SoCs:

            A Better Way to Cloud [MultiVuOnlineVideo YouTube channel, Nov 13, 2012]

            To most technologists, cloud computing is about applications, servers, storage and connectivity. To Texas Instruments Incorporated (TI) (NASDAQ: TXN) it means much more. Today, TI is unveiling a BETTER way to cloud with six new multicore System-on-Chips (SoCs). Based on its award winning KeyStone architecture, TI’s SoCs are designed to revitalize cloud computing, inject new verve and excitement into pivotal infrastructure systems and, despite their feature rich specifications and superior performance, actually reduce energy consumption. To view Multimedia News Release, go to http://www.multivu.com/mnr/54044-texas-instruments-keystone-multicore-socs-revitalize-cloud-applications

            Infinite Scalability in Multicore Processors [Texas Instruments YouTube channel, Aug 27, 2012]

            Over the years, our industry has preached how different types of end equipments and applications are best served by distinctive multicore architectures tailored to each. There are even those applications, such as high performance computing, which can be addressed by more than one type of multicore architecture. Yet most multicore devices today tend to be suited for a specific approach or a particular set of markets. This keynote address, from the 2012 Multicore Developer’s Conference, touches upon why the market needs an “infinitely scalable” multicore architecture which is both scalable and flexible enough to support disparate markets and the varied ways in which certain applications are addressed. The speaker presents examples of how a single multicore architecture can be scalable enough to address the needs of various high performance markets, including cloud RAN, networking, imaging and high performance computing. Ramesh Kumar manages the worldwide business for TI’s multicore growth markets organization. The organization develops multicore processors and software that are targeted for the communication infrastructure space, including multimedia and networking infrastructure equipment, as well as end equipment that requires multicore processors like public safety, medical imaging, high performance computing and test and measurement. Ramesh is a graduate of Northeastern University, where he obtained an executive MBA, and Purdue University, where he received a master of science in electrical engineering.

            From Imagine the impact…TI’s KeyStone SoC + HP Moonshot [TI’s Multicore Mix Blog, April 19, 2013]

            TI’s participation in HP’s Pathfinder Innovation Ecosystem is the first step towards arming HP’s customers with optimized server systems that are ideally suited for workloads such as oil and gas exploration, Cloud Radio Access Networks (C-RAN), voice over LTE and video transcoding. This collaboration between TI and HP is a bold step forward, enabling flexible, optimized servers to bring differentiated technologies, such as TI’s DSPs, to a broader set of application providers. TI’s KeyStone II-based SoCs, which integrate fixed- and floating-point DSP cores with multiple ARM® Cortex™-A15 MPCore processors, packet and security processing, and high speed interconnect, give customers the performance, scalability and programmability needed to build software-defined servers. HP’s Moonshot system integrates storage, networking and compute cards with a flexible interconnect, allowing customers to choose the optimized ratio, enabling the industry’s first software-defined server platform. Bringing TI’s KeyStone II-based SoCs into HP’s Moonshot system opens up several tantalizing possibilities for the future. Let’s look at a few examples:
            Think about the number of voice conversations happening over mobile devices every day. These conversations are independent of each other, and each will need transcoding from one voice format to another as voice travels from one mobile device, through the network infrastructure and to the other mobile device. The sheer number of such conversations demand that the servers used for voice transcoding be optimized for this function. Voice is just one example. Now think about video and music, and you can imagine the vast amount of processing required. Using TI’s KeyStone II-based SoCs with DSP technology provides optimized server architecture for these applications because our SoCs are specifically tuned for signal processing workloads.
            Another example is C-RAN. We have seen a huge push for mobile operators to move most of the mobile radio processing to the data center. There are several approaches to achieving this goal, and each has its pros and cons. But one thing is certain – each approach has to do wireless symbol processing to achieve optimum 3G or 4G communications with smart mobile devices. TI’s KeyStone II-based SoCs are leading the wireless communication infrastructure market and combine key accelerators such as the BCP (Bit Rate Co-Processor), VCP (Viterbi Co-Processor) and others to enable standards-compliant 3G/4G wireless processing. These key accelerators offload standards-based wireless processing from the ARM and/or DSP cores, freeing the cores for value-added processing. The combination of ARM/DSP cores with these accelerators provides an optimum SoC for 3G/4G wireless processing. By combining TI’s KeyStone II-based SoCs with HP’s Moonshot system, operators and network equipment providers can now build customized servers for C-RAN to achieve higher performance systems at lower cost and ultimately provide better experiences to their customers.

            A better way to cloud: TI’s new KeyStone multicore SoCs [embeddednewstv YouTube channel, published on Jan 12, 2013 (YouTube: Oct 21, 2013)]

            Brian Glinsman, vice president of multicore processors at Texas Instruments, discusses TI’s new KeyStone multicore SoCs for cloud infrastructure applications. TI announced six new SoCs, based on their 28-nm KeyStone architecture, featuring the industry’s first implementation of quad ARM Cortex-A15 MPCore processors and TMS320C66x DSPs for purpose-built servers, networking, high performance computing, gaming and media processing applications.

            Texas Instruments Offers System on a Chip for HPC Applications [RichReport YouTube channel, Nov 20, 2012]

            In this video from SC12, Arnon Friedmann from Texas Instruments describes the company’s new multicore System-on-Chips (SoCs). Based on its award winning KeyStone architecture, TI’s SoCs are designed to revitalize cloud computing, inject new verve and excitement into pivotal infrastructure systems and, despite their feature rich specifications and superior performance, actually reduce energy consumption. “Using multicore DSPs in a cloud environment enables significant performance and operational advantages with accelerated compute intensive cloud applications,” said Rob Sherrard, VP of Service Delivery, Nimbix. “When selecting DSP technology for our accelerated cloud compute environment, TI’s KeyStone multicore SoCs were the obvious choice. TI’s multicore software enables easy integration for a variety of high performance cloud workloads like video, imaging, analytics and computing and we look forward to working with TI to help bring significant OPEX savings to high performance compute users.”

            A better way to cloud: TI’s new KeyStone multicore SoCs revitalize cloud applications, enabling new capabilities and a quantum leap in performance at significantly reduced power consumption

              • Industry’s first implementation of quad ARM® Cortex™-A15 MPCore™ processors in infrastructure-class embedded SoC offers developers exceptional capacity & performance at significantly reduced power for networking, high performance computing and more
              • Unmatched combination of Cortex-A15 processors, C66x DSPs, packet processing, security processing and Ethernet switching, transforms the real-time cloud into an optimized high performance, power efficient processing platform
              • Scalable KeyStone architecture now features 20+ software compatible devices, enabling customers to more easily design integrated, power and cost-efficient products for high-performance markets from a range of devices

            ELECTRONICA – MUNICH (Nov. 13, 2012) /PRNewswire/ — To most technologists, cloud computing is about applications, servers, storage and connectivity. To Texas Instruments Incorporated (TI) (NASDAQ: TXN) it means much more. Today, TI is unveiling a BETTER way to cloud with six new multicore System-on-Chips (SoCs). Based on its award winning KeyStone architecture, TI’s SoCs are designed to revitalize cloud computing, inject new verve and excitement into pivotal infrastructure systems and, despite their feature rich specifications and superior performance, actually reduce energy consumption.

            To TI, a BETTER way to cloud means:

              • Safer communities thanks to enhanced weather modeling;
              • Higher returns from time sensitive financial analysis;
              • Improved productivity and safety in energy exploration;
              • Faster commuting on safer highways in safer cars;
              • Exceptional video on any screen, anywhere, any time;
              • More productive and environmentally friendly factories; and
              • An overall reduction in energy consumption for a greener planet.
              TI’s new KeyStone multicore SoCs are enabling this – and much more. These 28-nm devices integrate TI’s fixed- and floating-point TMS320C66x digital signal processor (DSP) generation cores – yielding the best performance per watt ratio in the DSP industry – with multiple ARM® Cortex™-A15 MPCore™ processors – delivering unprecedented processing capability combined with low power consumption – facilitating the development of a wide range of infrastructure applications that can enable more efficient cloud experiences. The unique combination of Cortex-A15 processors and C66x DSP cores, with built-in packet processing and Ethernet switching, is designed to efficiently offload and enhance the cloud’s first generation general purpose servers; servers that struggle with big data applications like high performance computing and video processing.
              “Using multicore DSPs in a cloud environment enables significant performance and operational advantages with accelerated compute intensive cloud applications,” said Rob Sherrard, VP of Service Delivery, Nimbix. “When selecting DSP technology for our accelerated cloud compute environment, TI’s KeyStone multicore SoCs were the obvious choice. TI’s multicore software enables easy integration for a variety of high performance cloud workloads like video, imaging, analytics and computing and we look forward to working with TI to help bring significant OPEX savings to high performance compute users.”
              TI’s six new high-performance SoCs include the 66AK2E02, 66AK2E05, 66AK2H06, 66AK2H12, AM5K2E02 and AM5K2E04, all based on the KeyStone multicore architecture. With KeyStone’s low latency, high bandwidth multicore shared memory controller (MSMC), these new SoCs yield 50 percent higher memory throughput when compared to other RISC-based SoCs. Together, these processing elements, with the integration of security processing, networking and switching, reduce system cost and power consumption, supporting the development of more cost-efficient, green applications and workloads, including high performance computing, video delivery and media and image processing. With the matchless combination TI has integrated into its newest multicore SoCs, developers of media and image processing applications will also be able to create highly dense media solutions.

              image

              “Visionary and innovative are two words that come to mind when working with TI’s KeyStone devices,” said Joe Ye, CEO, CyWee. “Our goal is to offer solutions that merge the digital and physical worlds, and with TI’s new SoCs we are one step closer to making this a reality by pushing state-of-the-art video to virtualized server environments. Our collaboration with TI should enable developers to deliver richer multimedia experiences in a variety of cloud-based markets, including cloud gaming, virtual office, video conferencing and remote education.”
              Simplified development with complete tools and support
              TI continues to ease development with its scalable KeyStone architecture, comprehensive software platform and low-cost tools. In the past two years, TI has developed over 20 software compatible multicore devices, including variations of DSP-based solutions, ARM-based solutions and hybrid solutions with both DSP and ARM-based processing, all based on two generations of the KeyStone architecture. With compatible platforms across TI’s multicore DSPs and SoCs, customers can more easily design integrated, power and cost-efficient products for high-performance markets from a range of devices, starting at just $30 and scaling from a single 850 MHz core all the way to 15 GHz of total processing power.
              TI is also making it easier for developers to quickly get started with its KeyStone multicore solutions by offering easy-to-use, evaluation modules (EVMs) for less than $1K, reducing developers’ programming burdens and speeding development time with a robust ecosystem of multicore tools and software.
              In addition, TI’s Design Network features a worldwide community of respected and well established companies offering products and services that support TI multicore solutions. Companies offering supporting solutions to TI’s newest KeyStone-based multicore SoCs include 3L Ltd., 6WIND, Advantech, Aricent, Azcom Technology, Canonical, CriticalBlue, Enea, Ittiam Systems, Mentor Graphics, mimoOn, MontaVista Software, Nash Technologies, PolyCore Software and Wind River.
              Availability and pricing
              TI’s 66AK2Hx SoCs are currently available for sampling, with broader device availability in 1Q13 and EVM availability in 2Q13. AM5K2Ex and 66AK2Ex samples and EVMs will be available in the second half of 2013. Pricing for these devices will start at $49 for 1 KU.

              66AK2H14 (ACTIVE) Multicore DSP+ARM KeyStone II System-on-Chip (SoC) [TI.com, Nov 10, 2013]
              The same as the 66AK2H12 SoC below, with the addition of:

              More Literature:

              From that, the excerpt below is essential to understanding the added value over the 66AK2H12 SoC:

              image

              Figure 1. TI’s KeyStone™ 66AK2H14 SoC

              The 66AK2H14 SoC shown in Figure 1, with the raw computing power of eight C66x processors and quad ARM Cortex-A15s at over 1GHz performance, enables applications such as very large fast Fourier transforms (FFTs) in radar and multiple-camera image analytics, where a 10Gbit/s networking connection is needed. There are, and have been, several sophisticated technologies that have offered the bandwidth and additional features to fill this role. Some, such as Serial RapidIO® and InfiniBand, have been successful in application domains that Gigabit Ethernet could not address, and continue to make sense, but 10Gbit/s Ethernet will challenge their existence.
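The very large FFTs mentioned above are a classic signal-processing kernel. As a rough illustration of what that kernel computes (a toy pure-Python radix-2 sketch, not TI's optimized C66x DSP library):

```python
import cmath

def fft(x):
    """Toy recursive radix-2 FFT; len(x) must be a power of two.
    Illustrative only: production radar/imaging code would run an
    optimized FFT on the DSP cores, not pure Python."""
    n = len(x)
    if n == 1:
        return list(x)
    even = fft(x[0::2])          # FFT of even-indexed samples
    odd = fft(x[1::2])           # FFT of odd-indexed samples
    out = [0j] * n
    for k in range(n // 2):
        twiddle = cmath.exp(-2j * cmath.pi * k / n) * odd[k]
        out[k] = even[k] + twiddle
        out[k + n // 2] = even[k] - twiddle
    return out

# A unit impulse transforms to a flat spectrum: every bin has magnitude 1.
spectrum = fft([1.0] + [0.0] * 7)
print([round(abs(c), 6) for c in spectrum])
```

The radar and image-analytics FFTs the datasheet has in mind are the same computation at vastly larger sizes, which is why both the raw compute of the eight C66x cores and the 10Gbit/s I/O to feed them matter.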

              66AK2H12 (ACTIVE) Multicore DSP+ARM KeyStone II System-on-Chip (SoC) [TI.com, created on Nov 8, 2012]

              Datasheet manual [351 pages]:

              More Literature:

              Description

              The 66AK2Hx platform is TI’s first to combine the quad ARM® Cortex™-A15 MPCore™ processors with up to eight TMS320C66x high-performance DSPs using the KeyStone II architecture. Unlike previous ARM Cortex-A15 devices that were designed for consumer products, the 66AK2Hx platform provides up to 5.6 GHz of ARM and 11.2 GHz of DSP processing coupled with security and packet processing and Ethernet switching, all at lower power than multi-chip solutions, making it optimal for embedded infrastructure applications like cloud computing, media processing, high-performance computing, transcoding, security, gaming, analytics and virtual desktop. Using TI’s heterogeneous programming runtime software and tools, customers can easily develop differentiated products with 66AK2Hx SoCs.

              image

              Taking Multicore to the Next Level: KeyStone II Architecture [Texas Instruments YouTube channel, Feb 26, 2012]

              TI’s scalable KeyStone II multicore architecture includes support for both TMS320C66x DSP cores and multiple cache coherent quad ARM Cortex™-A15 clusters, for a mixture of up to 32 DSP and RISC cores. With significant updates to its award-winning KeyStone architecture, TI is now paving the way for a new era of high performance 28-nm devices that meld signal processing, networking, security and control functionality, with KeyStone II. Ideal for applications that demand superior performance and low power, devices based on the KeyStone architecture are optimized for high performance markets including communications infrastructure, mission critical, test and automation, medical imaging and high performance and cloud computing. For more information, please visit http://www.ti.com/multicore.

              Introducing the EVMK2H [Texas Instruments YouTube channel, Nov 15, 2013]

              Introducing the EVMK2H evaluation module, the cost-efficient development tool from Texas Instruments that enables developers to quickly get started working on designs for the 66AK2H06, 66AK2H12, and 66AK2H14 multicore DSP + ARM devices based on the KeyStone architecture.

              Kick start development of high performance compute systems with TI’s new KeyStone™ SoC and evaluation module [TI press release, Nov 14, 2013]

              Combination of DSP + ARM® cores and high-speed peripherals offer developers an optimal compute solution at low power consumption

              DALLAS, Nov. 14, 2013 /PRNewswire/ — Further easing the development of processing-intensive applications, Texas Instruments (TI) (NASDAQ: TXN) is unveiling a new system-on-chip (SoC), the 66AK2H14, and evaluation module (EVM) for its KeyStone™-based 66AK2Hx family of SoCs. With the new 66AK2H14 device, developers designing high-performance compute systems now have access to a 10Gbps Ethernet switch-on-chip. The inclusion of the 10GigE switch, along with the other high-speed, on-chip interfaces, saves overall board space, reduces chip count and ultimately lowers system cost and power. The EVM enables developers to evaluate and benchmark more quickly and easily. The 66AK2H14 SoC provides industry-leading computational DSP performance at 307 GMACS/153 GFLOPS and 19600 DMIPS of ARM performance, making it ideal for a wide variety of applications such as video surveillance, radar processing, medical imaging, machine vision and geological exploration.

              “Customers today require increased performance to process compute-intensive workloads using less energy in a smaller footprint,” said Paul Santeler, vice president and general manager, Hyperscale Business, HP. “As a partner in HP’s Moonshot ecosystem dedicated to the rapid development of new Moonshot servers, we believe TI’s KeyStone design will provide new capabilities across multiple disciplines to accelerate the pace of telecommunication innovations and geological exploration.”

              Meet TI’s new 10Gbps Ethernet DSP + ARM SoC
              TI’s newest silicon variant, the 66AK2H14, is the latest addition to its high-performance 66AK2Hx SoC family which integrates multiple ARM Cortex™-A15 MPCore™ processors and TI’s fixed- and floating-point TMS320C66x digital signal processor (DSP) generation cores. The 66AK2H14 offers developers exceptional capacity and performance (up to 9.6 GHz of cumulative DSP processing) at industry-leading size, weight, and power. In addition, the new SoC features a wide array of unique high-speed interfaces, including PCIe, RapidIO, Hyperlink, 1Gbps and 10Gbps Ethernet, achieving total I/O throughput of up to 154Gbps. These interfaces are all distinct and not multiplexed, allowing designers tremendous flexibility with uncompromising performance in their designs.
              Ease development and debugging with TI’s tools and software
              TI helps simplify the design process by offering developers highly optimized software for embedded HPC systems along with development and debugging tools for the EVMK2H – all for under $1,000. The EVMK2H features a single 66AK2H14 SoC, a status LCD, two 1Gbps Ethernet RJ-45 interfaces and on-board emulation. An optional EVM breakout card (available separately) also provides two 10Gbps Ethernet optical interfaces for 20Gbps backplane connectivity and optional wire rate switching in high density systems.
              The EVMK2H is bundled with TI’s Multicore Software Development Kit (MCSDK), enabling faster development with production ready foundational software. The MCSDK eases development and reduces time to market by providing highly-optimized bundles of foundational, platform-specific drivers, optimized libraries and demos.
              Complementary analog products to increase system performance
              TI offers a wide range of power management and analog signal chain components to increase the system performance of 66AK2H14 SoC-based designs. For example, the TPS53xx integrated FET DC/DC converters provide the highest level of power conversion efficiency even at light loads, while the LM10011 VID converter with dynamic voltage control helps reduce system power consumption. The CDCM6208 low-jitter clock generator also eliminates the need for external buffers, jitter cleaners and level translators.
              Availability and pricing
              TI’s EVMK2H is available now through TI distribution partners or TI.com for $995. In addition to TI’s Linux distribution provided in the MCSDK, Wind River® Linux is available now for the 66AK2Hxx family of SoCs. Green Hills® INTEGRITY® RTOS and Wind River VxWorks® RTOS support will each be available before the end of the year. Pricing for the 66AK2H14 SoC will start at $330 for 1 KU. The 10Gbps Ethernet breakout card will be available from Mistral.

              Ask the Expert: How can developers accelerate scientific computing with TI’s multicore DSPs? [Texas Instruments YouTube channel, Feb 7, 2012]

              Dr. Arnon Friedmann is the business manager for TI’s high performance computing products in the multicore and media infrastructure business. In this video, he explains how TI’s multicore DSPs are well suited for computing applications in oil and gas exploration, financial modeling and molecular dynamics, where ultra- high performance, low power and easy programmability are critical requirements.

              Ask the Expert: Arnon Friedmann [Texas Instruments YouTube channel, Sept 6, 2012]

              How are TI’s latest multicore devices a fit for video surveillance and smart analytic camera applications? Dr. Arnon Friedmann, PhD, is a business manager for multicore processors at Texas Instruments. In this role, he is responsible for growing TI’s business in high performance computing, mission critical, test and measurement and imaging markets. Prior to his current role, Dr. Friedmann served as the marketing director for TI’s wireless base station infrastructure group, where he was responsible for all marketing and design activities. Throughout his 14 years of experience in digital communications research and development, Dr. Friedmann has accumulated patents in the areas of disk drive systems, ADSL modems and 3G/4G wireless communications. He holds a PhD in electrical engineering and bachelor of science in engineering physics, both from the University of California, San Diego.

              End of Updates as of Dec 6, 2013


              The original post (8 months ago):

              HP Moonshot: Designed for the Data Center, Built for the Planet [HP press kit, April 8, 2013]

              On April 8, 2013, HP unveiled the world’s first commercially available HP Moonshot system, delivering compelling new infrastructure economics by using up to 89 percent less energy and 80 percent less space, and costing 77 percent less, compared to traditional servers. Today’s mega data centers are nearing a breaking point where further growth is restricted due to the current economics of traditional infrastructure. HP Moonshot servers are a first step organizations can take to address these constraints.

              For more details on the disruptive potential of HP Moonshot, visit TheDisruption.com

              Introducing HP Moonshot [HewlettPackardVideos April 11, 2013]

              See how HP is defining disruption with the introduction of HP Moonshot.

              HP’s Cutting Edge Data Center Innovation [Ramón Baez, Senior Vice President and Chief Information Officer (CIO) of HP, HP Next [launched on April 2], April 10, 2013]

              This is an exciting time to be in the IT industry right now. For those of you who have been around for a while — as I have — there have been dramatic shifts that have changed how businesses operate.
              From the early days of the mainframes, to the explosion of the Internet and now social networks, every so often very important game-changing innovation comes along. We’re in the midst of another sea change in technology.
              Inside HP IT, we are testing the company’s Moonshot servers. Because these servers run the same chips found in smartphones and tablets, they use far less power, require considerably less cooling and have a smaller footprint.

              We currently are running some of our intensive hp.com applications on Moonshot and are seeing very encouraging results. Over half a billion people will visit hp.com this year, and the new Moonshot technology will run at a fraction of the space, power and cost – basically we expect to run HP.com off of the same amount of energy needed for a dozen 60-watt light bulbs.

              This technology will revolutionize data centers.
              Within HP IT, we are fortunate in that over the past several years we have built a solid data center foundation to run our company. Like many companies, we were a victim of IT sprawl — with more than 85 data centers in 29 countries. We decided to make a change and took on a total network redesign, cutting our principal worldwide data centers down to six and housing all of them in the United States.
              With the addition of four new EcoPODs to our infrastructure and these new Moonshot servers, we are in the perfect position to build out our private cloud and provide our businesses with the speed and quality of innovation they need.
              Moonshot is just the beginning. The product roadmap for Moonshot is extremely promising and I am excited to see what we can do with it within HP IT, and what benefits our customers will see.

              What Calxeda is saying about HP Moonshot [HewlettPackardVideos YouTube channel, April 8, 2013], which is best to start with for its simple and efficient message, as well as Intel targeting ARM based microservers: the Calxeda case [‘Experiencing the Cloud’ blog, Dec 14, 2012], already contained on this blog earlier:

              Calxeda discusses HP’s Project Moonshot and the cost, space, and efficiency innovations being enabled through the Pathfinder Innovation Ecosystem. http://hp.com/go/moonshot

              Then we can turn to the Moonshot product launch by HP 2 days ago:

              Note that the first three videos following here were released 3 days later, so don’t be surprised by the YouTube dates; in fact the same 3 videos (as well as the “Introducing HP Moonshot” embedded above) were delivered on the April 8 live webcast: see the first 18 minutes of that, and then follow HP’s flow of the presentation if you like. I would certainly recommend my own presentation compiled here.

              HP president and CEO Meg Whitman on the emergence of a new style of IT [HewlettPackardVideos YouTube channel, April 11, 2013]

              HP president and CEO Meg Whitman outlines the four megatrends causing strain on current infrastructure and how HP Project Moonshot servers are built to withstand data center challenges.

              EVP and GM of HP’s Enterprise Group Dave Donatelli discusses HP Moonshot [HewlettPackardVideos YouTube channel, April 11, 2013]

              EVP and GM of HP’s Enterprise Group Dave Donatelli details how HP Moonshot redefines the server market.

              Tour the Houston Discovery Lab — where the next generation of innovation is created [HewlettPackardVideos YouTube channel, April 11, 2013]

              SVP and GM of HP’s Industry Standard Servers and Software Mark Potter and VP and GM of HP’s Hyperscale Business Unit Paul Santeler tour HP’s Discovery Lab in Houston, Texas. HP’s Discovery Lab allows customers to test, tune and port their applications on HP Moonshot servers in-person and remotely.

              A new era of accelerated innovation [HP Moonshot minisite, April 8, 2013]

              Cloud, Mobility, Security, and Big Data are transforming what the business expects from IT, resulting in a “New Style of IT.” The result of alternative thinking from a proven industry leader, HP Moonshot is the world’s first software defined server that will accelerate innovation while delivering breakthrough efficiency and scale.

              Watch the unveiling [link to HP Moonshot – The Disruption, HP Event registration page at ‘thedisruption.com’]

              image

              On the right is the Moonshot System with the very first Moonshot servers (“microservers/server appliances” as called by the industry), based on Intel® Atom S1200 processors and supporting web-hosting workloads (see also the right part of the image below). Currently there is also a storage cartridge (on the left of the image below) and a multinode for highly dense computing solutions (see in the hands of the presenter in the image below). Many more are to come later on.

              image

              image

              With up to 180 servers inside the box (45 now), it was necessary to integrate network switching. There are two sockets (see left) for the network switch, so you can configure for redundancy. The downlink module, which talks to the cartridges, is on the left of the image below. This module is paired with an uplink module in the back of the server (shown taken out in the middle of the image below, and then together with the uplink module on the right). There will be more options available.

              image

              More information:
              Enterprise Information Library for Moonshot
              HP Moonshot System [Technical white paper from HP, April 5, 2013] from which I will include here the following excerpts for more information:

              HP Moonshot 1500 Chassis

              The HP Moonshot 1500 Chassis is a 4.3U form factor and slides out of the rack on a set of rails like a file cabinet drawer. It supports 45 HP ProLiant Moonshot Servers and an HP Moonshot-45G Switch Module that are serviceable from the top.
              It is a modern architecture engineered for the new style of IT that can support server cartridges, server and storage cartridges, storage-only cartridges and a range of x86, ARM or accelerator-based processor technologies.
              As an initial offering, the HP Moonshot 1500 Chassis is fully populated with 45 HP ProLiant Moonshot Servers and one HP Moonshot-45G Switch Module; a second HP Moonshot-45G Switch Module can be purchased as an option. Future offerings will include quad server cartridges and will result in up to 180 servers per chassis. The 4.3U form factor allows for 10 chassis per rack, which with the quad server cartridge amounts to 1800 servers in a single rack.
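The density arithmetic in the paragraph above can be checked in a few lines (a trivial sketch of the figures as stated, not an HP sizing tool):

```python
def servers_per_rack(servers_per_cartridge, cartridges_per_chassis=45,
                     chassis_per_rack=10):
    """Moonshot density as described: 45 cartridge slots per 4.3U
    chassis, 10 chassis per rack."""
    return servers_per_cartridge * cartridges_per_chassis * chassis_per_rack

print(servers_per_rack(1))  # initial single-server cartridges: 450 per rack
print(servers_per_rack(4))  # future quad-server cartridges: 1800 per rack
```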
              The Moonshot 1500 Chassis simplifies management with four iLO processors that share management responsibility for the 45 servers, power, cooling, and switches.

              Highly flexible fabric

              Built into the HP Moonshot 1500 Chassis architecture are four separate and independent fabrics that support a range of current and future capabilities:
              • Network fabric
              • Storage fabric
              • Management fabric
              • Integrated cluster fabric
              Network fabric
              The Network fabric provides the primary external communication path for the HP Moonshot 1500 Chassis.
              For communication within the chassis, the network switch has four communication channels to each of the 45 servers. Each channel supports a 1-GbE or 10-GbE interface. Each HP Moonshot-45G Switch Module supports 6 channels of 10GbE interface to the HP Moonshot-6SFP network uplink modules located in the rear of the chassis.
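Taking the figures in that paragraph at face value (45 servers, each active channel at 1GbE or 10GbE, six 10GbE uplinks per switch module), the uplink oversubscription per switch module can be sketched as below. This is my own back-of-the-envelope reading of the white paper's numbers, not an HP specification:

```python
def oversubscription(servers=45, channel_gbps=1, active_channels_per_server=1,
                     uplinks=6, uplink_gbps=10):
    """Ratio of aggregate server-facing (downlink) bandwidth to uplink
    bandwidth for one hypothetical Moonshot-45G switch module config."""
    downlink = servers * active_channels_per_server * channel_gbps
    uplink = uplinks * uplink_gbps
    return downlink / uplink

# 45 x 1GbE down vs 6 x 10GbE up: 45/60 = 0.75, i.e. no oversubscription.
print(oversubscription(channel_gbps=1))
# If every server ran one channel at 10GbE instead: 450/60 = 7.5:1.
print(oversubscription(channel_gbps=10))
```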
              Storage fabric
              The Storage fabric provides dedicated SAS lanes between server and storage cartridges. We utilize HP Smart Storage firmware found in the ProLiant family of servers to enable multiple core-to-spindle ratios for specific solutions. A hard drive can be shared among multiple server cartridges to enable low cost boot or logging, or attached to a node to provide storage expansion.
              The current HP Moonshot System configuration targets light scale-out applications. To provide the best operating environment for these applications, it includes HP ProLiant Moonshot Servers with a hard disk drive (HDD) as part of the server architecture. Shared storage is not an advantage for these environments. Future releases of the servers that target different solutions will take advantage of the storage fabric.
              Management fabric
              We utilize the Integrated Lights-Out (iLO) application-specific integrated circuit (ASIC) standard in the HP ProLiant family of servers to provide the innovative management features in the HP Moonshot System. To handle the range of extreme low-energy processors, we provide a device-neutral approach to management, which can be easily consumed by data center operators to deploy at scale.
              The Management fabric enables management of the HP Moonshot System components as one platform with a dedicated iLO network. Benefits of the management fabric include:
              • The iLO Chassis Manager aggregates data to a common set of management interfaces.
              • The HP Moonshot 1500 Chassis has a single Ethernet port gateway that is the single point of access for the Moonshot Chassis manager.
              • Intelligent Platform Management Interface (IPMI) and Serial Console for each server
              • True out-of-band firmware update services
              • SL-APM Rack Management spans rack or multiple racks
              Integrated Cluster fabric
              The Integrated Cluster fabric provides a high-speed interface among future server cartridge technologies that will benefit from high bandwidth node-to-node communication. North, south, east, and west lanes are provided between individual server cartridges.
              The current HP ProLiant Moonshot Server targets light scale-out applications. These applications do not benefit from node-to-node communications, so the Integrated Cluster fabric is not utilized. Future releases of cartridges that target different workloads requiring low latency interconnects will take advantage of the Integrated Cluster fabric.
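The north/south/east/west lanes described above suggest a 2D-mesh arrangement of cartridge slots. Purely as an illustration of that topology (the actual slot wiring is not documented here, and the 3x15 grid of the 45 slots is my assumption), a sketch that computes which fabric neighbours a given slot would have:

```python
def mesh_neighbors(row, col, rows=3, cols=15):
    """Return the in-grid north/south/east/west neighbours of a slot in
    a hypothetical rows x cols cartridge mesh; edge slots have fewer."""
    candidates = {
        "north": (row - 1, col),
        "south": (row + 1, col),
        "east": (row, col + 1),
        "west": (row, col - 1),
    }
    return {direction: rc for direction, rc in candidates.items()
            if 0 <= rc[0] < rows and 0 <= rc[1] < cols}

print(mesh_neighbors(0, 0))   # corner slot: only 'south' and 'east'
print(mesh_neighbors(1, 7))   # interior slot: all four neighbours
```

In such a mesh, low-latency node-to-node traffic stays on the cartridge-to-cartridge lanes rather than traversing the network switch, which is the benefit the white paper anticipates for future latency-sensitive workloads.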

              HP ProLiant Moonshot Server

              HP will bring a growing library of cartridges, utilizing cutting-edge technology from industry leading partners. Each server will target specific solutions that support emerging Web, Cloud, and Massive-Scale Environments, as well as Analytics and Telecommunications. We are continuing server development for other applications, including Big Data, High-Performance Computing, Gaming, Financial Services, Genomics, Facial Recognition, Video Analysis, and more.
              Figure 4. Cartridges target specific solutions

              image

              The first server cartridge now available is the HP ProLiant Moonshot Server, which includes the Intel® Atom Processor S1260. This is a low power processor that is right-sized for light workloads. It has dedicated memory and storage, with discrete resources. This server design is ideal for light scale-out applications. Light scale-out applications require relatively little processing but moderately high I/O and include environments that perform the following functions:
              • Dedicated web hosting
              • Simple content delivery
              The HP ProLiant Moonshot Server is hot-pluggable in the HP Moonshot 1500 Chassis. If service is necessary, it can be removed without affecting the other servers in the chassis. Table 1 defines the HP ProLiant Moonshot Server specifications.
              Table 1. HP ProLiant Moonshot Server specifications

              Processor
              One Intel® Atom Processor S1260
              Memory
              8 GB DDR3 ECC 1333 MHz
              Networking
              Integrated dual-port 1Gb Ethernet NIC
              Storage
              500 GB or 1 TB HDD or SSD, non-hot-plug, small form factor
              Operating systems
              Canonical Ubuntu 12.04
              Red Hat Enterprise Linux 6.4
              SUSE Linux Enterprise Server 11 SP2

              With that, see HP CEO Seeks Turnaround Unveiling ‘Moonshot’ Super-Server: Tech [Bloomberg, April, 2013] as well as HP Moonshot: Say Goodbye to the Vanilla Server [Forbes, April 8, 2013]. HP, however, has its eye much more on the ARM-based Moonshot servers expected to come later, because of the trends reflected on the left (source: HP). The software defined server concept is very general.

              There are a number of quite different server cartridges expected to come, each specialised by the server software installed on it. Typical specialised servers, for example, are the ones CyWee from Taiwan is working on with Texas Instruments’ new KeyStone II architecture, which features both ARM Cortex-A15 CPU cores and TI’s own C66x DSP cores for a mixture of up to 32 DSP and RISC cores in TI’s new 66AK2Hx family of SoCs, the first of which is the TMS320TCI6636 implemented in 28nm foundry technology. Based on that, CyWee will deliver multimedia Moonshot server cartridges for cloud gaming, virtual office, video conferencing and remote education (see even the first KeyStone announcement). This CyWee involvement in the HP Moonshot effort is part of HP’s Pathfinder Partner Program, which Texas Instruments also joined recently to exploit a larger opportunity as:

              TI’s 66AK2Hx family and its integrated c66x multicore DSPs are applicable for workloads ranging from high performance computing, media processing, video conferencing, off-line image processing & analytics, video recorders (DVR/NVR), gaming, virtual desktop infrastructure and medical imaging.

              But Intel was able to win the central piece of the Moonshot System launch (originally initiated by HP as the “Moonshot Project” in November 2011 for disruption in terms of power and TCO for servers, actually with a Calxeda board used for research and development with other partners), at least as it was productized just two days ago:
              Raejeanne Skillern from Intel – HP Moonshot 2013 – theCUBE [siliconangle YouTube channel]

              Raejeanne Skillern, Intel Director of Marketing for Cloud Computing, at HP Moonshot 2013 with John Furrier and Dave Vellante

              However, ARM was not left out either, just relegated in the beginning to highly advanced and/or specialised server roles with its SoC partners, coming later in the year:

              • Applied Micro, with a networking and connectivity background, now also having the X-Gene ARM 64-bit Server on a Chip platform, which features 8 ARM 64-bit high-performance cores developed from scratch under an architecture license (i.e. not ARM’s own Cortex-A50 series core), clocked at up to 2.4GHz, plus 4 smaller cores for network and storage offloads (see AppliedMicro on the X-Gene ARM Server Platform and HP Moonshot [SiliconANGLE blog, April 9, 2013]). Sample reference boards were shipped to key customers in March (see Applied Micro’s cloud chip is an ARM-based, switch-killing machine [GigaOM, April 3, 2013]). In the latest X-Gene Arrives in Silicon [Open Compute Summit Winter 2013 presentation, Jan 16, 2013] video you can find the most recent strategic details (up to 2014 with a FinFET implementation of “Software defined X-Gene based data center components”, presumably at 16nm). Here I will include a more product-oriented AppliedMicro Shows ARM 64-bit X-Gene Server on a Chip Hardware and Software [Charbax YouTube channel, Nov 3, 2012] overview video:
                Vinay Ravuri, Vice President and General Manager, Server Products at AppliedMicro gives an update on the 64bit ARM X-Gene Server Platform. At ARM Techcon 2012, AppliedMicro, ARM and several open-source software providers gave updates on their support of the ARM 64-bit X-Gene Server on a Chip Platform.

                More information: A 2013 Resolution for the Data Center [Applied Micro on Smart Connected Devices blog from ARM, Feb 4, 2013] about “plans from Oracle, Red Hat, Citrix and Cloudera to support this revolutionary architecture … Dell’s “Iron” server concept with X-Gene … an X-Gene based ARM server managed by the Dell DCS Software suite …” etc.

              • Texas Instruments with digital signal processing (DSP) background, as it was already presented above. 
              • Calxeda with integration of storage fabric and Internet switching background, with details coming later, etc.:

              This is what is emphasized by Lakshmi Mandyam from ARM – HP Moonshot 2013 – theCUBE [siliconangle YouTube channel, April 8, 2013]

              Lakshmi Mandyam, Director of Server Systems and Ecosystems, ARM, at HP Moonshot 2013, with John Furrier and Dave Vellante

              She also mentions in the talk the achievements which could put ARM and its SoC partners into the role Intel now has with its general Atom S1200 based server cartridge product fitting into the Moonshot system. Perspective information on that is already available on my ‘Experiencing the Cloud’ blog here:
              The state of big.LITTLE processing [April 7, 2013]
              The future of mobile gaming at GDC 2013 and elsewhere [April 6, 2013]
              TSMC’s 16nm FinFET process to be further optimised with Imagination’s PowerVR Series6 GPUs and Cadence design infrastructure [April 8, 2013]
              With 28nm non-exclusive in 2013 TSMC tested first tape-out of an ARM Cortex™-A57 processor on 16nm FinFET process technology [April 3, 2013]

              The absence of Microsoft is even more interesting as AMD is also on this Moonshot bandwagon: Suresh Gopalakrishnan from AMD – HP Moonshot 2013 – theCUBE [siliconangle YouTube channel, April 8, 2013]

              Suresh Gopalakrishnan, Vice President and General Manager, Server Business, AMD, at HP Moonshot 2013, with John Furrier and Dave Vellante

              already showing a Moonshot-fitting server cartridge with four of AMD’s next-generation SoCs (while Intel’s already productized cartridge is not yet at an SoC level). We know from CES 2013 that AMD Unveils Innovative New APUs and SoCs that Give Consumers a More Exciting and Immersive Experience [press release, Jan 7, 2013] with the:

              “Temash” … elite low-power mobility processor for Windows 8 tablets and hybrids … to be the highest-performance SoC for tablets in the market, with 100 percent more graphics processing performance2 than its predecessor (codenamed “Hondo.”)
              “Kabini” [SoC which] targets ultrathin notebooks with exceptional battery life and offers impressive levels of performance in both dual- and quad-core options. “Kabini” is expected to deliver an increase of more than 50 percent in performance3 over the previous generation of AMD essential computing APUs (codenamed “Brazos 2.0.”)
              Both APUs are scheduled to ship in the first half of 2013

              so AMD is really close to a server SoC to be delivered soon as well.

              The “more information” sections which follow here are:

              1. The Announcement
              2. Software Partners
              3. Hardware Partners


              1. The Announcement

              HP Moonshot [MultiVuOnlineVideo YouTube channel, April 8, 2013]

              HP today unveiled the world’s first commercially available HP Moonshot system, delivering compelling new infrastructure economics by using up to 89 percent less energy, 80 percent less space and costing 77 percent less, compared to traditional servers. Today’s mega data centers are nearing a breaking point where further growth is restricted due to the current economics of traditional infrastructure. HP Moonshot servers are a first step organizations can take to address these constraints.

              HP Launches New Class of Server for Social, Mobile, Cloud and Big Data [press release, April 8, 2013]

              Software defined servers designed for the data center and built for the planet
              … Built from HP’s industry-leading server intellectual property (IP) and 10 years of extensive research from HP Labs, the company’s central research arm, HP Moonshot delivers a significant improvement in energy, space, cost and simplicity. …
              The HP Moonshot system consists of the HP Moonshot 1500 enclosure and application-optimized HP ProLiant Moonshot servers. These servers will offer processors from multiple HP partners, each targeting a specific workload.
              With support for up to 1,800 servers per rack, HP Moonshot servers occupy one-eighth of the space required by traditional servers. This offers a compelling solution to the problem of physical data center space.(3) Each chassis shares traditional components including the fabric, HP Integrated Lights-Out (iLO) management, power supply and cooling fans. These shared components reduce complexity as well as add to the reduction in energy use and space.
              The first HP ProLiant Moonshot server is available with the Intel® Atom S1200 processor and supports web-hosting workloads. HP Moonshot 1500, a 4.3u server enclosure, is fully equipped with 45 Intel-based servers, one network switch and supporting components.
              HP also announced a comprehensive roadmap of workload-optimized HP ProLiant Moonshot servers incorporating processors from a broad ecosystem of HP partners including AMD, AppliedMicro, Calxeda, Intel and Texas Instruments Incorporated.

              Scheduled to be released in the second half of 2013, the new HP ProLiant Moonshot servers will support emerging web, cloud and massive scale environments, as well as analytics and telecommunications. Future servers will be delivered for big data, high-performance computing, gaming, financial services, genomics, facial recognition, video analysis and other applications.

              The HP Moonshot system is immediately available in the United States and Canada and will be available in Europe, Asia and Latin America beginning next month.
              Pricing begins at $61,875 for the enclosure, 45 HP ProLiant Moonshot servers and an integrated switch.(4)
              (4) Estimated U.S. street prices. Actual prices may vary.

              More information:
              HP Moonshot System [Family data sheet, April 8, 2013]
              HP Moonshot – The Disruption [HP Event registration page at ‘thedisruption.com’ with embedded video gallery, press kit and more, originally created on April 12, 2010, obviously updated for the April 8, 2013 event]

              Moonshot 101 [HewlettPackardVideos YouTube channel, April 8, 2013]

              Paul Santeler, Vice President & GM of Hyperscale Business Unit at HP, discusses how HP Project Moonshot creates the new style of IT. http://hp.com/go/moonshot

              Alert for Microsoft:

              [4:42] We defined the industry standard server market [reference to HP’s Compaq heritage] and we’ve been the leader for years. With Moonshot we’re redefining the market and taking it to the next level. [4:53]

              People Behind HP Moonshot [HP YouTube channel, April 10, 2013]

              HP Moonshot is a groundbreaking new class of server that requires less energy, less space and less cost. Built from HP’s industry-leading server IP and 10 years of research from HP Labs, HP Moonshot is an example of the best of HP working together. In the video: Gerald Kleyn, Director of Platform Research and Development, Hyperscale Business Unit, Industry Standard Servers; Scott Herbel, Worldwide Product Marketing Manager, Hyperscale Business Unit, Industry Standard Servers; Ron Mann, Director of Engineering, Industry Standard Servers; Kelly Pracht, Hardware Platform Manager R&D, Hyperscale Business Unit, Industry Standard Servers; Mike Sabotta, Distinguished Technologist, Hyperscale Business Unit, Industry Standard Servers; Dwight Barron, HP Fellow, Chief Technologist, Hyperscale Business Unit, Industry Standard Servers. For more information, visit http://www.hpnext.com.

              HP Moonshot System Tour [HewlettPackardVideos YouTube channel, April 8, 2013]

              Kelly Pracht, Moonshot Hardware Platform Program Manager, HP, takes you on a private tour of the HP Moonshot System and introduces the foundational HW components of HP Project Moonshot. This video guides you around the entire system highlighting the cartridges and switches. http://hp.com/go/moonshot

              HP Moonshot System is Hot Pluggable [HewlettPackardVideos YouTube channel, April 8, 2013]

              “Show me around the HP Moonshot System!” Vicki Doehring, Moonshot Hardware Engineer, HP, shows us just how simple and intuitive it is to remove components in the HP Moonshot System. This video explains how HP’s hot pluggable technology works with the HP Moonshot System. http://hp.com/go/moonshot

              Alert for Microsoft: how and when will you have a system like this, with all the bells and whistles as presented above, as well as the rich ecosystem of hardware and software partners given below?

              HP Pathfinder Innovation Ecosystem [HewlettPackardVideos YouTube channel, April 8, 2013]

              A key element of HP Moonshot, the HP Pathfinder Innovation Ecosystem brings together industry leading software and hardware partners to accelerate the development of workload optimized applications. http://hp.com/go/moonshot

              Software partners:

              What Linaro is saying about HP Moonshot [HewlettPackardVideos YouTube channel, April 8, 2013]

              Linaro discusses HP’s Project Moonshot and the cost, space, and efficiency innovations being enabled through the Pathfinder Innovation Ecosystem. http://hp.com/go/moonshot

              Alert for Microsoft:

              [0:11] The HP approach with Linaro is about forming an enterprise group. What they were hoping for, and what’s happened, is to get a bunch of companies who are interested in taking the ARM architecture into the server space. [0:26]

              Canonical joins Linaro Enterprise Group (LEG) and commits Ubuntu Hyperscale Availability for ARM V8 in 2013 [press release, Nov 1, 2012]

                • Canonical continues its leadership of commercial deployment for ARM-based servers through membership of Linaro Enterprise Group (LEG)
                • Ubuntu, the only commercially supported OS for ARM v7 today, commits to support ARM v8 server next year
                • Ubuntu extends its position as the natural choice for hyperscale server computing with long term support

              … “Canonical has been supporting our work optimising and consolidating the Linux kernel since our founding in June 2010”, said George Grey, CEO of Linaro. “We’re very happy to welcome them as a member of the Linaro Enterprise Group, building on our relationship to help accelerate development of the ARM server software ecosystem.” …

              … “Calxeda has been thrilled with Canonical’s leadership in developing the ARM ecosystem”,  said Karl Freund, VP marketing at Calxeda. “These guys get it. They are driving hard and fast, already delivering enterprise-class code and support for Calxeda’s 32-bit product today to our mutual clients.  Working together in LEG will enable us to continue to build on the momentum we have already created.” …

              What Canonical is saying about HP Moonshot [HewlettPackardVideos YouTube channel, April 8, 2013]

              HP Moonshot and Ubuntu work together [Ubuntu partner site, April 9, 2013]

              … Ubuntu, as the lead operating system platform for x86 and ARM-based HP Moonshot Systems, featured extensively at the launch of the program in April 2013. …
              Ubuntu Server is the only OS fully operational today across HP Moonshot x86 and ARM servers, launched in April 2013.
              Ubuntu is recognised as the leader in scale out and Hyperscale. Together, Canonical and HP are delivering massive reductions in data-center energy, space and costs. …

              “Canonical has been working with HP for the past two years on HP Moonshot, and with Ubuntu, customers can achieve higher performance with greater manageability across both x86 and ARM chip sets,” said Paul Santeler, VP & GM, Hyperscale Business Unit, HP

              Ubuntu & HP’s project Moonshot [Canonical blog, Nov 2, 2011]

              Today HP announced Project Moonshot  – a programme to accelerate the use of low power processors in the data centre.
              The three elements of the announcement are the launch of Redstone – a development platform that harnesses low-power processors (both ARM & x86),  the opening of the HP Discovery lab in Houston and the Pathfinder partnership programme.
              Canonical is delighted to be involved in all three elements of HP’s Moonshot programme to reduce both power and complexity in data centres.
              The HP Redstone platform unveiled in Palo Alto showcases HP’s thinking around highly federated environments and Calxeda’s EnergyCore ARM processors. The Calxeda system on chip (SoC) design is powered by Calxeda’s own ARM based processor and combines mobile phone like power consumption with the attributes required to run a tangible proportion of hyperscale data centre workloads.
              The promise of server-grade SoCs running at less than 5W and achieving per-rack density of 2800+ nodes is impressive, but what about the software stacks that are used to run the web and analyse big data – when will they be ready for this new architecture?
              Ubuntu Server is increasingly the operating system of choice for web, big data and cloud infrastructure workloads. Films like Avatar are rendered on Ubuntu, Hadoop is run on it and companies like Rackspace and HP are using Ubuntu Server as the foundation of their public cloud offerings.
              The good news is that Canonical has been working with ARM and Calxeda for several years now, and we released the first version of Ubuntu Server ported for ARM Cortex A9 class processors last month.
              The Ubuntu 11.10 release (download) is a functioning port, and over the next six months we will be working hard to benchmark and optimize Ubuntu Server and the workloads that our users prioritize on ARM. This work, by us and by upstream open source projects, is going to be accelerated by today’s announcement and by access to hardware in the HP Discovery lab.
              As HP stated today, this is the beginning of a journey to re-inventing a power-efficient and less complex data center. We look forward to working with HP and Calxeda on that journey.

              The biggest enterprise alert for Microsoft, because of what was discussed in Will Microsoft Stand Out In the Big Data Fray? [Redmondmag.com, March 22, 2013]: What NuoDB is saying about HP Moonshot [HewlettPackardVideos YouTube channel, April 9, 2013], especially as it is a brand new offering; see NuoDB Announces General Availability of Industry’s First & Only Cloud Data Management System at Live-Streamed Event [press release, Jan 15, 2013], now available in archive at this link: http://go.nuodb.com/cdms-2013-register-e.html

              Barry Morris, founder and CEO of NuoDB, discusses HP’s Project Moonshot and the database innovations delivered by the combined offering

              Extreme density on HP’s Project Moonshot [NuoDB Techblog, April 9, 2013]

              A few months ago HP came to us with something very cool. It’s called Project Moonshot, and it’s a new way of thinking about how you design infrastructure. Essentially, it’s a composable system that gives you serious flexibility and density.

              A single Moonshot System is 4.3u tall and holds 45 independent servers connected to each other via 1-Gig Ethernet. There’s a 10-Gig Ethernet interface to the system as a whole, and management interfaces for the system and each individual server. The long-term design is to have servers that provide specific capabilities (compute, storage, memory, etc.) and can scale to up to 180 nodes in a single 4.3u chassis.
              The initial system, announced this week, comes with a single server configuration: an Intel Atom S1260 processor, 8 Gigabytes of memory and either a 200GB SSD or a 500GB HDD. On its own, that’s not a powerful server, but when you put 45 of these into a 4.3 rack-unit space you get something in aggregate that has a lot of capacity while still drawing very little power (see below). The challenge, then, is how to really take advantage of this collection of servers.

              NuoDB on Project Moonshot: Density and Efficiency

              We’ve shown how NuoDB can scale a single database to large transaction rates. For this new system, however, we decided to try a different approach. Rather than make a single database scale to large volume, we decided to see how many individual, smaller databases we could support at the same time. Essentially, could we take a fully-configured HP Project Moonshot System and turn it into a high-density, low-power, easy-to-manage hosting appliance?

              To put this in context, think about a web site that hosts blogs. Typically, each blog is going to have a single database supporting it (just like this blog you’re reading). The problem is that while a few blogs will be active all the time, most of them see relatively light traffic. This is known as a long-tail pattern. Still, because the blogs always need to be available, the backing databases always need to be running too.

              This leads to a design trade-off. Do you map the blogs to a single database (breaking isolation and making management harder) or somehow try to juggle multiple database instances (which is hard to automate, expensive in resource-usage and makes migration difficult)? And what happens when a blog suddenly takes off in popularity? In other words, how do you make it easy to manage the databases and make resource-utilization as efficient as possible so you don’t over-spend on hardware?

              As I’ve discussed on this blog NuoDB is a multi-tenant system that manages individual databases dynamically and efficiently. That should mean that we’re a perfect fit for this very cool (pun intended) new system from HP.

              The Design

              After some initial profiling on a single server, we came up with a goal: support 7,200 active databases. You can read all about how we did the math, but essentially this was a balance between available CPU, Memory, Disk and bandwidth. In this case a “database” is a single Transaction Engine and Storage Manager pair, running on one of the 45 available servers.
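The 7,200-database goal implies a simple per-server budget. A back-of-the-envelope check, assuming the databases divide evenly across the 45 servers (an even split is my simplification; NuoDB's actual sizing balanced CPU, memory, disk and bandwidth):

```python
# Sanity check of the 7,200-active-database goal across 45 servers,
# assuming an even split (my assumption, not NuoDB's published math).
SERVERS = 45
GOAL_ACTIVE_DBS = 7200
MEM_PER_SERVER_MB = 8 * 1024  # each Moonshot server carries 8 GB

dbs_per_server = GOAL_ACTIVE_DBS / SERVERS           # 160 databases per server
mem_budget_mb = MEM_PER_SERVER_MB / dbs_per_server   # ~51 MB per TE/SM pair

print(dbs_per_server, round(mem_budget_mb, 1))
```

So each server would host about 160 active Transaction Engine/Storage Manager pairs with roughly 51 MB of memory headroom apiece, which makes the emphasis on light, long-tail workloads understandable.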

              When we need to start a database, we pick the server that’s least-utilized. We choose this based on local monitoring at each server that is rolled up through the management tier to the Connection Brokers. It’s simple to do given all that NuoDB already provides, and because we know what each server supports it lets us calculate a single capacity percentage.
              It gets better. Because a NuoDB database is made of an agile collection of processes, it’s very inexpensive to start or stop a database. So, in addition to monitoring for server capacity we also watch what’s going on inside each database, and if we think it’s been idle long enough that something else could use the associated resources more effectively we shut it down. In other words, if a database isn’t doing anything active we stop it to make room for other databases.
              When an SQL client needs to access that database, we simply re-start it where there are available resources. We call this mechanism hibernating and waking a database. This on-demand resource management means that while some number of databases are actively running at any moment, we can support a much larger number in total (remember, we’re talking about applications that exhibit a long-tail access pattern). With this capability, our original goal of 7,200 active databases translates into 72,000 total supported databases. On a single 4.3u System.
              The final piece we added is what we call database bursting. If a single database gets really popular it will start to take up too many resources on a single server. If you provision another server, separate from the Moonshot System, then we’ll temporarily “burst” a high-activity database to that new host until activity dies down. It’s automatic, quick and gives you on-demand capacity support when something gets suddenly hot.
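The placement, hibernation and bursting policy described above can be sketched as follows. The class and threshold names are hypothetical (NuoDB has not published this code), but the logic mirrors the text: start databases on the least-utilized server, hibernate idle ones, and burst hot ones to a separate host.

```python
from dataclasses import dataclass

@dataclass
class Server:
    name: str
    capacity_pct: float  # rolled-up utilization from local monitoring

class DensityScheduler:
    """Toy model of the policy described above (names are hypothetical)."""
    IDLE_SECONDS = 300    # inactivity before hibernating (assumed threshold)
    BURST_PCT = 80.0      # per-database load that triggers bursting (assumed)

    def __init__(self, servers, burst_host):
        self.servers = servers
        self.burst_host = burst_host

    def place(self):
        # New or waking databases start on the least-utilized server.
        return min(self.servers, key=lambda s: s.capacity_pct)

    def should_hibernate(self, idle_seconds):
        # Idle databases are shut down to free resources; a reconnecting
        # SQL client simply wakes them where capacity is available.
        return idle_seconds > self.IDLE_SECONDS

    def host_for(self, db_load_pct, home):
        # A suddenly popular database "bursts" to a separate host
        # until activity dies down.
        return self.burst_host if db_load_pct > self.BURST_PCT else home

servers = [Server("c1", 70.0), Server("c2", 35.0), Server("c3", 55.0)]
sched = DensityScheduler(servers, burst_host=Server("external", 0.0))
print(sched.place().name)                          # c2, the least utilized
print(sched.should_hibernate(idle_seconds=900))    # True
print(sched.host_for(95.0, home=servers[0]).name)  # external
```

The key design point the sketch captures is that a NuoDB database is a cheap-to-start collection of processes, so hibernate/wake/burst decisions can be made aggressively without a heavyweight migration step.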
              The Tests
              I’m not going to repeat too much here about how we drove our tests. That’s already covered in the discussion on how we’re trying to design a new kind of benchmark focused on density and efficiency. You should go check that out … it’s pretty neat. Suffice it to say, the really critical thing to us in all of this was that we were demonstrating something that solves a real-world problem under real-world load.
              You should also go read about how we setup and ran on a Moonshot System. The bottom-line is that the system worked just like you’d expect, and gave us the kinds of management and monitoring features to go beyond basic load testing.
              The Results
              We were really lucky to be given access to a full Moonshot System. It gave us a chance to test out our ideas, and we actually were able to do better than our target. You can see this in the view from our management interface running against a real system under our benchmark load: when we hit 7200 active databases we were only at about 70% utilization, so there was a lot more room to grow. Huge thanks to HP for giving us time on a real Moonshot System to see all those ideas work!

              Something that’s easy to lose track of in all this discussion is the question of power. Part of the value proposition from Project Moonshot is in energy efficiency, and we saw that in spades. Under load a single server only draws 18 Watts, and the system infrastructure is closer to 250 Watts. Taken together, that’s a seriously dense system that is using very little energy for each database.

              Bottom Line
              We were psyched to have the chance to test on a Moonshot System. It gave us the chance to prove out ideas around automation and efficiency that we’ll be folding into NuoDB over the next few releases. It also gave us the perfect platform to put our architecture through its paces and validate a lot about the flexibility of our core architecture.
              We’re also seriously impressed by what we experienced from Project Moonshot itself. We were able to create something self-contained and easy to manage that solves a real-world problem. Couple that with the fact that a Moonshot System draws so little power, the Total Cost of Ownership is impressively low.  That’s probably the last point to make about all this: the combination of our two technologies gave us something where we could talk concretely about capacity and TCO, something that’s usually hard to do in such clear terms.
              In case it’s not obvious, we’re excited. We’ve already been posting this week about some ideas that came out of this work, and we’ll keep posting as the week goes on. Look for the moonshot tag and please follow-up with comments if you’re curious about anything specific and would like to hear more!

              Project Moonshot by the Numbers [NuoDB Techblog, April 9, 2013]

              To really understand the value from HP Project Moonshot you need to think beyond the list price of one system and focus instead on the Total Cost of Ownership. Figuring out the TCO for a server running arbitrary software is often a hard (and thankless?) task, so one of the things we’ve tried to do is not just demonstrate great technology but something that naturally lets you think about TCO in a simple way. We think the final metrics are pretty simple, but to get there requires a little math.

              Executive Summary

              If you’re a CIO, and just want to know the bottom line, then we’ll ruin the suspense and cut to the chase. It will cost you about $70,500 up-front, $1,800 in your first year’s electricity bills and take 8.3 rack-units to support the web-front end and database back-end for 72,000 blogs under real-world load.

              Cost of a Single Database
              Recall that we set the goal at 72,000 databases within a single system. At launch the list price for a fully-configured Moonshot System is around $60,000, so we start out at 83 cents per-database. In practice we’re seeing much higher capacity in our tests, but let’s start with this conservative number.
              Now consider the power used by the system. From what we’ve measured through the iLO interfaces a single server draws no more than 18 Watts at peak load (measured against CPU and IO activity). The System itself (fans, switches etc.) draws around 250 Watts in our tests. That means that under full load each database is drawing about .015 Watts.
              NuoDB is a commercial software offering, which means that you pay up-front to deploy the software (and get support as part of that fee). For anyone who wants to run a Moonshot System in production as a super-dense NuoDB appliance we’ll offer you a flat-rate license.
              Put together, we can say that the cost per database-watt is 1.22 cents. That’s on a 4.3 rack-unit system. Awesome.
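The arithmetic behind those figures can be reproduced directly from the numbers quoted above (the small difference from the post's 1.22 cents comes from rounding the inputs first):

```python
# Reproducing the per-database cost arithmetic from the post's figures.
SYSTEM_PRICE_USD = 60_000          # approximate list price at launch
TOTAL_DBS = 72_000
SERVERS, WATTS_PER_SERVER, CHASSIS_WATTS = 45, 18, 250

price_per_db = SYSTEM_PRICE_USD / TOTAL_DBS              # ~$0.83 per database
total_watts = SERVERS * WATTS_PER_SERVER + CHASSIS_WATTS  # 1,060 W under load
watts_per_db = total_watts / TOTAL_DBS                   # ~0.015 W per database
cents_per_db_watt = price_per_db * watts_per_db * 100    # ~1.2 cents

print(round(price_per_db, 2), round(watts_per_db, 4))
```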
              Quantify the Supported Load
              As we discussed in our post on benchmarking, we’re trying to test under real-world load. As a simple starting-point we chose a profile based on WordPress because it’s fairly ubiquitous and has somewhat serious transactional requirements. In our benchmarking discussion we explain that a typical application action (post, read, comment) does around 20 SQL operations.
              Given 72,000 databases most of these are fairly inactive, so on average we’ll say that each database gets about 250 hits a day (generous by most reports I’ve seen). That’s 18,000,000 hits a day or 208 hits per-second. 4,166 SQL statements a second isn’t much for a single database, but it’s pretty significant given that we’re spreading it across many databases some of which might have to be “woken” on-demand.
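The aggregate load figures above are easy to verify:

```python
# Verifying the quoted aggregate load numbers.
TOTAL_DBS = 72_000
HITS_PER_DB_PER_DAY = 250        # generous long-tail average, per the post
SQL_OPS_PER_HIT = 20             # per-action operation count from the benchmark
SECONDS_PER_DAY = 24 * 3600

hits_per_day = TOTAL_DBS * HITS_PER_DB_PER_DAY   # 18,000,000 hits/day
hits_per_sec = hits_per_day / SECONDS_PER_DAY    # ~208 hits/s
sql_per_sec = hits_per_sec * SQL_OPS_PER_HIT     # ~4,166 SQL statements/s

print(hits_per_day, int(hits_per_sec), int(sql_per_sec))
```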
              HP was generous enough not only to give us time on a Moonshot System but also access to some co-located servers for driving our load tests. In this case, 16 lower-powered ARM-based Calxeda systems that all went through the same 1-Gig ethernet connection to our Moonshot System. These came from HP’s Discovery Lab; check out our post about working with the Moonshot System for more details.
From these load-drivers we were able to run our benchmark application with up to 16 threads per server, simulating 128 simultaneous clients. In this case a typical “client” would be a web server trying to respond to a web client request. We averaged around 320 hits per second, well above the target of 208. From what we could observe, we expect that given more capable network and client drivers we would be able to get 3 or 4 times that rate easily.
              Tangible Cost
              We have the cost of the Moonshot System itself. We also know that it can support expected load from a fairly small collection of low-end servers. In our own labs we use systems that cost around $10,000, fit in 3 rack-units and would be able to drive at least the same kind of load we’re citing here. Add a single switch at around $500 and you have a full system ready to serve blogs. That’s $70,500 total in 8.3 rack units, still under $1 per database.
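The totals can be tallied the same way. Note the 1U figure for the switch is inferred from the 8.3 rack-unit total (4.3U chassis + 3U drivers), not stated explicitly in the post:

```python
# Up-front cost and rack space for the full serving stack:
# Moonshot chassis + load-driving web servers + a switch.
moonshot_usd, moonshot_ru = 60_000, 4.3
drivers_usd, drivers_ru = 10_000, 3
switch_usd, switch_ru = 500, 1     # 1U inferred from the 8.3U total

total_usd = moonshot_usd + drivers_usd + switch_usd
total_ru = moonshot_ru + drivers_ru + switch_ru
cost_per_db = total_usd / 72_000

print(total_usd)                           # → 70500
print(total_ru)                            # → 8.3
print(f"${cost_per_db:.2f} per database")  # → $0.98 per database
```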
I don't know what power costs you have in your data center, but I've seen numbers ranging from 2.5 to 25 cents per kilowatt-hour. In our tests, where we saw 0.015 Watts per database, if you assume an average rate of 13.75 cents per kWh that comes out to 0.00020625 cents per hour per database in energy costs. In one year, with no down-time, that would cost you $1,276.77 in total electricity fees.
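A quick check of the electricity figure, assuming the full-load draw of the whole chassis (45 × 18 W + 250 W = 1,060 W) and the 13.75 cents/kWh mid-range rate:

```python
# Annual electricity cost for the whole chassis under full load.
TOTAL_WATTS = 45 * 18 + 250   # 1,060 W for the whole chassis
RATE_USD_PER_KWH = 0.1375     # mid-range of 2.5 to 25 cents
HOURS_PER_YEAR = 365 * 24     # assumes no down-time

kwh_per_year = TOTAL_WATTS / 1000 * HOURS_PER_YEAR
annual_cost = kwh_per_year * RATE_USD_PER_KWH
print(f"${annual_cost:,.2f} per year")   # → $1,276.77 per year
```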
Just as an aside, according to the New York Times, Facebook's data centers draw around 60,000,000 Watts!
              One of the great things about a Moonshot System is that the 45 servers are already being switched inside the chassis. This means that you don’t need to buy switches & cabling, and you don’t need to allocate all the associated space in your racks. For our systems administrator that alone would make him very happy.
              Intangible Cost
              What I haven’t been talking about in all of this are the intangible costs. This is where figuring out TCO becomes harder.
              For instance, one of the value-propositions here is that the Moonshot System is a self-contained, automated component. That means that systems administrators are freed up from the tasks of figuring out how to allocate and monitor databases, and how to size the data-center for growth. Database developers can focus more easily on their target applications. CIOs can spend less time staring at spreadsheets … or, at least, can allocate more time to spreadsheets on different topics.
              Providing a single number in terms of capacity makes it easy to figure out what you need in your datacenter. When a single server within a Moonshot System fails you can simply replace it, and in the meantime you know that the system will still run smoothly just with slightly lower capacity. From a provisioning point of view, all you need to figure out is where your ceiling is and how much stand-by capacity you need to have at the ready.
              NuoDB by its nature is dynamic, even when you’re doing upgrades. This means that you can roll through a running Moonshot System applying patches or new versions with no down-time. I don’t know how you calculate the value in saved cost here, but you probably do!
              Comparisons and Planned Optimizations
              It’s hard to do an “apples-to-apples” comparison against other database software here. Mostly, this is because other databases aren’t designed to be dynamic enough to support hibernation, bursting and capacity-based automated balancing. So, you can’t really get the same levels of density, and a lot of the “intangible” cost benefits would go away.
Still, to be fair, we tried running MySQL on the same system and under the same benchmarks. We could indeed run 7,200 instances, although that was already hitting the upper bounds of memory/swap. In order to get the same density you would need 10 Moonshot Systems, or you would need more powerful, expensive servers. Either way, the power, density, automation and efficiency savings go out the window, and obviously there's no support for bursting to more capable systems on-demand.
              Unsurprisingly, the response time was faster on-average (about half the time) from MySQL instances. I say “unsurprisingly” for two reasons. First, we tried to use schema/queries directly from WordPress to be fair in our comparison, and these are doing things that are still known to be less-optimized in NuoDB. They’re also in the path of what we’re currently optimizing and expect to be much faster in the near-term.
              The second is that NuoDB clients were originally designed assuming longer-running connections (or pooled connections) to databases that always run with security & encryption enabled. We ran all of our tests in our default modes to be fair. That means we’re spending more time on each action setting up & tearing down a connection. We’ve already been working on optimizations here that would shrink the gap pretty substantially.
              In the end, however, our response time is still on the order of a few hundred milliseconds worst-case, and is less important than the overall density and efficiency metrics that we proved out. We think the value in terms of ease of use, density, flexibility on load spikes and low-cost speaks for itself. This setup is inexpensive by comparison to deploying multiple servers and supports what we believe is real-world load. Just wait until the next generation of HP Project Moonshot servers roll out and we can start scaling out individual databases at the same time!

              More information:
              Benchmarking Density & Efficiency [NuoDB Techblog, April 9, 2013]
              Database Hibernation and Bursting [NuoDB Techblog, April 8, 2013]
An Enterprise Management UI for Project Moonshot [NuoDB Techblog, April 9, 2013]
Regarding the cloud-based version of NuoDB see:
              NuoDB Partners with Amazon [press release, March 26, 2013]
              NuoDB Extends Database Leadership in Scalability & Performance on a Private Cloud [press release, March 14, 2013] “… the industry’s first and only patented, elastically scalable Cloud Data Management System (CDMS), announced performance of 1.84 million transactions per second (TPS) running on 32 machines. … With NuoDB Starlings release 1.0.1, available as of March 1, 2013, the company has made advancements in performance and scalability and customers can now experience 26% improvement in TPS per machine.
              Google Compute Engine: interview with NuoDB [GoogleDevelopers YouTube channel, March 21, 2013]

              Meet engineers from NuoDB: an elastically scalable SQL database built for the cloud. We will learn about their approach to distributed SQL databases and get a live demo. We’ll cover the steps they took to get NuoDB running on Google Compute Engine, talk about how they evaluate infrastructure (both physical hardware and cloud), and reveal the results of their evaluation of Compute Engine performance.

Actually, Calxeda was best placed to explain the preeminence of software over the SoC itself:
              Karl Freund from Calxeda – HP Moonshot 2013 – theCUBE [siliconangle YouTube channel, April 8, 2013], see also HP Moonshot: It’s a lot closer than it looks! [Calxeda’s ‘ARM Servers, Now!’ blog, April 8, 2013]

              Karl Freund, VP of Marketing, Calxeda, at HP Moonshot 2013 with John Furrier and Dave Vellante.

as well as ending with Calxeda's very practical, gradual approach to the ARM-based server market with things like:

[16:03] Our 2nd generation platform, called Midway, which will be out later this year [in the 2nd half of the year], is probably the target for Big Data. Our current product is great for web serving, it's great for media serving, it's great for storage. It doesn't have enough memory for Big Data … in a large. So we'll be getting that 2nd generation product out, and that should be a really good Big Data platform. Why? Because it's low power, it's low cost, but it's also got a lot of I/O. Big Data is all about moving a lot of data around. And if you do that more cost effectively you save a lot of money. [16:38]

mentioning also that their strategy is to use standard ARM cores like the Cortex-A57 for their H1 2014 product, and to focus on things like the fabric and the management, which allows them to operate with a streamlined staff of around 150 people.

              Detailed background about Calxeda in a concise form:
              Redefining Datacenter Efficiency: An Overview of Calxeda’s architecture and early performance measurements [Karl Freund, Nov 12, 2012] from where the core info is:

                • Founded in 2008   
                • $103M Funding       
                • 1st Product Announced with HP,  Nov  2011   
                • Initial Shipments in Q2 2012   
                • Volume production in Q4 2012

              image

image
* The power consumed under normal operating conditions under full application load (i.e., 100% CPU utilization)

image
A small Calxeda Cluster: a Simple Example
              • Start with four ServerNodes
              • Consumes only 20W total power   
              • Connected via distributed fabric switches   
              • Connect up to 4 SATA drives per node   
              • Then scale this to thousands of ServerNodes

              EnergyCard: a Quad-Node Reference Design

                • Four-node reference platform from Calxeda
                • Available as product and/or design
                • Plugs into OEM system board with passive fabric, no additional switch HW
                  EnergyCard delivers 80Gb Bandwidth to the system board. (8 x 10Gb links)

              image

              image

              It is also important to have a look at what were the Open Source Software Packages for Initial Calxeda Shipments [Calxeda’s ‘ARM Servers, Now!’ blog, May 24, 2012]

              We are often asked what open-source software packages are available for initial shipments of Calxeda-based servers.

              Here’s the current list (changing frequently).  Let us know what else you need!

              image

Then Perspectives From Linaro Connect [Calxeda's 'ARM Servers, Now!' blog, March 20, 2013] sheds more light on the recent software alliances which help Calxeda deliver:

              – From Larry Wikelius,   Co-Founder and VP Ecosystems,  Calxeda:

              The most recent Linaro Connect (Linaro Connect Asia 2013 – LCA), held in Hong Kong the first week of March, really put a spotlight on the incredible momentum around ARM based technology and products moving into the Data Center.  Yes – you read that correctly – the DATA CENTER!

When Linaro was originally launched almost three years ago the focus was exclusively on the mobile and client market – where ARM has been and continues to be dominant. However, as Calxeda has demonstrated, the opportunity for the ARM architecture goes well beyond devices that you carry in your pocket. Calxeda was a key driver in the formation of the Linaro Enterprise Group (LEG), which was publicly launched at the previous Linaro Connect event in Copenhagen in early November, 2012.

              LEG has been an exciting development for Linaro and now has 13 member companies that include server vendors such as Calxeda, Linux distribution companies Red Hat and Canonical, OEM representation from HP and even Hyperscale Data Center end user Facebook.  There were many sessions throughout the week that focused on Server specific topics such as UEFI, ACPI, Virtualization, Hyperscale Testing with LAVA and Distributed Storage.  Calxeda was very active throughout the week with the team participating directly in a number of roadmap definition sessions, presenting on Server RAS and providing guidance in key areas such as application optimization and compiler focus for Servers.

Linaro Connect is proving to be a tremendous catalyst for the growing ecosystem around the ARM software community as a whole and the server segment in particular. A great example of this was the keynote presentation given jointly by Mark Heath and Lars Kurth from Citrix on Tuesday morning. Mark is the VP of XenServer at Citrix and Lars is well known in the OpenSource community for his work with Xen. The most exciting announcement coming out of Mark's presentation is that Citrix will be joining Linaro as a member of LEG. Citrix will certainly prove to be another valuable member of the Linaro team, and during the week attendees were able to appreciate how serious Citrix is about supporting ARM servers. The Xen team has not only added full support for ARM V7 systems in the Xen 4.3 release but they have accomplished some very impressive optimizations for the ARM platform. The Xen team has leveraged Device Tree for optimal device discovery. Combined with a number of other code optimizations they showed a dramatically smaller code base for the ARM platform. We at Calxeda are thrilled to welcome Citrix into LEG!

As an indication of the draw that the Linaro Connect conference is already having on the broader industry, the Open Compute Project (OCP) held their first International Event co-incident with LCA at the same venue. The synergy between Linaro and OCP is significant, with both organizations' emphasis on Open Source development (one software and one hardware) along with the dramatically changing design points for today's Hyperscale Data Center. In fact the keynote at LCA on Wednesday morning really put a spotlight on how significant this is likely to be. Jason Taylor, Director of Capacity Engineering and Analysis at Facebook, presented on Facebook's approach to ARM based servers. Facebook's consumption of Data Center equipment is quite stunning – Jason quoted from Facebook's 10-Q filed in October 2012 which stated that “The first nine months of 2012 … $1.0 billion for capital expenditures” related to data center equipment and infrastructure. Clearly with this level of investment Facebook is extremely motivated to optimize where possible. Jason focused on the strategic opportunity for ARM based servers in a disaggregated Data Center of the future to provide lower cost computing capabilities with much greater flexibility.

Calxeda has been very active in building the Server Eco-System for ARM based servers. This week in Hong Kong really underscored how important that investment has become – not just for Calxeda but for the industry as a whole. Our commitment to Open Source software development in general and Linaro in particular has resulted in a thriving Linux Infrastructure for ARM servers that allows Calxeda to leverage and focus on key differentiation for our end users. The Open Compute Project, which we are an active member in and have contributed to key projects such as the Knockout Storage design as well as the Open Slot Specification, demonstrates how the combination of an Open Source approach for both Software and Hardware can complement each other and drive Data Center innovation. We are early in this journey but it is very exciting!

              Calxeda will continue to invest aggressively in forums and industry groups such as these to drive the ARM based server market.  We look forward to continue to work with the incredibly innovative partners that are members in these groups and we are confident that more will join this exciting revolution.  If you are interested in more information on these events and activities please reach out to us directly at info@calxeda.com.

The next Linaro Connect is scheduled for early July in Dublin. We expect more exciting events and topics there and hope to see you there!

They also refer on their blog to Mobile, cloud computing spur tripling of micro server shipments this year [IHS iSuppli press release, Feb 6, 2013], which shows the general market situation well into the future:

              Driven by booming demand for new data center services for mobile platforms and cloud computing, shipments of micro servers are expected to more than triple this year, according to an IHS iSuppli Compute Platforms Topical Report from information and analytics provider IHS (NYSE: IHS).
              Shipments this year of micro servers are forecast to reach 291,000 units, up 230 percent from 88,000 units in 2012. Shipments of micro servers commenced in 2011 with just 19,000 units. However, shipments by the end of 2016 will rise to some 1.2 million units, as shown in the attached figure.

              image

              The penetration of micro servers compared to total server shipments amounted to a negligible 0.2 percent in 2011. But by 2016, the machines will claim a penetration rate of more than 10 percent—a stunning fiftyfold jump.
              Micro servers are general-purpose computers, housing single or multiple low-power microprocessors and usually consuming less than 45 watts in a single motherboard. The machines employ shared infrastructure such as power, cooling and cabling with other similar devices, allowing for an extremely dense configuration when micro servers are cascaded together.
              “Micro servers provide a solution to the challenge of increasing data-center usage driven by mobile platforms,” said Peter Lin, senior analyst for compute platforms at IHS. “With cloud computing and data centers in high demand in order to serve more smartphones, tablets and mobile PCs online, specific aspects of server design are becoming increasingly important, including maintenance, expandability, energy efficiency and low cost. Such factors are among the advantages delivered by micro servers compared to higher-end machines like mainframes, supercomputers and enterprise servers—all of which emphasize performance and reliability instead.”
              Server Salad Days
              Micro servers are not the only type of server that will experience rapid expansion in 2013 and the years to come. Other high-growth segments of the server market are cloud servers, blade servers and virtualization servers.
              The distinction of fastest-growing server segment, however, belongs solely to micro servers.
              The compound annual growth rate for micro servers from 2011 to 2016 stands at a remarkable 130 percent—higher than that of the entire server market by a factor of 26. Shipments will rise by double- and even triple-digit percentages for each year during the period.
              Key Players Stand to Benefit
              Given the dazzling outlook for micro servers, makers with strong product portfolios of the machines will be well-positioned during the next five years—as will their component suppliers and contract manufacturers.
              A slew of hardware providers are in line to reap benefits, including microprocessor vendors like Intel, ARM and AMD; server original equipment manufacturers such as Dell and Hewlett-Packard; and server original development manufacturers including Taiwanese firms Quanta Computer and Wistron.
              Among software providers, the list of potential beneficiaries from the micro server boom extends to Microsoft, Red Hat, Citrix and Oracle. For the group of application or service providers that offer micro servers to the public, entities like Amazon, eBay, Google and Yahoo are foremost.
              The most aggressive bid for the micro server space comes from Intel and ARM.
              Intel first unveiled the micro server concept and reference design in 2009, ostensibly to block rival ARM from entering the field.
              ARM, the leader for many years in the mobile world with smartphone and tablet chips because of the low-power design of its central processing units, has been just as eager to enter the server arena—dominated by x86 chip architecture from the likes of Intel and a third chip player, AMD. ARM faces an uphill battle, as the majority of server software is written for x86 architecture. Shifting from x86 to ARM will also be difficult for legacy products.
              ARM, however, is gaining greater support from software and OS vendors, which could potentially put pressure on Intel in the coming years.
              Read More > Micro Servers: When Small is the Next Big Thing
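The IHS shipment figures quoted above are easy to sanity-check; this sketch rederives the roughly 130 percent compound annual growth rate and the fiftyfold penetration jump from the unit numbers in the release:

```python
# Sanity-checking IHS's micro server growth figures:
# 19,000 units (2011) to 1.2 million units (2016).
units_2011 = 19_000
units_2016 = 1_200_000
years = 2016 - 2011

cagr = (units_2016 / units_2011) ** (1 / years) - 1
print(f"CAGR: {cagr:.0%}")               # → CAGR: 129%

# Penetration: 0.2% of all server shipments in 2011 to ~10% by 2016.
penetration_jump = 0.10 / 0.002
print(f"{penetration_jump:.0f}x jump")   # → 50x jump
```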

              Then there are a number of Intel competitive posts on Calxeda’s ‘ARM Servers, Now!’ blog:
              What is a “Server-Class” SOC? [Dec 12, 2012]
              Comparing Calxeda ECX1000 to Intel’s new S1200 Centerton chip [Dec 11, 2012]
which you can also find in my Intel targeting ARM based microservers: the Calxeda case ['Experiencing the Cloud' blog, Dec 14, 2012], with significantly wider additional information, up to binary translation from x86 to ARM with Linux

              See also:
              ARM Powered Servers: 2013 is off to a great start & it is only March! [Smart Connected Devices blog of ARM, March 6, 2013]
              Moonshot – a shot in the ARM for the 21st century data center [Smart Connected Devices blog of ARM, April 9, 2013]
              Are you running out of data center space? It may be time for a new server architecture: HP Moonshot [Hyperscale Computing Blog of HP, April 8, 2013]
              HP Moonshot: the HP Labs team that did some of the groundbreaking research [Innovation @ HP Labs blog of HP, April 9, 2013]
              HP Moonshot: An Accelerator for Hyperscale Workloads [Moor Insights White Paper, April 8, 2013]
Comparing Pattern Mining on a Billion Records with HP Vertica and Hadoop [HP Vertica blog, April 9, 2013] in which a team of HP Labs researchers shows how the Vertica Analytics Platform can be used to find patterns in a billion records in a couple of minutes, about 9x faster than Hadoop.
              PCs and cloud clients are not parts of Hewlett-Packard’s strategy anymore [‘Experiencing the Cloud’, Aug 11, 2011 – Jan 17, 2012] see the Autonomy IDOL related content there
              ENCO Systems Selects HP Autonomy for Audio and Video Processing [HP Autonomy press release, April 8, 2013]

              HP Autonomy today announced that ENCO Systems, a global provider of radio automation and live television audio solutions, has selected Autonomy’s Intelligent Data Operating Layer (IDOL) to upgrade ENCO’s latest-generation enCaption product.

              ENCO Systems provides live automated captioning solutions to the broadcast industry, leveraging technology to deliver closed captioning by taking live audio data and turning it into text. ENCO Systems is capitalizing on IDOL’s unique ability to understand meaning, concepts and patterns within massive volumes of spoken and visual content to deliver more accurate speech analytics as part of enCaption3.

              “Many television stations count on ENCO to provide real-time closed captioning so that all of their viewers get news and information as it happens, regardless of their auditory limitations,” said Ken Frommert, director, Marketing, ENCO Systems. “Autonomy IDOL helps us provide industry-leading automated closed captioning for a fraction of the cost of traditional services.”
enCaption3 is the only fully automated speech recognition-based closed captioning system for live television that does not require speaker training. It gives broadcasters the ability to caption their programming, including breaking news and weather, any time, day or night, since it is always on and always available. enCaption3 provides captioning in near real time (with only a 3 to 6 second delay) in nearly 30 languages.
“Television networks are under increasing pressure to provide real-time closed captioning services: they face fines if they don't, and their growing and diverse viewers demand it,” said Rohit de Souza, general manager, Power, HP Autonomy. “This is another example of a technology company integrating Autonomy IDOL to create a stronger, faster and more accurate product offering, and demonstrates yet another powerful way in which IDOL can be applied to help organizations succeed in the human information era.”

              Using Big Data to change the game in the Energy industry [Enterprise Services Blog of HP, Oct 24, 2012]

              … Tools like HP’s Autonomy that analyzes the unstructured data found in call recordings, survey responses, chat logs, e-mails, social media posts and more. Autonomy’s Intelligent Data Operating Layer (IDOL) technology uses sophisticated pattern-matching techniques and probabilistic modeling to interpret information in much the same way that humans do. …

              Stouffer Egan turns the tables on computers in keynote address at HP Discover [Enterprise Services Blog of HP, June 8, 2012]

For decades now, the human mind has adjusted itself to computers by providing and retrieving structured data in two-dimensional worksheets with constraints on format, data types, list of values, etc. But this is not the way the human mind has been architected to work. Our minds have the uncanny ability to capture the essence of what is being conveyed in a facial expression in a photograph, the tone of voice or inflection in an audio and the body language in a video. At the HP Discover conference, Autonomy's VP for the United States, Stouffer Egan, showed the audience how software can begin to do what the human mind has been doing since the dawn of time. In a demonstration where Iron Man came live out of a two-dimensional photograph, Egan turned the tables on computers. It is about time computers started thinking like us rather than forcing us to think like them.
              Egan states that the “I” in IT is where the change is happening. We have a newfound wealth of data through various channels including video, social, click stream, audio, etc. However, data unprocessed without any analysis is just that — raw data. For enterprises to realize business value from this unstructured data, we need tools that can process it across multiple media. Imagine software that recognizes the picture in a photograph and searches for a video matching the person in the picture. The cover page of a newspaper showing a basketball star doing a slam dunk suddenly turns live pulling up the video of this superstar’s winning shot in last night’s game. …


              2. Software Partners

              image
HP Moonshot is setting the roadmap for next generation data centers by changing the model for density, power, cost and innovation. Ubuntu has been designed to meet the needs of Hyperscale customers and, combined with its management tools, is ideally suited to be the operating system platform for HP Moonshot. Canonical has been working with HP since the beginning of the Moonshot Project, and Ubuntu is the only OS integrated and fully operational across the complete Moonshot System, covering x86 and ARM chip technologies.
              What Canonical is saying about HP Moonshot
              image
              As mobile workstyles become the norm, the scalability needs of today’s applications and devices are increasingly challenging what traditional infrastructures can support. With HP’s Moonshot System, customers will be able to rapidly deploy, scale, and manage any workload with dramatically lower space and energy constraints. The HP Pathfinder Innovation Ecosystem is a prime opportunity for Citrix to help accelerate the development of innovative solutions that will benefit our enterprise cloud, virtualization and mobility customers.
              image
              We’re committed to helping enterprises achieve the most from their Big Data initiatives. Our partnership with HP enables joint customers to keep and query their data at scale so they can ask bigger questions and get bigger answers. By using HP’s Moonshot System, our customers can benefit from the improved resource utilization of next generation data center solutions that are workload optimized for specific applications.
               
image
Today’s interactive applications are accessed 24×365 by millions of web and mobile users, and the volume and velocity of data they generate is growing at an unprecedented rate. Traditional technologies are hard pressed to keep up with the scalability and performance demands of these new applications. Couchbase NoSQL database technology combined with HP’s Moonshot System is a powerful offering for customers who want to easily develop interactive web and mobile applications and run them reliably at scale.
image
              Our partnership with HP facilitates CyWee’s goal of offering solutions that merge the digital and physical worlds. With TI’s new SoCs, we are one step closer to making this a reality by pushing state-of-the-art video to specialized server environments. Together, CyWee and HP will deliver richer multimedia experiences in a variety of cloud-based markets, including cloud gaming, virtual office, video conferencing and remote education.
              image
              HP’s new Moonshot System will enable organizations to increase the energy efficiency of their data centers while reducing costs. Our Cassandra-based database platform provides the massive scalability and multi-datacenter capabilities that are a perfect complement to this initiative, and we are excited to be working with HP to bring this solution to a wide range of customers.
              image
              Big data comes in a wide range for formats and types and is a result of the connected everything world we live in. Through Project Moonshot, HP has enabled a new class of infrastructure to run more efficient workloads, like Apache Hadoop, and meet the market demand of more performance for less.
              image
              The unprecedented volume and variety of data introduces unique challenges to organizations today… By combining the HP Moonshot system with Autonomy IDOL’s unique ability to understand concepts in information, organizations can dramatically reduce the cost, space, and energy requirements for their big data initiatives, and at the same time gain insights that grow revenue, reduce risk, and increase their overall Return on Information.
              image
              Big Data is not just for Big Companies – or Big Servers – anymore – it’s affecting all sectors of the market. At HP Vertica we’re very excited about the work we’ve been doing with the Moonshot team on innovative configurations and types of analytic appliances which will allow us to bring the benefits of real-time Big Data analytics to new segments of the market. The combination of the HP Vertica Analytics Platform and Moonshot is going to be a game-changer for many.
              image
              HP worked closely with Linaro to establish the Linaro Enterprise Group (LEG). This will help accelerate the development of the software ecosystem around ARM Powered servers. HP’s Moonshot System is a great platform for innovation – encouraging a wide range of silicon vendors to offer competing ‘plug-and-play’ server solutions, which will give end users maximum choice for all their different workloads.
What Linaro is saying about HP Moonshot [HewlettPackardVideos YouTube channel, April 8, 2013]
              image
              Organizations are looking for ways to rapidly deploy, scale, and manage their infrastructure, with an architecture that is optimized for today’s application workloads. HP Moonshot System is an energy efficient, space saving, workload-optimized solution to meet these needs, and HP has partnered with MapR Technologies, a Hadoop technology leader, to accelerate innovation and deployment of Big Data solutions.
              NuoDB and HP are shattering the scalability and density barriers of a traditional database server. NuoDB on the HP Moonshot System delivers unparalleled database density, where customers can now run their applications across thousands of databases on a single box, significantly reducing the total cost across hardware, software, and power consumption. The flexible architecture of HP Moonshot coupled with NuoDB’s hyper-pluggable database design and its innovative “database hibernation” technology makes it possible to bring this unprecedented hardware and software combination to market.
              What NuoDB is saying about HP Moonshot [HewlettPackardVideos YouTube channel, April 9, 2013]
              As the leading solution provider for the hosting market, Parallels is excited to be collaborating in the HP Pathfinder Innovation Ecosystem. The HP Moonshot System in concert with Parallels Plesk Panel and Parallels Containers provides a flexible and efficient solution for cloud computing and hosting.
              Red Hat Enterprise Linux on HP’s converged infrastructure means predictability, consistency and stability. Companies around the globe rely on these attributes when deploying applications every day, and our value proposition is just as important in the Hyperscale segment. When customers require a standard operating environment based on Red Hat Enterprise Linux, I believe they will look to the HP Moonshot System as a strong platform for high-density Hyperscale implementations.
              What Red Hat is saying about HP Moonshot [HewlettPackardVideos YouTube channel, April 8, 2013]
              HP Project Moonshot’s promise of extreme low-energy servers is a game changer, and SUSE is pleased to partner with HP to bring this new innovation to market. For more than twenty years, SUSE has adapted its enterprise-grade Linux operating system to achieve ever-increasing performance needs that succeed both today and tomorrow in areas such as Big Data and cloud computing.
              What SUSE is saying about HP Moonshot [HewlettPackardVideos YouTube channel, April 8, 2013]


              3. Hardware Partners

              AMD is excited to continue our deep collaboration with HP to bring extreme low-energy, ultra dense, specialized server solutions to the market. Both companies share a passion to bring innovative workload optimized solutions to the market, enabling customers to scale-out to new levels within existing energy and space constraints. The new low-power x86 AMD Opteron™ APU is optimized in the HP Moonshot System to dramatically lower TCO in quickly emerging media oriented workloads.
              What AMD is saying about HP Moonshot

              It is exciting to see HP take the lead in innovating low-energy servers for the cloud. Applied Micro’s ARM 64-bit X-Gene Server on a Chip will enable performance levels seen in today’s deployments while offering higher densities, greatly improved I/O, and substantial reductions in the total cost of ownership. Together, we will unleash innovation unlike anything we’ve seen in the server market for decades.

              What Applied Micro is saying about HP Moonshot

              In the current economic and power realities, today’s server infrastructure cannot meet the needs of the next billion data users, or the evolving needs of currently supported users. Customers need innovative SoC solutions which deliver more integration and optimization than has historically been required by traditional enterprise workloads. HP’s Moonshot System is a departure from the one size fits all approach of traditional enterprise and embraces a range of ARM partner solutions that address different performance, workloads and cost points.
What ARM is saying about HP Moonshot
Calxeda and HP’s new Moonshot System are a powerful combination, and set a new standard for ultra-efficient web and application serving. Fulfilling a journey started together in November 2011, Project Moonshot creates the foundation for the new age of application-specific computing.
              What Calxeda is saying about HP Moonshot
HP Moonshot System is a game changer for delivering optimized server solutions. It beautifully balances the need for mixing different processor solutions optimized for different workloads under a standard hardware and software framework. Cavium’s Project Thunder will provide a family of 64-bit ARM v8 processors with dense and scalable server class performance at extremely attractive power and cost metrics. We are doing this by blending performance and power efficient compute, high performance memory and networking into a single, highly integrated SoC.
              What Cavium is saying about HP Moonshot
              Intel is proud to deliver the only server class, 64-bit SoC technology that powers the first and only production shipping HP ProLiant Moonshot Server today. 64-bit Intel Atom processor S1200 family features extreme low power combined with required datacenter class capabilities for lightweight web scale workloads, such as low end dedicated hosting and static web serving. In collaboration with HP, we have a strong roadmap of additional server solutions shipping later this year, including Intel’s 2nd generation 64-bit SoC, “Avoton” based on leading 22nm manufacturing technology, that will deliver best in class energy efficiency and density for HP Moonshot System.
              What Intel is saying about HP Moonshot
What Marvell is saying about HP Moonshot
              HP Moonshot System’s high density packaging coupled with integrated network capability provides the perfect platform to enable HP Pathfinder Innovation Ecosystem partners to deliver cutting edge technology to the hyper-scale market. SRC Computers is excited to bring its history of delivering paradigm shifting high-performance, low-power, reconfigurable processors to HP Project Moonshot’s vision of optimizing hardware for maximum application performance at lowest TCO.
              What SRC Computers is saying about HP Moonshot
              The scalability and high performance at low power offered through HP’s Moonshot System gives customers an unmatched ability to adapt their solutions to the ever-changing and demanding market needs in the high performance computing, cloud computing and communications infrastructure markets. The strong collaboration efforts between HP and TI through the HP Pathfinder Innovation Ecosystem ensure that customers understand and get the most benefit from the processors at a system-level.
              What TI is saying about HP Moonshot

              Ubuntu and HTC in lockstep

Update at 18:05 CET: Both Ubuntu’s and HTC’s countdowns have ended, and there was no relationship between the two. Ubuntu, however, managed a clever piece of publicity this way. What Ubuntu is promising now is a touch-enhanced experience from a single binary, spanning tablets, the desktop and TV. It would even be possible to dock your Ubuntu smartphone to a larger touchscreen, upon which Ubuntu presents a tablet interface; add a keyboard and mouse to the tablet and it becomes a desktop PC, on which even Microsoft Windows applications can be run via one of the thin client solutions, and the presentation may even go to your TV screen.

This is what I observed today at 12:05 AM CET on ubuntu.com and htc.com:
image
image
What is going on? Here is the explanation from HTC HOSTING SPECIAL EVENT IN NYC & LONDON ON FEB 19, HINTS AT NEW M7 FLAGSHIP [UnleashThePhones.com, Jan 30, 2013] with the invitation:
Meanwhile, a table with a number of devices covered by cloth appeared yesterday on HTC’s social site, and one of them has a tablet-like shape:
image
(via Instagram). An interesting coincidence with the Ubuntu home page declaring:

              Tick, tock, tablet time!

              as seen on my lockstep screenshot above.
              What is this Ubuntu thing anyway?

              Ubuntu comes to the phone, with a beautifully distilled interface and a unique full PC capability when docked [Canonical press release, Jan 2, 2013]

                • Leading open PC platform with huge global following announces mobile version for network operators, OEMs and silicon vendors
                • Fast, beautiful interface for entry level smartphones
                • Unique PC experience on superphones when docked with a monitor, keyboard and mouse
                • Ubuntu raises the bar for mobile UI design, for richer and more immersive apps
                • A single OS for phone, PC and TV
              Canonical today announced a distinctive smartphone interface for its popular operating system, Ubuntu, using all four edges of the screen for a more immersive experience. Ubuntu uniquely gives handset OEMs and mobile operators the ability to converge phone, PC and thin client into a single enterprise superphone.
              “We expect Ubuntu to be popular in the enterprise market, enabling customers to provision a single secure device for all PC, thin client and phone functions. Ubuntu is already the most widely used Linux enterprise desktop, with customers in a wide range of sectors focused on security, cost and manageability” said Jane Silber, CEO of Canonical. “We also see an opportunity in basic smartphones that are used for the phone, SMS, web and email, where Ubuntu outperforms thanks to its native core apps and stylish presentation.”
              Ubuntu is aimed at two core mobile segments: the high-end superphone, and the entry-level basic smartphone, helping operators grow the use of data amongst consumers who typically use only the phone and messaging but who might embrace the use of web and email on their phone. Ubuntu also appeals to aspirational prosumers who want a fresh experience with faster, richer performance on a lower bill-of-materials device.

              The handset interface for Ubuntu introduces distinctive new user experiences to the mobile market, including:

                • Edge magic: thumb gestures from all four edges of the screen enable users to find content and switch between apps faster than other phones.
                • Deep content immersion – controls appear only when the user wants them.
                • A beautiful global search for apps, content and products.
                • Voice and text commands in any application for faster access to rich capabilities.
                • Both native and web or HTML5 apps.
                • Evolving personalised art on the welcome screen.
              Ubuntu offers compelling customisation options for partner apps, content and services. Operators and OEMs can easily add their own branded offerings. Canonical’s personal cloud service, Ubuntu One, provides storage and media services, file sharing and a secure transaction service which enables partners to integrate their own service offerings easily.
              Canonical makes it easy to build phones with Ubuntu. The company provides engineering services to offload the complexity of maintaining multiple code bases which has proven to be a common issue for smartphone manufacturers, freeing the manufacturer to focus on hardware design and integration. For silicon vendors, Ubuntu is compatible with a typical Android Board Support Package (BSP). This means Ubuntu is ready to run on the most cost-efficient chipset designs.
              In bringing Ubuntu to the phone, Canonical is uniquely placed with a single operating system for client, server and cloud, and a unified family of interfaces for the phone, the PC and the TV. “We are defining a new era of convergence in technology, with one unified operating system that underpins cloud computing, data centers, PCs and consumer electronics” says Mark Shuttleworth, founder of Ubuntu and VP Products at Canonical.
              Canonical currently serves the leading PC OEMs: ASUS, Dell, HP, and Lenovo all certify the majority of their PCs on Ubuntu and pre-install it in global markets. Over 20 million desktop PCs run the OS today, and Canonical estimates that close to 10% of the world’s new desktops and laptops will ship with Ubuntu in 2014. Ubuntu is also wildly popular as a server platform, the number one server OS on the key major public clouds and the leading host OS for OpenStack, the open source IAAS.

With that Canonical achieved something much bigger: Ubuntu for phones – Industry proposition [celebrateubuntu YouTube channel, Jan 2, 2013]

              Watch Ubuntu founder Mark Shuttleworth explain Ubuntu’s mobile strategy and what it offers industry partners.

So this is the Ubuntu thing, most probably to be expanded to tablets as well later today.

              Will add that information as released in a couple of hours or so!


              Ubuntu unveils tablet experience with multi-tasking [Canonical press release, Feb 19, 2013]

              • Unique ‘side stage’ multi-tasking puts phone and tablet apps on a single tablet screen
              • Secure enterprise tablets with full disk encryption, multiple secure user accounts and standard management tool that covers Ubuntu server, PC and touch
              • Unique convergence across all four form factors: a phone can provide tablet, TV and PC interfaces when docked to the appropriate screen / keyboard / remote
              Canonical today presented Ubuntu’s tablet interface – the next step towards one unified family of experiences for personal computing on phones, tablets, PCs and TVs.
              “Multi-tasking productivity meets elegance and rigorous security in our tablet experience,” said Mark Shuttleworth, founder of Ubuntu and Canonical. “Our family of interfaces now scales across all screens, so your phone can provide tablet, PC and TV experiences when you dock it. That’s unique to Ubuntu and it’s the future of personal computing.”
              “Fashion industry friends say the Ubuntu phone and tablet are the most beautiful interfaces they’ve seen for touch,” said Ivo Weevers, who leads the Canonical design team. “We’re inspired by the twin goals of style and usability, and working with developers who are motivated to create the best possible experience for friends, family and industry.”

              The new tablet design doesn’t just raise the bar for elegant presentation, it breaks new ground in design and engineering, featuring:

              • Real multitasking: Uniquely, Ubuntu allows a phone app on the screen at the same time as a tablet app. The Ubuntu side stage was invented both to enable efficient multitasking and to improve the usability of phone apps on tablets.
              • Secure multi-user: Multiple accounts on one tablet with full encryption for personal data, combined with the trusted Ubuntu security model that is widely used in banks, governments and sensitive environments, making it ideal for work and family use.
              • Voice controlled HUD productivity: The Heads-Up Display, unique to Ubuntu, makes it fast and easy to do complex things on touch devices, and transforms touch interfaces for rich applications, bringing all the power of the PC to your tablet.
              • Edge magic for cleaner apps: Screen edges are used for navigation between apps, settings and controls. That makes for less clutter, more content, and sleeker hardware. No physical or soft buttons are required. It’s pure touch elegance.
              • Content focus: Media is neatly presented on the customisable home screen, which can search hundreds of sources. Perfect for carriers and content owners that want to highlight their own content, while still providing access to a global catalogue.
              • Full convergence: The tablet interface is presented by exactly the same OS and code that provides the phone, PC and TV interfaces, enabling true device convergence. Ubuntu is uniquely designed to scale smoothly across all form factors.
              The Ubuntu tablet interface supports screen sizes from 6″ to 20″ and resolutions from 100 to 450 PPI. “The tablet fits perfectly between phone and PC in the Ubuntu family,” says Oren Horev, lead designer for the Ubuntu tablet experience. “Not only do we integrate phone apps in a distinctive way, we shift from tablet to PC very smoothly in convergence devices.”
              On high end silicon, Ubuntu offers a full PC experience when the tablet is docked to a keyboard, with access to remote Windows applications over standard protocols from Microsoft, Citrix, VMware and Wyse. “An Ubuntu tablet is a secure thin client that can be managed with the same tools as any Ubuntu server or desktop,” said Stephane Verdy, who leads enterprise desktop and thin client products at Canonical. “We are delighted to support partners on touch and mobile thin clients for the enterprise market.”
              Even without chipset-specific optimisation, Ubuntu performs beautifully on entry level hardware. “Our four-year engagement with ARM has shaped Ubuntu for mobile” said Rick Spencer, VP Ubuntu Engineering at Canonical. “We benefit from the huge number of contributing developers who run Ubuntu every day, many of whom are moving to touch devices as their primary development environment.”
              For silicon vendors, Ubuntu is compatible with any Linux-oriented Board Support Package (BSP). This means Ubuntu is easy to enable on most chipset designs that are currently running Android. Ubuntu and Android are the two platforms enabled by Linaro members.
              The Touch Developer Preview of Ubuntu will be published on the 21st February 2013 with installation instructions for the Nexus 7 and Nexus 10 tablet devices as well as smartphones such as the Nexus 4 and Galaxy Nexus. Installable images and source code will be available from developer.ubuntu.com.
              The Preview SDK, which currently supports phone app development, will now be updated to support tablet apps as well. Uniquely, on Ubuntu, developers can create a single application that works on the phone, tablet, PC and TV because it is the same system and all services work across all form factors.
              Visit us at Mobile World Congress: Booth Number: 81D30, App Planet Hall 8.1. The Canonical team will be available to install Ubuntu on your phones and tablets at Mobile World Congress. Note: Ubuntu Touch Developer Preview is a developer build and not a consumer-ready release.

              Ubuntu for tablets – Full video [celebrateubuntu YouTube channel, Feb 19, 2013]

              Watch Ubuntu founder Mark Shuttleworth explain Ubuntu for tablets and what it offers industry partners.

              Touch Developer Preview of Ubuntu to be published on 21 February 2013 [Canonical press release, Feb 15, 2013]

                • Touch Developer Preview of Ubuntu for Galaxy Nexus and Nexus 4 will be available
                • Daily update mechanism to follow progress in Ubuntu
                • Canonical will flash phones at MWC for industry, developers and enthusiasts
                • Preview SDK and App Design Guides already available for developers building touch apps for Ubuntu
              Images and open source code for the Touch Developer Preview of Ubuntu will be published on Thursday 21st February, supporting the Galaxy Nexus and Nexus 4 smartphones.
They are intended for enthusiasts and developers, to familiarise themselves with Ubuntu’s smartphone experience and develop applications on spare handsets. Tools that manage the flashing of the phone will be available on the same day in the Ubuntu archives, making it easy to keep a device up to date with the latest version of the Touch Developer Preview.
              Attendees of Mobile World Congress (MWC) in Barcelona, 25th – 28th February can have their phones flashed to Ubuntu by Canonical team members at the Ubuntu stand, booth number 81D30, App Planet Hall 8.1, where Ubuntu will be shown on a range of devices.
The code release is a milestone in the development program for Ubuntu’s phone experience, and enables developers to port the platform to other devices. “Our platform supports a wide range of screen sizes and resolutions. Developers who have experience bringing up phone environments will find it relatively easy to port Ubuntu to current handsets” said Pat McGowan, who leads the integration effort that produced the images being released. “We look forward to adding support for additional devices for everyday testing and experimentation.”
              The install process and supported device list are maintained at wiki.ubuntu.com/TouchInstallProcess and will be updated as new devices are added.
The release also marks the start of a new era for Ubuntu, with true convergence between devices. When complete, the same Ubuntu code will deliver a mobile, tablet, desktop or TV experience depending on the device it is installed on, or where it is docked. Ubuntu 13.10 (due in October) will include a complete entry-level smartphone experience.
              Canonical has published a Preview SDK and App Design Guides to allow developers to create applications for the full range of Ubuntu platforms. The toolkit provides a range of documented templates to enable native applications to be created quickly and easily. The App Design Guides explain how these templates can be used to design and build beautiful and usable apps. Blackberry Touch developers will be familiar with the Qt/QML environment, which supports rich native touch apps. Developers will not need to cross-compile or package applications differently for phone, tablet, PC and TV. One platform serves all four, a single application binary can do the same.
              On Ubuntu, native and web or HTML5 applications sit as equal citizens and so those developers already developing HTML5 applications will easily gain support for Ubuntu.
              “This release marks the threshold of wider engagement – both with industry and community.” says Mark Shuttleworth, founder of Ubuntu. “For developers, contributors and partners, there is now a coherent experience that warrants attention. The cleanest, most stylish mobile interface around.”

              Availability:
              Go to wiki.ubuntu.com/TouchInstallProcess to download Touch Developer Preview of Ubuntu from Thursday 21st February.
              Go to developer.ubuntu.com to download the SDK to develop applications for Ubuntu.
              Go to http://design.ubuntu.com/apps to read the Apps Design Guide giving advice about designing and building beautiful and usable apps for Ubuntu on the phone.
              Visit Canonical at Mobile World Congress: Booth Number: 81D30, App Planet Hall 8.1.

              Introducing the New HTC One [HTC YouTube channel, Feb 19, 2013]

With a sleek aluminum body, a live home screen that streams all of your favorite content, a photo gallery that comes to life, and dual frontal stereo speakers, the New HTC One is ready to reshape your smartphone experience.
                We introduced the brand new HTC One to the world in London and New York on 19 February, 2013. This is the full press conference led by HTC CEO Peter Chou in London

HTC BlinkFeed™, HTC Zoe™ and HTC BoomSound™ Deliver HTC One’s Unprecedented New Smartphone Experience

HTC, a global leader in mobile innovation and design, today announced its new flagship smartphone, the new HTC One. Crafted with a distinct zero-gap aluminium unibody, the new HTC One introduces HTC BlinkFeed™, HTC Zoe™ and HTC BoomSound™, key new HTC Sense® innovations that reinvent the mobile experience and set a new standard for smartphones.
                “People today immerse themselves in a constant stream of updates, news and information. Although smartphones are one of the main ways we stay in touch with the people and information we care about, conventional designs have failed to keep pace with how people are actually using them,” said Peter Chou, CEO of HTC Corporation. “A new, exciting approach to the smartphone is needed and with the new HTC One, we have re-imagined the mobile experience from the ground up to reflect this new reality.”
                HTC BlinkFeed: A personal live stream right on the home screen
                At the centre of the new HTC One experience is HTC BlinkFeed. HTC BlinkFeed is a bold new experience that transforms the home screen into a single live stream of personally relevant information such as social updates, entertainment and lifestyle updates, news and photos with immersive images so that people no longer need to go to separate applications to find out what’s happening. HTC BlinkFeed aggregates the freshest content from the most relevant and interesting sources, giving it to people at a glance, all in one place, without the need to jump between multiple applications and web sites.
                To enable this new dynamic approach to the smartphone, HTC will provide both local and global content from more than 1,400 media sources with more than 10,000 articles per day from some of the most innovative media companies, such as the AOL family of media properties, ESPN, MTV, Vice Media, CoolHunting, Reuters and many others. For more information on HTC BlinkFeed’s content partners, visit the HTC Blog.
                HTC UltraPixel Camera with HTC Zoe
                The breakthrough HTC UltraPixel Camera redefines how people capture, relive and share their most precious moments. HTC Zoe gives people the ability to shoot high-res photos that come to life in three-second snippets. These Zoes, photos and videos are then displayed in a unique way that brings the gallery to life and transforms the traditional photo gallery of still images into a motion gallery of memories. It also automatically creates integrated highlight films from each event comprised of Zoes, photos and videos set to music with professionally designed cuts, transitions and effects. These highlight videos can be remixed or set to different themes, and can be easily shared on social networks, email and other services.
To enable this innovative camera experience, HTC developed a custom camera that includes a best-in-class f/2.0 aperture lens and a breakthrough sensor with UltraPixels that gather 300 percent more light than traditional smartphone camera sensors. This new approach also delivers astounding low-light performance and a variety of other improvements to photos and videos. In addition, the perfect self-portrait or video is just a tap away with an ultra-wide angle front-facing camera which supports 1080p video capture. Multi-axis optical image stabilisation for both the front and rear cameras also helps ensure video footage is smoother whether stationary or on the move. The HTC UltraPixel camera adds many other features and effects such as enhanced 360° panorama, time sequencing and object removal.
                HTC BoomSound
The new HTC One offers the best audio experience of any mobile phone available today. HTC BoomSound introduces, for the first time on a phone, front-facing stereo speakers with a dedicated amplifier and an amazing full HD display that immerses people in music, videos, games and the YouTube™ clips they love. Beats™ Audio integration is enabled across the entire experience for rich, authentic sound whether you’re listening to your favorite music, watching a YouTube video or playing a game.
HDR recording uses advanced dual microphones and audio processing to capture clean, rich sound that is worthy of high-definition video footage. Phone calls sound great on HTC One thanks to the addition of HTC Sense Voice™, which boosts the call volume and quality in noisy environments so that conversations come through loud and clear.
                HTC Sense TV
                HTC Sense TV transforms the new HTC One into an interactive program guide and remote control for most TVs, set-top boxes and receivers. Tapping the power of the cloud, Sense TV makes it simple and intuitive to see what’s on and find that favourite show.
                Metal Unibody Design
                Wrapped in a zero-gap aluminum unibody and sporting a brilliant 4.7”, Full HD (1080p) screen, the new HTC One features the latest Android Jelly Bean operating system and LTE network technology to offer blazingly-fast browsing in a package that combines premium design with breakthrough build quality.
                Available in stunning silver and beautiful black, the sleek and crafted aluminum unibody sits comfortably in the hand and showcases HTC’s unique antenna technology, which helps people achieve a crystal clear signal. The display also resists scratches and reduces glare, whilst offering incredible 468ppi resolution and rich, natural colours.
                Global Availability
                The new HTC One will be available globally through more than 185 mobile operators and major retailers in more than 80 regions and countries beginning in March. For more information and to pre-register for the new HTC One, visit http://www.htc.com.
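The UltraPixel claim above is simple geometry: light gathered per pixel scales with pixel area, i.e. with the square of the pixel pitch. A minimal Python sanity check, assuming the widely reported 2.0 µm UltraPixel pitch against an assumed 1.0 µm pitch for a conventional high-megapixel sensor (both figures are illustrative assumptions, not from the press release):

```python
# Light gathered per pixel scales with pixel area (pitch squared).
def extra_light_percent(pixel_um: float, reference_um: float) -> float:
    """Percent more light per pixel versus a reference pixel pitch."""
    area_ratio = (pixel_um / reference_um) ** 2
    return (area_ratio - 1) * 100

# Assumed pitches: 2.0 um "UltraPixel" vs. a 1.0 um conventional pixel.
print(extra_light_percent(2.0, 1.0))  # 300.0 -> matches "300 percent more light"
```

Against a 1.1 µm reference the figure would be closer to 230 percent, so the marketing number evidently assumes a 1.0 µm baseline.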

Analysis: Michael Dell acquiring the remaining 84% stake in Dell for $2.15B in cash, before becoming the next IBM, and even getting the cash back after the transaction

OR Michael Dell’s new cash skimming strategy via privatization, targeting the high-growth, fast-moving SME/SMB (small to medium-sized) businesses with solutions worldwide, which will later help the adoption of Dell solutions by larger enterprises as well. OR how to exploit Dell’s competitive advantage of having NO legacy (“old things”/“old”) business in the enterprise market versus established enterprise solution players like IBM, HP, Oracle et al. OR the story of leaving its traditional PC business behind, and how the explosion of consumer IT devices and the consumerization of IT is playing well with Dell’s focus on this specific range of small to large enterprises. OR Michael’s way of showing a fig to all stock market actors (the diversity of “analysts” included), inspired by his thinking ‘You are utterly stupid, and will remain so’. OR the huge bonus for creating tremendous value in the next 6 years he’ll lead the company again, as described in the details sections of this post, as well as earlier in the Pre-Commerce and the Consumerization of IT [Sept 10, 2011] and Thin/Zero Client and Virtual Desktop Futures [May 30, 2012] posts on this same blog. OR, in the very worst case, getting a normal valuation (sooner or later) of his 16% of shares.

ANYWAY Michael will become hyper-rich. As a minimum, think of attaining a $36B value instead of the current $3.8B for his 16% share of Dell when the company indeed becomes the next IBM. This is absolutely possible within no more than the next 6 years, during which he will continue to lead Dell. See more about all that in the first section of this post titled:

                Michael Dell: We are not a PC company anymore

Update: Highlights From Dell Tech Camp 2013 [DellVlog YouTube channel, Feb 12, 2013] provides the latest, and only 3 minutes long, glimpse into the current state of such a “non-PC company anymore”

The event, now in its fourth year, featured:
• Dell’s latest technologies and solutions that address customer issues and challenges around Cloud Computing, Data Insights, Mobility and Converged Infrastructure
• Speakers from Dell including Marius Haas, President of Enterprise Solutions; Aongus Hegarty, President, Dell EMEA; and Tony Parkinson, Vice President, EMEA Enterprise Solutions, alongside a number of Dell solutions experts, customers and partners
• Hands-on, deep-dive sessions around Dell’s latest Cloud, Storage, Mobility and Convergence solutions
• Customer and partner insight on the latest enterprise technology challenges and trends
• Two live-streamed Think Tank events which bring together some of the industry’s principal thought leaders to discuss Converged Infrastructure and enterprise solutions for SMBs

                Here is a slide copy which speaks for itself in showing the difference:
                image

                Then read the second section of this post titled:

                The Indian case as a proofpoint of readiness

                Before those detailed background sections I should elaborate somewhat more on the founder’s cash-skimming approach. Michael Dell’s classical business recipe was to collect the bills ahead of paying his suppliers. What was possible in the ’90s is no longer. Nevertheless: Dell Push-Pull Supply Chain Strategy [Ian Johnson YouTube channel, June 11, 2012]

                http://www.driveyoursuccess.com this video explains how to run Dell’s Push-Pull supply chain strategy.
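That bill-collecting recipe is usually expressed as a negative cash conversion cycle: CCC = days of inventory outstanding + days sales outstanding − days payables outstanding. A minimal sketch, with hypothetical illustrative numbers for a build-to-order profile (not Dell’s actual figures):

```python
def cash_conversion_cycle(dio: float, dso: float, dpo: float) -> float:
    """CCC in days: inventory days + receivables days - payables days.

    A negative result means customers' cash arrives before suppliers
    must be paid -- the business is financed by its own supply chain.
    """
    return dio + dso - dpo

# Hypothetical build-to-order profile: little inventory, fast collection,
# long supplier payment terms.
print(cash_conversion_cycle(dio=5, dso=25, dpo=60))  # negative: -30 days
```

The pull side of the model (build only after the order is taken) is what keeps the inventory-days term small enough for the cycle to go negative.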

                Now he has decided to apply the original idea to the current state of Dell’s business. This was the sole reason for his one-and-a-half-year effort to take Dell private, in which he succeeded 3 days ago. The official press release, certainly, has no mention of that at all, just the usual bullshit:

                Dell Enters Into Agreement to Be Acquired By Michael Dell and Silver Lake [press release, Feb 5, 2013]

                Mr. Dell said: “I believe this transaction will open an exciting new chapter for Dell, our customers and team members. We can deliver immediate value to stockholders, while we continue the execution of our long-term strategy and focus on delivering best-in-class solutions to our customers as a private enterprise. Dell has made solid progress executing this strategy over the past four years, but we recognize that it will still take more time, investment and patience, and I believe our efforts will be better supported by partnering with Silver Lake in our shared vision. I am committed to this journey and I have put a substantial amount of my own capital at risk together with Silver Lake, a world-class investor with an outstanding reputation. We are committed to delivering an unmatched customer experience and excited to pursue the path ahead.”

                An opinion a little bit closer to the real aim:
                Dell Computers In Buyout Bid By Firm’s Founder [spworldnews YouTube channel, Feb 5, 2013]

                With an attached background article: Dell Heads For Radical Restructure

                Dell Computers was built from scratch in a college dorm room, and now its founder launches a $24.4bn bid to make the firm private. Once-dominant US computer company Dell has unveiled a £15.5bn plan to take the firm private in a buyout by founder Michael Dell. The firm said it had signed “a definitive merger agreement” that gives shareholders $13.65 (£8.70) per share in cash – a premium of 25% over Dell’s January 11 closing share price.
                “I believe this transaction will open an exciting new chapter for Dell, our customers and team members,” Mr Dell said.
                The deal was unveiled with investment firm Silver Lake, and backed by a $2bn (£1.27bn) loan from Microsoft. Dell shares dropped 2.6% to $13.27 on the Nasdaq after the plan was announced. The move, which would de-list the company from stock markets, could ease some of the pressure on Dell, which is cash-rich but has been seeing profits slump.
                The Texas-based computer maker, which Mr Dell started in his college dormitory room, once topped a market capitalisation of $100bn (£63bn) as the world’s biggest PC producer.
                The plan is subject to several conditions, including a vote of unaffiliated stockholders. It calls for a “go shop” period to allow shareholders to determine if there is a better offer.
                “We can deliver immediate value to stockholders, while we continue the execution of our long-term strategy and focus on delivering best-in-class solutions to our customers as a private enterprise,” Mr Dell said of the plan.
                Dell was a pioneer of phone-ordered, custom built PCs in Britain during the 1990s.
                The company worked from facilities in the Irish Republic, where Britons were able to specify their hardware and software requirements before machines were delivered to their homes.

                But a realistic assessment I’ve found only in this source:
                Here’s The Secret Private-Equity Plan For Dell by Henry Blodget [Daily Ticker on Yahoo! Finance, Feb 6, 2013] CLICK THROUGH TO THE LINK, AS THERE IS A VERY GOOD VIDEO RECORDING OF THE DISCUSSION BETWEEN DAILY TICKER’S HOSTS AARON TASK AND HENRY BLODGET

                Earlier, I wrote about what Dell was likely to do now that it is taking itself private.

                I suggested that Michael Dell and his private-equity backers would coin money, in part by paying themselves a huge one-time dividend with the cash sitting on Dell’s balance sheet.

                I also bemoaned the fact that Michael Dell had to take his company private to coin this money instead of executing his plan as a public company and sharing the loot with his current shareholders.

                More broadly, I complained that too few public-company management teams (like Dell’s) have the balls to tell short-term public-market investors to take a hike and implement long-term strategic plans.

                And that is indeed a bummer.

                But it’s also the reality.

                Most public-company management teams are so cowed by Wall Street’s short-term demands that they sacrifice the vision and cojones that enabled them to build big public companies in the first place. And then they just manage their companies from quarter to quarter while avoiding the tough, ballsy decisions that separate great companies from good ones.
                Anyway, Dell has decided to go private.
                So the questions are:
                • Why is Dell going private?
                • What is Dell going to do as a private company?
                Earlier, I speculated about what a generic private-equity firm might do with Dell after taking it private.
                I have since spoken with sources familiar with the specific Dell situation. So I have some better information.
                Here’s what the sources told me:

                • Dell is going private because the company is in the middle of a 5-year transformation from “PC manufacturer” to “single-source provider of corporate cloud and security solutions” (sort of a mini-HP or mini-IBM model) and the market is giving it no credit for that transformation. The company feels it has been making good progress on its transformation, but management is worried about meeting quarterly targets and other milestones that are slowing the transformation down. And the stock just keeps dropping. So Michael Dell and Silver Lake felt there was an opportunity to be bolder and more aggressive with Dell as a private company.


                • Silver Lake and Michael Dell are borrowing about $17 billion of the $24 billion Dell purchase price ($15 billion from banks and $2 billion from Microsoft), which means they are temporarily putting up about $7 billion of equity capital. Dell has $15 billion of cash sitting in the bank. So it seems highly likely–we’ll know in 45 days, when the SEC filing appears–that Silver Lake and Dell will pay themselves a big dividend to cover their cash investment. After that point, they’ll be playing with house money. (Correct–it doesn’t suck to be in the private-equity business!).

                • The secret plan for Dell is NOT to fire thousands of people and chop the company up and sell off the parts. Sure, some folks might get fired and some divisions might get sold. But the plan is to invest in the company’s product suite, R&D, pricing*, and marketing capabilities, thus accelerating Dell’s transformation into a solutions provider. This investment will temporarily reduce the company’s free cash flow and profits, which public-market investors might (stupidly) have freaked out about. This was one of the reasons Michael Dell wanted to take the company private.

                • Dell’s plan is to focus on selling its solutions to mid-market companies (~500 employees [more precisely to companies with 215-2,000 employees, see the details in the first “Michael Dell: We are not a PC company anymore” section of my analysis]), not the gigantic Fortune 500 companies that are already well-served by IBM, HP, and other huge “solutions” providers. By providing comprehensive solutions for cloud and security to companies that are not currently well-served, Dell also hopes to increase demand for PCs at these companies–PCs that Dell will obviously provide.
                The private-equity firm backing Dell, Silver Lake, has a long history of investing in troubled tech companies, and it has posted excellent returns over the years. Silver Lake’s target investment time horizon is about 5 years, which is about 100 times longer than the time horizon of the typical public-market investor. So Silver Lake is willing to depress Dell’s earnings and cash flow for a couple of years while investing heavily to transform the company – thus, hopefully, creating a more valuable Dell over the long term.
                That said, Dell’s competitor HP is not so optimistic and had these crushing statements about Dell’s turnaround:
                “Dell has a very tough road ahead. The company faces an extended period of uncertainty and transition that will not be good for its customers. And with a significant debt load, Dell’s ability to invest in new products and services will be extremely limited. Leveraged buyouts tend to leave existing customers and innovation at the curb. We believe Dell’s customers will now be eager to explore alternatives, and HP plans to take full advantage of that opportunity.”
                Public market investors and wimpy management teams take note: Your obsession with quarterly performance creates the opportunity for firms like Silver Lake to come along and buy your companies on the cheap, thus coining money for their private-market investors. In short, your quarterly earnings obsession is ruining companies and destroying value. So grow a pair, tell Wall Street to be patient, and focus on creating value for the long term!
                * What I mean by “investing in pricing” is cutting prices on hardware and, thus, reducing profit per unit. This will hurt profit margins but make the company’s solutions more attractive to customers. And given that the focus is now on “solutions,” they’ll be looking to sell the hardware at closer to cost and then make money on add-on software and services.
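The buyout arithmetic Blodget describes can be sketched in a few lines (all amounts in $ billions, taken from the figures quoted above; a back-of-the-envelope illustration, not a statement of the actual deal terms):

```python
# Financing structure of the ~$24.4B buyout, per the sources quoted above.
purchase_price = 24.4
bank_debt = 15.0        # term loans from the banks
microsoft_loan = 2.0    # Microsoft's subordinated loan

total_debt = bank_debt + microsoft_loan       # ~17 borrowed
equity_check = purchase_price - total_debt    # ~7.4 put up by Silver Lake and Michael Dell

cash_on_hand = 15.0                           # cash sitting on Dell's balance sheet
# If a one-time dividend out of that cash repaid the equity check,
# the buyers would be "playing with house money" thereafter:
leftover_cash = cash_on_hand - equity_check

print(total_debt, round(equity_check, 1), round(leftover_cash, 1))
```

This is exactly why the dividend scenario is so attractive: the company’s own cash pile is larger than the equity the buyers have to put in.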

                In addition I will draw your attention to the following facts in the first “Michael Dell: We are not a PC company anymore” section of my analysis:

                • John Swainson President of Dell Software Group was senior advisor to Silver Lake before he came to Dell a year ago to form this most essential unit for Dell’s long-term business strategy. His earlier role was to advise on value creation activities for Silver Lake’s portfolio companies. Prior to that he was CEO of the big software company Computer Associates (now CA Technologies) for five years, and before that worked for IBM Corp for more than 26 years, including seven years as general manager of the Application Integration Middleware Division, a business he founded in 1997. During that period, he and his team developed the WebSphere family of middleware products and the Eclipse open source tools project. He also led the IBM worldwide software sales organization.
                • Marius Haas, hired in August to lead the Enterprise Solutions Group (ESG), came from Kohlberg Kravis Roberts & Co. L.P. (KKR). KKR was the leader of the leveraged buyout boom of the 1980s. Its biggest LBO deal is still the biggest in history, and is well documented in both a book and a film, Barbarians at the Gate: The Fall of RJR Nabisco. Prior to KKR, Haas was senior vice president and worldwide general manager of the Hewlett-Packard (HP) Networking Division, and also served as senior vice president of Strategy and Corporate Development. Before that he worked in senior operations roles at Compaq and Intel Corporation.
                • Jai Menon became CTO of Dell’s Enterprise Solutions Group last August, but before that he was CTO and VP, Technical Strategy for IBM’s Systems and Technology Group (STG). … Jai joined IBM Research in 1982. He has made many contributions to the storage industry and to IBM in the areas of disk emulation, storage controllers, disk caching, storage networking, storage virtualization, file systems and RAID. He is one of the early RAID pioneers who helped create a technology that is now a $20B industry.

                With such a high level of private equity, leveraged buyout, and both business and technical strategy expertise on the Executive Leadership Team, as well as top enterprise technology leadership behind it, Michael Dell is best positioned to reap both immediate and ongoing financial benefits of unprecedented scale from taking Dell private. Some more information from the business media to support my statement:

                Inside Michael Dell’s World [The Wall Street Journal, Feb 5, 2013]

                … The buyout would give Mr. Dell the largest stake in the company, ensuring that the 47-year-old is the one who gets to oversee any changes. … As part of the deal to go private, Mr. Dell would contribute his nearly 16% stake valued at about $3.7 billion, plus $700 million from an investment firm he controls, the people said. Microsoft would invest about $2 billion in the form of a subordinated debenture, a less-risky investment than common stock. … Microsoft isn’t expected to get board seats or governance rights in a closely held Dell, one of the people said. Instead, the companies would tighten their relationship regarding use of Microsoft’s Windows software, the person said.

                Microsoft Loan Said to Help Dell While Avoiding Favorites [Bloomberg, Feb 5, 2013]

                Microsoft Corp. (MSFT) is using a $2 billion loan to help finance Dell Inc. (DELL)’s $24.4 billion buyout to bolster one of the largest makers of computers using Windows software and fend off competition from Google Inc. and Apple Inc.

                Steve Ballmer, Microsoft’s chief executive officer, discussed the loan with Dell founder and CEO Michael Dell, according to two people familiar with the negotiations. Microsoft opted for a loan rather than an equity investment to avoid rankling other personal-computer makers that use Windows, said one of the people, who asked not to be named because the matter isn’t public. …

                … Microsoft’s investment helps to support “the long term success of the entire PC ecosystem,” the company said in a statement. Peter Wootton, a spokesman for Microsoft, declined to comment beyond the statement.

                Microsoft won’t be involved in day-to-day operations, Dell Chief Financial Officer Brian Gladden said in an interview. …

                Michael Dell coughs up $750 million cash to buy out Dell [Reuters, Feb 6, 2013]

                Michael Dell and his investment firm are ponying up $750 million in cash toward the $24.4 billion purchase of Dell Inc to help bankroll the largest private equity-backed buyout since the financial crisis.

                The Dell founder and CEO this week struck a deal to take private the company he created out of a college dorm room in 1984, partnering with private equity house Silver Lake and Microsoft Corp.

                Michael Dell will contribute $500 million of his own cash, and MSDC Management – an affiliate of his investment vehicle, MSD Capital – will contribute another $250 million, according to a company filing on Wednesday.

                Dell Inc also said it is targeting the repatriation of $7.4 billion of cash now parked abroad to help finance the deal. That may dismay some shareholders, as a hefty tax is usually levied on cash brought back from overseas.

                The deal, which ends Dell’s rocky 24-year run on the Nasdaq just as the once-dominant PC maker struggles to revive growth, is contingent on approval by a majority of shareholders — excluding Michael Dell himself.

                Several shareholders, including prominent investor Frederick “Shad” Rowe of Greenbrier Partners, have spoken out against the deal, protesting a lack of specifics as well as a potential conflict of interest with Michael Dell being the company’s single largest shareholder with a roughly 16 percent stake.

                “Some shareholders are glad. But there are others who feel it’s a raw deal,” said Shaw Wu, an analyst with Sterne Agee, who has spoken with several Dell shareholders since the announcement but declined to provide further details.

                The company has not given many specifics on what it would do differently as a private entity, angering some shareholders who said they needed more information to determine whether the $13.65-a-share deal price – a 25 percent premium to Dell’s stock price before buyout talks leaked in January – was adequate.

                On Wednesday, an individual shareholder filed the first lawsuit, in Delaware, attempting to stop the buyout. The lawsuit – which is seeking class-action status – maintains that the $13.65 per share offered sharply underestimated the company’s long-term prospects.

                “By engaging in the going private transaction now, in the midst of the company’s transition from a PC vendor to a full-service software and enterprise solution provider, the board is allowing defendants M. Dell and Silver Lake to obtain Dell on the cheap,” read the lawsuit filed by Catherine Christner.

                Dell, the world’s No. 3 personal computer maker, broke down details of the equity and debt financing secured for the buyout in Wednesday’s filing.

                Silver Lake is putting up $1.4 billion, while banks including Bank of America, Barclays, Credit Suisse and RBC will provide roughly $16 billion in term loans and other forms of financing.

                Wednesday’s filing also disclosed that under certain circumstances if the merger cannot be completed, Michael Dell and Silver Lake could have to pay a termination fee of up to $750 million to the company.

                What Should Dell Shareholders Do? [Seeking Alpha, Feb 6, 2013]

                … let’s have a look at some balance sheet items. If the company was highly leveraged, things would be different and this price could make some kind of sense given the risk. But, if we look at the numbers, at the end of last quarter Dell had $11.2 billion in cash and equivalents, a long term debt of $5 billion and a total equity of $10.1 billion. In other words, a very healthy balance sheet.

                Putting things together, it’s very hard to recommend accepting the current offer. Unless you have another investment where you can put your money to work at a higher rate of return than you would by sticking with Dell (and with the safety of its balance sheet) I cannot recommend selling the shares at this price.

                Of course, Michael Dell and Silver Lake know the company is worth much more, and that’s why they are offering to take the company private.
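The balance-sheet arithmetic in that Seeking Alpha excerpt is simple enough to sketch (figures in $ billions, as quoted above; a rough illustration of why the sheet looks healthy, not a valuation):

```python
# End-of-quarter balance-sheet items quoted above, in $ billions.
cash_and_equivalents = 11.2
long_term_debt = 5.0
total_equity = 10.1

# Positive net cash means the company holds more cash than it owes
# long term -- the opposite of a leveraged, risky balance sheet.
net_cash = cash_and_equivalents - long_term_debt

print(round(net_cash, 1))  # ~6.2 of net cash against 10.1 of equity
```

With more than $6B of net cash backing roughly $10B of equity, the $13.65 offer effectively prices the operating business at a steep discount.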

                Unplugged: Why is Michael Dell buying back his company? [USA TODAY, Feb 5, 2013]

                … Because the 47-year-old CEO is already a billionaire, who has had scrapes with the Securities and Exchange Commission, critics contend that he has become adept at financial engineering and is simply sticking it to current shareholders to enrich himself yet even more. (The chairman and the company settled fraud allegations with the SEC in October 2010.)

                No doubt, Michael Dell is a capitalist. But I doubt his sole motivation is pure greed and a perverse joy in sticking it to shareholders, which include employees.

                Yet having met and interviewed Michael Dell on a number of occasions over the past decade, I think he is far more complex than a money-grubbing tech titan without heart or soul. In fact, I think he really cares about his legacy, the company and Austin. …

                MEANWHILE BELOW YOU CAN FIND A FEW “NO-CASH-SKIMMING” VIEWS OF THE PROPOSED DEAL:

                Channel: Happy, Worried [CRN, Feb 5, 2013]

                Solution providers see two sides to Dell’s privatization move.

                The first side is the opportunity for Dell to go through the painful transformation into an enterprise solution developer. Paul Clifford (pictured), president of Davenport Group, a St. Paul-based solution provider, said Dell should be able to accelerate its enterprise transformation without the eyes of Wall Street on them. “Dell is bringing us great products and support,” Clifford said. “If they go private, I think we’ll see more good stuff.”

                The second side is how Microsoft’s new relationship with Dell will impact the rest of the industry. Michael Goldstein, CEO of LAN Infotech, a Fort Lauderdale, Fla.-based solution provider, said such a close relationship between the two is a little scary. “Dell is Microsoft’s biggest reseller partner,” Goldstein said. “They’re hugely important. Seeing the two of them combined makes me a little nervous because we’re a smaller solution provider, and we don’t want to get lost in the mix if [the deal] does happen.”

                What Will We Learn From Dell Tomorrow? [Bloomberg YouTube channel, Feb 5, 2013]

                Feb. 4 (Bloomberg) — Today’s “BWest Byte” is 1, for how many more days until we find out what’s happening at Dell. Cory Johnson reports on Bloomberg Television’s “Bloomberg West.” (Source: Bloomberg)

                Dell Gets Hit Hard by Sluggish Worldwide PC Market [Bloomberg YouTube channel, Nov 16, 2012]

                Nov. 15 (Bloomberg) — Nicole Lapin reports on trouble at Dell. She speaks on Bloomberg Television’s “Bloomberg West.” (Source: Bloomberg)

                Dell and HP down for the count? [CNNMoney YouTube channel, Aug 22, 2012]

                Slow to find success in the realm of mobile, HP and Dell are caught in a downward slide with no apparent end in sight.


                Michael Dell:

                We are not a PC company anymore

                Michael Dell addresses Dell’s future [published on FortuneMagazineVideo YouTube channel, Jan 16, 2013; recorded on July 17, 2012]

                Michael Dell, Chairman and CEO, Dell, was interviewed by Fortune’s Andy Serwer at Brainstorm Tech in Aspen. They talked about the PC market, the enterprise, China, and Apple. He also announced a new $60M venture fund and said sales have slowed in China.

                Full transcript: Michael Dell addresses Dell’s future [Fortune, July 17, 2012]
                See also: Pre-Commerce and the Consumerization of IT [this same ‘Experiencing the Cloud’ blog, Sept 10, 2011]

                A sure sign of that “not a PC company anymore” statement came recently with
                Financial Reporting Change – Product and Service-based P&L by Robert L Williams [DellShares blog, Jan 10, 2013]

                In 2009, we charted our course to become a leading provider of end-to-end solutions. We’ve been executing our strategy with discipline and consistency ever since, investing for growth in the data center, software and services. Our Enterprise Solutions and Services business revenue was about $14 billion in FY08, and by Q3 FY13 we saw an annual run rate approaching $20 billion. We now have critical mass in these businesses, and we need a financial reporting structure that supports their growth and success. Today, in an 8-K filing, Dell announced that in the first quarter of fiscal 2014, which begins on February 2, 2013, it will replace its current global customer segment reporting structure with the following product and services groups:
                •  End User Computing (EUC), led by Jeff Clarke, vice chairman of operations and president Dell EUC, will include a wide variety of mobility, desktop, desktop virtualization, third-party software, and client-related services and peripheral products.
                •  Enterprise Solutions Group (ESG), led by Marius Haas, president Dell ESG, will include servers, networking, storage, and related peripherals products.
                •  Dell Services, led by Suresh Vaswani, president Dell Services, will include a broad range of IT and business services, including support and deployment services, infrastructure, cloud, and security services, and applications and business process services.
                •  Dell Software Group, led by John Swainson, president Dell Software will include systems management, security and business intelligence software offerings.
                Steve Felice, chief commercial officer, will continue to lead Dell’s global sales and marketing organizations.

                That was already well manifested at Dell World [2012] Influencer Panel Highlights – December 11, 2012 [DellVlog YouTube channel, Dec 11, 2012]

                Highlights from the Dell World Influencer Panel and Q&A with Michael Dell and Dell’s Executive Leadership Team held December 11, 2012 live from Austin, TX. Join the conversation on Twitter via #DellWorld.

                The “Dell wants to be more than your box provider” post from The Register [Dec 12, 2012] summarizes the above as:

                Solutions in hand – but supply your own drinks

                … Dell is dead serious about being a “solution provider” … – and it has to be, because as we all know the margins are in software and services.

                That’s why Steve Felice, Dell co-president and chief commercial officer, bragged that Dell had spent over $10bn in the past five years to acquire Perot Systems, Quest Software, Wyse Technologies, Scalent, Boomi, AppAssure, SonicWall, KACE, SecureWorks, and a slew of others to build out its portfolio of services and software.

                The executive roundtable was a way to introduce some of the new faces of Dell to customers and partners, with just about everybody but Dell, the man, and [Steve] Felice [Dell co-president and chief commercial officer], who joined Dell in 1999 from third-party tech support firm DecisionOne, and Jeff Clarke, vice chairman and co-president in charge of global operations and end user computing, being the old Dell hands.
                Marius Haas, president of the cross-group Enterprise Solutions (gulp!) group, just came aboard this year after a short stint at private equity firm KKR and a long career at rival HP. John Swainson, who runs Dell’s Software Group, is a long-time IBMer who turned CA Technologies around. After the surprise resignation last week of long-time EDS executive Steve Schuckenbrock, who has been at Dell since 2007 and who has run its Services and then its Large Enterprise groups, Suresh Vaswani is the new president of the Services group and was formerly in charge of Dell’s Indian services group; before that, he was the co-CEO at Indian services giant Wipro. The consensus on the street seems to be that Schuckenbrock wants to be a CEO, and it ain’t gonna happen at Dell. (There could be some openings up at HP.)
                The opening of Dell World was also a way to toss out some more statistics. Dell says that it has presence at 95 per cent of the Fortune 500, and that more than 10 million small and medium businesses rely on its solutions (gulp!) and services (okay, new rule, when Dell says services, you have to pay the person to your right $5.) Dell also has something on the order of 115,000 partners, with about 650 of them showing up at Dell World to get the inside track.
                The execs were also put on the spot to answer questions, and Dell, the man, was asked about what he thought about the future of the PC business, something on the minds of both HP and Dell these days and not something that IBM is worried about much these days. (IBM is more worried about the future of systems and services, and it will have its own issues here, fear not.)
                “We spend a lot of time talking about this and working and working on it together,” Dell said, referring to his collaboration with Clarke. “We’re quite optimistic about Windows 8. You’re going to hear over the next few days about a broad set of products. Think about a product like Latitude 10, which is a thin, light tablet that also docks to become a full workstation – totally secure, works with all of the other Windows things that a customer has, runs Microsoft Office, and has a USB port, and so on.
                “That’s the kind of product that really excites our customers and helps address some of the challenges that exist. We think the touch experience is incredible. We have this stunning 27-inch, quad HD display with our XPS 27 all-in-one. We think we are seeing a real revolution in the PC.”
                Clarke was more adamant: “We still believe that the PC is still the preferred device to do work, to drive productivity, to create. I look at the long-term prospects of the PC business and I am very optimistic; 85 per cent of the world’s population has a PC penetration rate of less than 20 per cent. I look at the middle class as it grows over the next 20 years from 1.8 billion people to 4.9 billion people, and I see the opportunity there. I look at the number of small businesses that we sell to today, and the creation of small businesses continues at an unprecedented rate and serving that with PCs is still a huge opportunity for the company.”

                One of the big events at Dell World on Wednesday, which Felice hinted at, would be a partnership with the Clinton Foundation, the organ of former president Bill Clinton, to help spur the growth of small businesses. (I doubt they talk about solutions much.)

                The real issue, explained Dell, was moving from selling individual point products to standing up combinations of servers, storage, networking, PCs, software, and services to solve a particular problem. This is precisely what every major systems player is trying to do, and the big independent OS suppliers (Microsoft and Red Hat) as well, who treat x86 iron the same way they treat electricity: as a given and not worth much consideration or profits.

                The company issued the following press releases to clarify everything:
                Dell Investment in Enterprise Solutions and Services Gives Customers Worldwide the Power to Do More [Dell press release, Dec 11, 2012], an important excerpt to add to the above

                Strategy, Execution and Progress
                Dell’s long-term strategy is grounded upon helping IT organizations more rapidly respond to business demands, improve efficiency and capitalize on new, standard-based technologies. Dell is successfully executing on its long-term strategy, including key acquisitions of Wyse, SonicWall and Quest Software in 2012, while growth in its Enterprise Solutions and Services businesses continues to outpace its competitors.

                • Dell’s server and networking business grew 11 percent in the 3rd quarter, representing the 12th consecutive quarter of growth.
                • Dell’s server business grew revenue 4 percent in the 3rd quarter, and was the only provider among the top three to achieve positive unit growth, while other providers lost share.
                • Dell’s storage business (Dell-branded storage) grew at twice the rate of a major competitor and continues to outpace other providers, many of which reported declining revenue.

                Dell Enterprise Solutions and Services now represent one-third of the company’s revenue and half of its gross margin. These businesses, which were about $14 billion in FY08, are on an annual run-rate approaching $20 billion through the 3rd fiscal quarter, up 4 percent from the previous year. Dell is making solid progress in executing its strategy and continues to add capabilities valued by customers.

                Dell Backs Growing Businesses With Scalable Technology Solutions, Resources and Capital to Fuel Job Creation, Economic Growth Worldwide [Dell press release, Dec 11, 2012], an important excerpt to add to the above

                Dell today announced a renewed commitment to accelerate growth of small and midsize companies with scalable technology solutions, resources for entrepreneurs, and a new partnership with Clinton Global Initiative designed for next generation business founders.

                “Fast-growing entrepreneurial companies are an important catalyst for global economic recovery and job creation,” said Michael Dell, Chairman and CEO of Dell. “At Dell, we’re delivering agile, efficient and powerful solutions to help entrepreneurs succeed today, scale quickly and have their ventures grow as big as their dreams and ambitions.”

                Dell started communicating this change heavily about a year and a half ago, as evidenced by the My Take on Dell’s Solutions Strategy post by Lionel Menchaca, Chief Blogger [Direct2Dell blog, June 13, 2011]. Further communication since then came in the following posts:
                My Thoughts on Dell’s Analyst Meeting by Lionel Menchaca, Chief Blogger [Direct2Dell blog, July 5, 2011]
                I see a mixed data center environment in your future by Praveen Asthana [Direct2Dell blog, Dec 15, 2011]
                Enterprise Solutions and Services Strength Highlight Dell’s FY2012 Results by Lionel Menchaca, Chief Blogger [Direct2Dell blog, Feb 21, 2012]
                Business Intelligence for the Mid-Market by Vickie Farrell [Direct2Dell blog, Feb 27, 2012]
                New Dell Appliance Makes Data Warehouses Simple and Affordable by Vickie Farrell [Direct2Dell blog, July 11, 2012]
                How Dell Helped Financial Grow by Scott Schram [Direct2Dell blog, May 21, 2012]

                In my prior role with Dell I was part of the SMB business transformation team charged with integrating M&A acquisition solutions including KACE, Boomi, Compellent, SecureWorks and Force10 Networks into the core business. So when I moved into my new role with our Commercial Verticals organization focused on the Financial Services industry, I was anxious to observe firsthand how this newly acquired Dell IP was meeting customer needs. It didn’t take long.

                Dell announces the completion of its acquisition of Make Technologies by Suresh Vaswani, Chairman–Dell India [Direct2Dell blog, May 24, 2012]
                The NHS Information Strategy and Information-Driven Healthcare by Andrew Jackson [Direct2Dell blog, May 29, 2012]
                Dell AppAssure takes you beyond backup by Zorian Rotenberg [Direct2Dell blog, June 12, 2012]

                It’s been a little over four months since Dell acquired AppAssure, and we’ve settled right into the Dell family. Today at the Dell Storage Forum in Boston, Darren Thomas announced the first new Dell AppAssure release – Dell AppAssure 5 – designed to allow customers to achieve higher levels of scale, speed and efficiency for backups of big data sets.

                Mid-size organizations can gain first-mover advantages with desktop virtualization by Brent Doncaster [Direct2Dell blog, June 13, 2012]

                Watch how DVS Simplified offers a simple, easy-to-deploy and operate VDI appliance that delivers traditional desktop virtualization benefits in an all-in-one package. Learn more at: http://lt.dell.com/lt/lt.aspx?CID=823…

                Start virtualizing desktops with DVS Simplified DaaS – a cloud-based solution for desktop virtualization by Janet Diaz Solutions Communications Manager, Desktop Virtualization Solutions – End User Computing at Dell [Inside Enterprise IT blog from Dell, June 22, 2012]

                DVS Simplified DaaS delivers full-featured virtual desktops from Dell’s state-of-the-art data centers, powered by Desktone’s industry-leading, secure, multi-tenant DaaS platform. DVS Simplified DaaS is ideal for organizations that want a cloud-based virtual desktop infrastructure (VDI) solution, simple onboarding and management (deployment takes only a few days and can include a proof of concept), a low set-up cost with monthly subscription-based pricing, and the flexibility to scale from a few seats to thousands of seats.
                DVS Simplified DaaS provides organizations of all sizes – SMBs, large enterprises and public sector entities – the ability to quickly deploy a VDI solution to address a variety of business imperatives. Picture workers in industries such as healthcare, insurance, construction, etc. using different devices to connect to their desktops while in the field. Or picture a company needing to quickly provision hundreds of desktops for an incoming class of interns (and also needing to redeploy these desktops at the end of the internship program). Or think of an organization that has a few employees on a different continent but does not want to invest in data centers and IT resources there. DVS Simplified DaaS can be the right solution in each of these cases.

                Knock Down the Barriers to Desktop Virtualization by Ann Newman, a technology writer, blogger and editor for Digital Online Marketing at Dell with specialties in BYOD, desktop virtualization, Windows 8 and other high-technology topics. Follow Ann on Twitter at @DellWebWoman [DellWorld 2012 blog, Oct 12, 2012]

                In today’s business environments, where BYOD (bring your own device) is becoming a fact of life, desktop virtualization is becoming a must-have. Don’t let the old barriers hold you back.

                Winning the data center by Paul Shaffer [Direct2Dell blog, June 18, 2012]
                Dell’s Enterprise Solutions Strategy Will Drive Company’s Long-Term Growth [Dell press release, July 13, 2012]

                “Through strategic acquisitions and organic growth, we are creating innovative solutions that provide more value and competitive edge for our customers,” Michael Dell, chairman and CEO, told stockholders. …
                Mr. Dell and Brian Gladden, Dell CFO, outlined the steps taken by the company to establish Dell as a full-service solutions company, and how the company’s business has shifted, with enterprise solutions and services accounting for 50 percent of its gross margin in the first quarter of fiscal year 2013. Among those actions was the formation earlier this year of a Software Group to add to Dell’s enterprise solutions capability, accelerate strategic growth and further differentiate the company from competitors with standards-based, scalable and flexible Dell-owned intellectual property.
                Dell is building its software portfolio in part through strategic acquisitions. The company recently announced a definitive agreement for Dell to acquire Quest Software, an award-winning IT management software provider offering a broad selection of IT solutions. The Quest acquisition is expected to be completed in Dell’s fiscal third quarter. Dell has made eight acquisitions in the last 12 months and 16 in the past two years.

                Dell Software Leadership Team Event #DellSoftware by Sarah Richardson Luden [Direct2Dell blog, July 19, 2012]

                Dell’s software organization leverages the strength of existing Dell software assets, as well as those obtained through organic and acquisitive growth, to better provide our customers with competitively differentiated hardware, software and services solutions. Dell recently announced its intent to acquire Quest, an IT management software provider, which extends Dell’s existing capabilities in security and systems and data management.
                Dell Software will initially focus on these four core areas:

                Dell CloudExpo Keynote Presentation from Kevin Hanes, Executive Director of Dell Services by Stephen Spector [#DellSolves blog, June 14, 2012] about Dell’s solution-oriented approach to cloud computing, which meets the challenge any organization faces in evolving: adopting new architectures and processes that increase business agility, scalability and governance/compliance while decreasing risk.
                Dell Cloud Client Computing launches public beta of Project Stratus by Allison Darin [Direct2Dell blog, Aug 27, 2012]

                Project Stratus is a comprehensive cloud-based management console that is geared at helping enterprises thrive in a world of “Consumerized IT” where corporate and consumer technologies intermingle. It empowers employees with the highest productivity and the best user experience, while giving IT organizations the required control to allow them to welcome employee owned devices into the enterprise. Through its unified, cloud-based console, IT administrators will be able to securely manage user devices as well as deliver applications and services to their users across a variety of scenarios; in office, mobile and remote, corporate owned and managed, user owned and self-service.
                “As the BYOD trend expands the private or public cloud access paradigm beyond PCs to include mobile devices of all types, and organizations start to adopt other consumer technologies like apps, we see IT needing the ability to rapidly adapt and embrace new end user service delivery models,” says Hector Angulo, Product Manager for Project Stratus at Dell. “Project Stratus was designed to provide this agility in a simple, secure and cost-effective package – if IT needs to manage end user devices, they can; if all they care about is managing how corporate data and apps are delivered regardless of device, it supports that too.”

                Data Center Evolution by Scott Herold [Direct2Dell blog, Sept 6, 2012]
                Powering the Possible in Smart Grid by David Lear, Executive Director—Sustainability [Direct2Dell blog, Oct 3, 2012]
                Building a Practical Foundation for Big Data Transformation by John Igoe [Direct2Dell blog, Oct 3, 2012]
                My New Role as CTO of Dell’s Enterprise Solutions Group by Jai Menon, the former CTO of IBM Systems and Technology Group [Direct2Dell blog, Oct 10, 2012]
                Executing BYOD programs by Rafael Colorado Marketing Director, Desktop Virtualization Solutions [Inside Enterprise IT blog from Dell, Oct 10, 2012]

                Let’s start with a common use case of an enterprise customer enabling remote and internal employees to access company resources through various devices and provide more than simple e-mail; they need access to a variety of corporate applications.
                The first variable to consider, Device Management, ensures that governance and policies are applied to all end points. Dell KACE offers a practical device management solution deployed as an appliance or SaaS offering. Additionally, Dell can provide BYOD consulting for organizations that need a more customized solution.
                The second variable, Secure Data, is mission critical because it safeguards the integrity of corporate information. Dell’s SonicWALL ensures secure access to intranet resources with secure SSL/VPN technology to manage encryption across all corporately-managed mobile devices. For a higher level of enhanced security Dell SecureWorks can be added to account for threat management.
                The third variable, Develop and Modernize Applications, helps organizations optimize applications for deployment into BYOD environments. Dell offers AppDev services that provide image optimization and application rationalization services. With PocketCloud, Dell also offers a comprehensive application delivery solution to remotely connect to your desktop with your iOS or Android device. Here’s a quick video on PocketCloud:
                The expanded Wyse PocketCloud family fuses streaming apps and data with search, file management and sharing across personal devices delivering content management from the cloud.
                Finally, Infrastructure Optimization is the variable over which my team, Dell Wyse, has the most influence. Infrastructure Optimization is about providing the backend infrastructure to host and manage your desktops and applications by centralizing data and applications in the cloud or the data center. Dell Desktop Virtualization Solutions (DVS) provides the datacenter infrastructure, including preconfigured networking equipment, storage, and Dell 12G servers to accelerate the adoption of VDI and application virtualization. DVS also offers virtual desktops in Simplified or Enterprise “as-a-service” configurations where virtual desktops are hosted and managed in the Dell Cloud. Finally, DVS offers an assortment of services to help you asses, plan, and roll-out desktop virtualization deployments.
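                The four planning variables above amount to a coverage checklist for a BYOD program. A minimal sketch of that idea, assuming hypothetical names for the variables and the offerings mapped to each (the mapping follows the post, not any real Dell API):

```python
# Hypothetical sketch of the four BYOD planning variables described above,
# each mapped to the offerings the post names for it. Illustrative only.

BYOD_VARIABLES = {
    "Device Management": ["Dell KACE (appliance or SaaS)", "BYOD consulting"],
    "Secure Data": ["Dell SonicWALL SSL/VPN", "Dell SecureWorks threat management"],
    "Develop and Modernize Applications": ["AppDev services", "Wyse PocketCloud"],
    "Infrastructure Optimization": ["Dell DVS (Simplified/Enterprise)", "DVS Simplified DaaS"],
}

def coverage_gaps(deployed):
    """Return the planning variables not yet addressed by any deployed offering."""
    return [variable for variable, offerings in BYOD_VARIABLES.items()
            if not any(offering in deployed for offering in offerings)]

# Example: an organization that has rolled out KACE and SonicWALL so far
gaps = coverage_gaps({"Dell KACE (appliance or SaaS)", "Dell SonicWALL SSL/VPN"})
```

                The point of the structure is simply that a BYOD rollout is complete only when every one of the four variables is covered by at least one solution, however those solutions are sourced.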

                Dell’s Desktop Virtualization Strategy from Citrix Synergy 2012 [DellTechCenter YouTube channel, June 6, 2012]: [1:10] We are the only company that can offer an appliance, a VDI appliance [(DVS) Simplified appliance]. Nobody else has that. [1:19]

                Rafael Colorado from Dell talks about Dell’s Desktop Virtualization Strategy from Citrix Synergy 2012 in San Francisco.

                Feeling the Energy at Synergy by Janet Diaz Solutions Communications Manager, Desktop Virtualization Solutions – End User Computing at Dell [Inside Enterprise IT blog from Dell, May 10, 2012]

                After viewing a live demo of our Dell Desktop Virtualization Solutions (DVS) Simplified appliance featuring Citrix VDI in a Box software coupled with a Wyse zero client in action, testing out our DVS Simplified Desktop as a Service (DaaS), or seeing how our Dell Virtual Labs solution is purpose-built to solve the specific IT problems in the education field, our customers came away impressed that Dell’s transformation into a solutions-focused company is gaining major traction.
                As part of the DVS Simplified demo, we are also excited to be showcasing Dell’s partnerships with both Citrix and Wyse, which gives our customers a truly end to end VDI solution that is easy to buy, easy to deploy, easy to manage and easy to scale. Dell worked closely with Citrix to develop DVS Simplified, incorporating Citrix’s VDI-in-a-Box, to deliver VDI as an appliance. By adding Wyse to the partnership, Dell can now deliver a wide array of plug-and-play, automatically managed thin clients to further extend that simplicity to the end points. We are very excited to be demonstrating this end to end solution in our booth for all Synergy attendees to see first-hand.

                What the new release of [Citrix] VDI-in-a-Box 5.2 means to you by Rafael Colorado Marketing Director, Desktop Virtualization Solutions [Inside Enterprise IT blog from Dell, Oct 18, 2012]
                – see also: Accelerating desktop virtualization gains [Dell Power Solutions, 2012 Issue 2, May 16, 2012] discussing the issues which lead to the creation of Dell desktop virtualization portfolio of end-to-end solutions—available in Simplified and Enterprise segments—in order to effectively address the diversity of organizations
                – see also: Thin/Zero Client and Virtual Desktop Futures [this same ‘Experiencing the Cloud’ blog, May 30, 2012]
                BYOD: A Love Story by Ann Newman, a technology writer, blogger and editor for Digital Online Marketing at Dell with specialties in BYOD, desktop virtualization, Windows 8 and other high-technology topics. Follow Ann on Twitter at @DellWebWoman  [DellWorld 2012 blog, Oct 26, 2012]

                At Dell, over 15,000 employees use their iOS®-, Android™- and Windows®-based devices at work, worldwide. The company is thriving because the BYOD strategy is built on a solid foundation of mobile device management, application modernization and end-to-end security and networking IT.

                Dell Cloud Client Computing Solutions Support Citrix HDX 3D by Dan O’Farrell Director of Product Marketing, Dell Wyse [Direct2Dell blog, Oct 17, 2012]

                Dell Wyse Cloud Client Manager Eases Consumerization of IT and BYOD Challenges by Rami Karam Product Marketing Manager, Dell Cloud Client Computing [Direct2Dell blog, Nov 7, 2012]
                Release of Dell Quickstart Data Warehouse 2000 Hits Sweet Spot for Mid Market by Matt Wolken [Direct2Dell blog, Oct 17, 2012]
                Unveiling Dell’s next generation converged infrastructure solutions — Active System 800 by Ganesh Padmanabhan [Direct2Dell blog, Oct 18, 2012]
                Converged Infrastructure without the Compromise: Introducing Dell Active Infrastructure and Dell Active System by Dario Zamarian [Direct2Dell blog, Oct 18, 2012]
                Dell developed and acquired IP converge in Active System by Ben Tao [Direct2Dell blog, Oct 22, 2012]
                Taking a more “Active” approach to delivering applications and IT services by Marc Stitt [Direct2Dell blog, Oct 25, 2012]
                One Million Reasons to Celebrate – DCS [Dell Data Center Solutions] Ships its One Millionth Server by Tracy Davis, VP/ GM—Dell DCS Team [Direct2Dell blog, Oct 30, 2012]
                Dell and SAP Hana, or how organizations can harness the power of in memory databases and analytics with joint solutions from Dell and SAP, by Kay Somers discussing with Mike Lampa, Global Practice Lead for Dell Services Business Intelligence practice and Jeffrey Word, Vice President of Product Strategy at SAP on Direct2Dell blog:

                Part 1, Oct 30: about in memory databases, SAP HANA and how it can dramatically alter organization responsiveness and performance … the capabilities and performance of the SAP HANA platform.

                Part 2, Nov 5: the various ways to add SAP HANA to your database and analytics environment

                Part 3, Nov 11: building the business case for an SAP HANA installation or migration

                Dell Speeds Path to SAP HANA with New Service Offerings in Europe by Andreas Stein [Direct2Dell blog, Nov 12, 2012]
                The Year of the Virtual Desktop- really! by Eric Selken [Direct2Dell blog, Oct 31, 2012]
                Dell Services Introduces New Microsoft Dynamics Solution for Manufacturers by M J Gauthier [Direct2Dell blog, Nov 6, 2012]

                Our manufacturing customers will benefit from the best practices Dell learned from implementing Microsoft Dynamics AX in its own manufacturing supply chain in 2010. Dell’s own implementation generated a 75% reduction in factory IT footprint, 50% reduction in server downtime and a 40% decrease in the IT cost of goods.

                What you may not know about Dell SonicWALL by John van Son [Direct2Dell blog, Nov 13, 2012]
                Dell Acquires Gale Technologies, a Leading Provider of Infrastructure Automation Solutions to help accelerate the momentum of Dell’s converged infrastructure family, Active Infrastructure [Dell press release, Nov 16, 2012]
                Enterprise Business Momentum and Major Milestones by Jai Menon CTO of Dell’s Enterprise Solutions Group [Inside Enterprise IT blog from Dell, Dec 3, 2012]
                Project RIPTide: Business Analytics meets innovation at Dell by Shree Dandekar Director BI Strategy [Direct2Dell blog, Dec 21, 2012]

                Real-time analytics solution for midsized customers is enabled by Dell Boomi and real-time business intelligence capabilities

                Imagine a midsized company collecting data in real time from different sources. Of course they’ll want to convert this data into meaningful insights to improve their business, also in real time. There’s a catch, though: they don’t have the IT resources or, necessarily, the expertise to extract those meaningful insights, much less in real time or in plain English.
                Sounds like the right kind of challenge to tackle for Dell’s incubation program.
                With RIPTide, we designed a solution that can assemble relevant data sets (structured and unstructured) on-the-fly, using real-time data integration enabled by Dell Boomi and real-time business intelligence capabilities for reports, dashboards, analytics, and services for easy deployment.
                And it gets even better. This solution simply scales – it can be delivered on a laptop, a server, or an enterprise class platform depending on the customer’s size and needs. A customer also has the option to start off with the Dell Quickstart Data Warehouse and then build the solution on top of it. As part of this project, we’re also exploring offering this capability as a service for customers to use within their private cloud environment, using Dell managed services.
                We wanted to help customers simplify interpretation of their data – ask a question, get an answer. What is my sales pipeline in real-time? What is my account status with a given customer? What are they saying about me in social media? What does my retail stock look like? Is my fall collection trending on Pinterest?
                We put our project to task, just in time for the two major shopping days of the year – Black Friday and Cyber Monday – with Team Express, a San Antonio-based sporting goods retailer with a small IT staff responsible for maintaining their legacy SQL-based transaction system as well as reporting on daily business activities. Team Express, just like other midsized companies, is challenged with assembling data from various sources, including Salesforce.com and their legacy transaction system, to glean actionable business insights, quickly and easily.

                With the RIPTide solution running on a PowerEdge R720xd 12th generation server, Team Express is now able to capture key business metrics along with new insights, including:

                • Top-performing products by region, customer, and revenue
                • Close-rate per salesperson
                • Sales team productivity
                • Opportunity and lead conversion rates
                Here’s what Brian Garcia, CIO of Team Express … has to say about his experience with this project, “This solution will transform the way almost all of our departments think about how our business is behaving. Now we can see more, we can do more and we will get more with less effort.”
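                Once the data sources are integrated, metrics like the first one above reduce to straightforward aggregations over the merged records. A minimal sketch of “top-performing products by region”, assuming hypothetical field names (this is not the actual RIPTide implementation):

```python
# Hypothetical sketch: computing "top-performing products by region" over
# records merged from multiple sources (e.g. a CRM plus a legacy SQL-based
# transaction system). Field names are illustrative, not from RIPTide.
from collections import defaultdict

def top_products_by_region(sales):
    """For each region, return the (product, revenue) pair with the highest total revenue."""
    totals = defaultdict(float)  # (region, product) -> summed revenue
    for rec in sales:
        totals[(rec["region"], rec["product"])] += rec["revenue"]
    best = {}
    for (region, product), revenue in totals.items():
        if region not in best or revenue > best[region][1]:
            best[region] = (product, revenue)
    return best

# Example over a handful of merged records
sales = [
    {"region": "South", "product": "bats", "revenue": 1200.0},
    {"region": "South", "product": "gloves", "revenue": 900.0},
    {"region": "West", "product": "gloves", "revenue": 400.0},
]
leaders = top_products_by_region(sales)
```

                The hard part of such a solution is not the aggregation itself but the real-time integration feeding it, which is where Dell Boomi comes in according to the post.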

                Dell Retail Announces Industry-Leading Solution to Help Retailers Move to the Cloud by Mike Adams [Direct2Dell blog, Jan 14, 2013]
                2012 – The Channel Perspective by James Wright EMEA Channel Marketing Director at Dell Europe [Direct2Dell blog, Dec 21, 2012]

                It’s almost five years since we started selling through the channel in Europe with Dell PartnerDirect, and it’s safe to say that, while the previous four years were headline years, 2012 has also been outstanding for both Dell and our partners; I want to talk about some of the great highlights that have come out of the Dell PartnerDirect program this year.  Three things really stick out for me – more partners (and more partners growing their Dell business), our continued move from pure PC sales to a far more comprehensive solutions offering for partners and customers, and a steady stream of acquisitions helping to build out our end-to-end solutions portfolio.

                • More than half of Dell’s European sales now go through indirect channels. We’ve now got over 900 Certified Partners in Western Europe. Many are seeing their Dell businesses growing by 30 per cent or more. Now, growth is nothing without volume, but this shows that you can use Dell to survive and thrive in your business despite the current economic climate.
                • We’re building far more complex, integrated solutions. Both server and networking businesses within Dell grew by 14 per cent in Q2. A third of Dell’s revenue, and over half of our profit comes from data centre solutions. In fact, we’re the only major computer vendor to increase server sales in the third quarter, according to both Gartner and IDC. We’re also seeing revenue growth year-on-year in this market. Let’s not forget about the other areas, too. Storage is a big deal for us – and the latest European event proved that it’s a big deal for the channel, too.
                • Thirdly (and this is linked to the point above), we’re acquiring organizations that give us – and our partners – significantly more scope, breadth and reach. Here’s a quick run-down for 2012. While it’s worth understanding what each business does, that is less important than understanding the bigger picture – what we are building in conjunction with partners:
                  • Quest – scalable systems management, security, data protection and workplace management.
                  • AppAssure – streamlined datacentre operations with backup and recovery software.
                  • Wyse – client cloud computing. See our earlier blog on what this means for partners here.
                  • SonicWALL – network security and data protection – and one of the most recognised firewall and unified threat management brands in the business.
                What of next year? If anything, it’s likely to be just as eventful for the industry as this and previous years. From my perspective, I’m looking forward to carrying on the great work we began five years ago with our partners; we’ve come an awful long way, but there are also plenty of great places we can go to. One thing I do know: it’s never going to be dull. Here’s to a fantastic, profitable 2013!

                Interview Marius Haas, Dell, about its enterprise strategy [Marco van der Hoeven YouTube channel, Feb 6, 2013]

                Witold Kepinski, editor in chief of Dutch IT Channel, speaks with Marius Haas, president, Enterprise Solutions, at Dell Technology Camp 2013, Amsterdam.

                Marius A. Haas [Dell Executive Leadership Team]

                Marius Haas serves as president, Enterprise Solutions, for Dell. In this role, he is responsible for worldwide engineering, design, development and marketing of Dell enterprise products, including servers, networking and storage systems.
                Marius came to Dell in 2012 from Kohlberg Kravis Roberts & Co. L.P. (KKR) [the leader of the leveraged buyout boom of the 1980s with its biggest LBO deal, still the biggest one in the history of mankind, well documented in both a book and a film Barbarians at the Gate: The Fall of RJR Nabisco] where he was responsible for identifying and pursuing new investments, particularly in the technology sector, while also supporting existing portfolio companies with operational expertise. Prior to KKR, Marius was senior vice president and worldwide general manager of the Hewlett-Packard (HP) Networking Division, and also served as senior vice president of Strategy and Corporate Development. During his tenure at HP, Marius led initiatives to improve efficiency and drive growth, including the execution and integration of all acquisitions, and he also managed the company’s strategic planning process, new business incubation and strategic alliances.
                Earlier in his career, Marius held a wide range of senior operations roles at Compaq and Intel Corporation. He also served as a member of the McKinsey & Company CSO Council, the Ernst & Young Corporate Development Leadership Network and as a board member of the Association of Strategic Alliance Professionals.
                Marius has a bachelor’s degree from Georgetown University and a master’s degree in International Management from the American Graduate School of Integration Management (Thunderbird) in Glendale, Arizona.

                Dell sets out enterprise solutions strategy [Tech Central, Feb 4, 2013]

                New software group integrates acquisitions to offer end-to-end solutions

                Dell has set out its strategy to offer end to end enterprise solutions.
                At the Technology Camp 2013 in Amsterdam, Tom Kendra, vice president and general manager of the newly formed Dell Software Group, said the company was “steadily executing the strategy of becoming a full service solution provider to enterprise”.
                Software is the next step in Dell’s evolution, said Kendra in a presentation. Leveraging its core strengths, Dell will provide solutions in the client, services and enterprise spaces, with an emphasis on adding value, differentiation and a focus on growth.
                “Software’s intersection with our core strengths, combined with disruptive market trends, allow us to create relevant solutions for today’s, and tomorrow’s, challenges,” said Kendra.
                Under the headings of data centre and cloud management, information management and mobile workforce, Dell will provide software solutions in Windows Server management, performance monitoring, virtualisation management, data protection and management, application and data integration, business analytics and intelligence, bring your own device (BYOD) and endpoint management.
                The newly formed software group brings together elements from Dell’s recent acquisitions, Kace, SecureWorks, SonicWall, Quest, Gale and Wyse.
                A “tough, rapidly changing market fosters transformation,” said Aongus Hegarty, president, Dell EMEA. “All these capabilities from the acquisitions are coming together to form integrated strategies.”
                Hegarty said that Dell is now established as a key player in enterprise technology, as it boasts more than $1.5 billion (€1.1 billion) in software revenue, a 6,000 member software team, of which some 1,600 are engineers, with a 2 million user community from 100,000 customers.
                Kendra cited an EMA Radar report that classed Boomi as a value leader for cloud integration, an NSS Labs highest overall protection award for SonicWall and 9 software Magic Quadrant appearances from Gartner.
                “Customers are asking for end to end solutions, right from SME to mid-market and enterprise,” said Hegarty.
                Dell has clearly stated a position of open standards for its solutions. Stephen Davies, Services Solutions Group EMEA, Dell, said that its cloud offerings would be based on OpenStack. With the aim of protecting customers from vendor lock-in, the approach allows for elements of any solution to come from other vendors or providers, without any loss of capability or performance. Where a customer may have a significant investment in one area, Dell’s approach would be to have its solutions work wherever possible with existing implementations.
                Dell launched two new offerings as part of integrated enterprise strategy, Active System Manager 7.0 and new workload solutions optimised for the SAP HANA platform.
                Active System Manager 7.0 is based on Gale Technologies applications and extends the management capabilities of Active System beyond the physical infrastructure to the virtualised infrastructure and workloads. It will be embedded into an Active System 800 and its associated reference architecture.
                Dell has said that it has certified the first of its server, storage and networking technologies in its pre-integrated systems to run SAP HANA. The systems are high-availability configurations that scale from 1 terabyte to more than 4 terabytes and are based on the same architecture found in its single-server appliances.
                For full product details see the February issue of ComputerScope, available 8 February.

                What Dell Is Doing Today [VideoLifeWorld YouTube channel, Feb 6, 2013]

                Dell Tech Camp 2013 – Tom Kendra VP & GM SW Group at Dell – Key Themes For What Dell Is Doing Today. Dell’s latest technologies and solutions that address customer issues and challenges around Cloud Computing, Data Insights, Mobility and Converged Infrastructure. Video by Dell’s official Flickr page http://www.flickr.com/photos/dellphotos/8450786781/ creativecommons.org/licenses/by/2.0/deed.en

                Dell Acquisition Strategy [DellVlog YouTube channel, Oct 25, 2012]

                Dave Johnson VP of Strategy demonstrates how Dell’s recent acquisitions all fit together

                Conversation with John Swainson, President of Dell’s Software Group [DellVlog YouTube channel, Oct 2, 2012]

                On Friday September 28, 2012, Dell announced that we completed the acquisition of Quest Software, an award-winning IT management software provider offering a broad selection of solutions that solve the most common and most challenging IT problems. John Swainson, President of Dell’s Software Group joined us on DellShares to discuss the importance of Quest to Dell’s Software strategy. We invite you to listen to John as he provides perspective on the following: • Quest fit within Dell’s Software strategy • Synergies between Quest portfolio and existing Dell solutions • Platform nature of Quest acquisition and what that means Thanks and we look forward to your thoughts and feedback.

                Dell Completes Acquisition of Quest Software by Tom Kendra [Direct2Dell blog, Sept 28, 2012]

                If you haven’t already heard, I am excited to announce that Dell has completed the acquisition of Quest. This is an important acquisition for Dell Software because Quest helps extend our capabilities in systems management, security and business intelligence software, and it also strengthens our ability to bring industry-leading, differentiated, and easy to manage solutions to our customers around the globe.
                With Quest, Dell will be able to deliver a broad selection of software solutions that will help simplify and solve our customers’ everyday problems and tackle their most challenging IT needs. Quest also brings with it critical mass and key talent. Quest currently has more than 100,000 customers worldwide, 5,000 partners worldwide, 1,500 sales and marketing resources, and 1,300 software engineers. As a relatively young and growing organization, these resources are invaluable to the Dell Software Group.
                The acquisition of Quest is a critical step forward for Dell Software because, with Quest, Dell is better able to provide end-to-end solutions that help our customers simplify their operations, maximize workforce productivity, and deliver results faster. Quest supports heterogeneous and next-generation virtualized and cloud environments which is complementary to Dell’s design approach to develop solutions that scale with our customers’ needs. But most importantly, Quest’s software solutions and key technologies are strongly aligned with Dell’s software strategy to expand, enhance and simplify our capabilities and enterprise solutions in four focus areas: Systems Management, Security, Business Intelligence and Applications.
                Quest will be joining other Dell Software assets Dell KACE, Dell SonicWALL, Boomi, Dell Cloud Business Applications and AppAssure as part of the Dell Software Group. Dell Software helps customers of every size take advantage of new technologies and address organizational challenges to grow their businesses and remain competitive. For more than a decade, Dell has been making strategic software acquisitions and partnering in the industry to support and enable the hardware and services solutions that we provide to our customers.  Our Software Group, now including Quest, will continue to extend Dell’s capabilities in software IP and total solutions offerings, and draw on the strength of Dell’s distribution capabilities and reputation to help clients in every industry achieve better business outcomes.
                Please join me in welcoming Quest to Dell Software, and I look forward to the many opportunities we will have to demonstrate that Quest and Dell are truly “Better Together.”
                For more information about Quest software, go to: www.dell.com/quest

                Software strategy and innovation related excerpts from Cover story: Piloting innovation [Dell Power Solutions Magazine 2012 Issue 4, Dec 7, 2012] the executive Q&A by John Swainson

                make the cloud more accessible
                My vision for the cloud is an intelligent technology that organizations can literally just plug into without the need for excessive configuration, security measures, and other manual interventions. All of these things need to be automated and policy-based, but making this vision a reality will take a lot of invention, systems work, and integration. But, that’s the direction we need to take if cloud computing is to achieve its full potential.
                Cloud environments today, in general, are far too siloed, complex, and inefficient to really deliver on their full potential. But as we move forward in time, the cloud can become so much easier to use and so much more automated than it is today. We want to give customers the best of both worlds—on-premises access to resources when they want it and access to the public cloud when they need it—seamlessly.
                security solutions
                Right now, our particular focus is on securing the pieces in the middle of the security equation. How can we secure data center access through a firewall? That’s Dell SonicWALL™ software. How can we secure access to applications and databases? That’s where the Quest™ identity and access management solutions come in. How can we measure and monitor all of these parts to build confidence that security has not been breached? Dell SecureWorks provides security monitoring and risk remediation services. And finally, how can we enforce security policies on the endpoints of the data environment? Dell AppAssure™ and Dell KACE™ software address that area. Dell Software is all about making sure that the right people get access to the right data, and that the wrong people do not get access. Risk management and secure access to information are at the core of all of these solutions.
                It’s a big, complicated world out there. A threat environment that once comprised casual hackers has evolved into a complex landscape of advanced persistent threats—including industrialized espionage, or cyber-espionage—in many places around the world. One important aspect of Dell’s comprehensive approach to security is the SonicWALL consulting service, which helps organizations safeguard their valuable data and protect the productivity of their workforce.
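The "right people get access to the right data" principle described above can be sketched as a small role-based access check. This is a generic illustration only; the names, data structures, and function are assumptions for this sketch, not any Dell SonicWALL or Quest API.

```python
# Minimal sketch of the identity-and-access-management idea: map users to
# roles, roles to permitted resources, and check every access request.
# All names here are illustrative, not part of any Dell or Quest product.

ROLE_GRANTS = {
    "dba": {"customer_db", "audit_log"},
    "analyst": {"sales_warehouse"},
}

USER_ROLES = {
    "alice": {"dba"},
    "bob": {"analyst"},
}

def can_access(user: str, resource: str) -> bool:
    """Return True only if one of the user's roles grants the resource."""
    return any(resource in ROLE_GRANTS.get(role, set())
               for role in USER_ROLES.get(user, set()))
```

With these tables, `can_access("alice", "customer_db")` succeeds while any request from an unknown user fails by default, which is the deny-by-default posture such products aim for.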
                big data analytics
                To help improve efficiency, the Dell Quickstart Data Warehouse Appliance provides a prepackaged solution that combines Dell PowerEdge™ 12th-generation servers, the Microsoft SQL Server® database, Dell Boomi™ cloud-based data integration software, and Dell-provided consulting and training services.
                We also offer database tools that allow organizations to go back and forth between conventional data sources and open source solutions such as the Dell | Cloudera Apache Hadoop solution. Our Dell Toad™ family of products has been enhanced to support big data as well as conventional relational data management tasks. On the services side, we have created Hadoop offerings that enable organizations to gain access to the power of Hadoop without having to set it up themselves. They can deploy Hadoop in production environments quickly and transform large data sets into intelligent information. And our Dell Boomi solution makes it easy for organizations to integrate data from various sources within a single data warehouse for analysis.
                And, we have only scratched the surface. We can do so many other things to make it easy for people without data science skill sets to collect and analyze data for enhanced decision making in business settings. This data analysis area is where we are going to see a lot of investment from Dell over the next couple of years.
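The Hadoop-style processing mentioned above follows a map/reduce pattern: map each input record to key/value pairs, then aggregate by key. The single-process sketch below illustrates that flow only; it is not Dell's Hadoop offering or any real cluster API.

```python
# Toy single-process illustration of the map/reduce flow: a word count,
# the canonical Hadoop example. On a real cluster the map and reduce
# phases run distributed across nodes; the logic is the same.
from collections import defaultdict

def map_phase(records):
    """Map: emit a (word, 1) pair for every word in every record."""
    for line in records:
        for word in line.split():
            yield word.lower(), 1

def reduce_phase(pairs):
    """Reduce: sum the values for each key."""
    totals = defaultdict(int)
    for key, value in pairs:
        totals[key] += value
    return dict(totals)

counts = reduce_phase(map_phase(["big data", "Big insight"]))
# counts maps each word to its frequency across all records
```

The same two-phase shape scales from this toy to terabyte data sets, which is why packaged Hadoop offerings can hide the cluster plumbing without changing how the analysis is expressed.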
                bring-your-own-device (BYOD)
                Looking ahead, the BYOD trend presents an enormous opportunity for Dell to offer additional products that manage personal and mobile devices. It also provides the software and services that help organizations simplify IT and derive added value from their systems. The cloud, mobile devices, converged infrastructure, social media—all of these trends have very positive implications for our customers if they have the tools to manage them securely. And that’s obviously where we at Dell Software come in.

                More information:

                Dell Targeting $5 Billion in Software Sales, Swainson Says [Bloomberg, July 20, 2012]

                Dell plans to build or acquire software in areas including computer security, PC and server management, data analysis and business applications for midmarket customers, he said. … It may also compete with SAP AG (SAP) and Oracle Corp. (ORCL) in some segments of the business-applications market, said Swainson. … “Companies like IBM, HP and Dell have to provide a computing platform with the server and the software as a service,” he said. “That’s what all these acquisitions and vertical integration are about.”

                Dell Outlines Big Software Ambitions [InformationWeek, July 20, 2012]

                Its target buyer is the often overlooked small to medium-sized company with 215-2,000 employees, said Swainson. These companies have small IT staffs with large responsibilities. “The sweet spot for Dell is the mid-market…We want to produce a set of solutions designed for that market,” Swainson declared. … Dell will also get into business applications but it has no intention of going head to head with Oracle or SAP, two of the largest application suppliers. Both tend to address customers above the mid-market and both are key Dell business partners, he noted. … Dell faces a formidable task in training its large direct salesforce and many channel partners to add software products to the long list of Dell hardware they are already trying to sell, said Swainson. IBM spent 20 years converting itself from primarily a hardware company into a server company that also sold services and software. … To get to $5 billion, “it won’t take us 20 years, but it will take us longer than a year and half,” he noted.

                Dell Power Solutions Magazine 2012 Issue 4, Dec 7, 2012

                Special section: Dell Software

                  • Unfolding strategic new dimensions [Jan 27, 2013] excerpts giving a brief overview of the article describing the current software portfolio:

                    – The Quest™ Identity and Access Management family adds to the solid set of Dell SonicWALL™ and Dell SecureWorks assets.
                    – Dell AppAssure. From data centers to the cloud, Dell AppAssure™ software is a backup solution well suited for virtual, physical, and cloud environments.
                    – Dell Boomi. Organizations can deploy Dell Boomi AtomSphere™ software to connect any combination of cloud, software-as-a-service (SaaS), or applications on-premises without requiring appliances, additional software, or coding.
                    – Dell Clerity Solutions provides application modernization, legacy system rehosting, and capabilities that enable Dell Services to help organizations reduce the cost of transitioning business-critical applications and data from legacy computing systems to innovative architectures—including cloud computing.
                    – Dell KACE. Servers, desktops, and laptops can be managed cost-effectively with Dell KACE™ systems management appliances, which provide time-savings benefits for systems management professionals and their organizations. The Dell KACE appliance-based architecture provides easy-to-use, comprehensive, and end-to-end systems management.
                    – Dell Make Technologies. Application reengineering is a key capability in the growing field of application modernization and an important area of investment for Dell Services. Dell Make Technologies offers application modernization software and services that help reduce the cost, risk, and time required to reengineer applications.
                    – Dell SecureWorks provides automated malware detection and analysis with real-time protection, 24/7 monitoring and response by security experts as needed, and security consulting and intelligence to identify gaps or respond to incidents.
                    – Dell SonicWALL dynamic network security and data protection enable Dell to provide comprehensive Dell next-generation firewall and unified threat management solutions as well as secure remote access, e-mail security, backup and recovery, and management and reporting. Its Global Management System (GMS) enables network administrators to centrally manage and provision thousands of security appliances across a widely distributed network.
                    – Dell Wyse desktop and mobile thin clients provide low-energy, highly secure, cost-effective access to data. Dell Wyse PocketCloud™ software—a remote desktop client—provides enterprise-grade access to cloud services along with desktop and enterprise applications, and it helps extend the benefits and security of virtual desktop infrastructure (VDI) environments to mobile phones and tablets. In addition, organizational and end user–owned devices can be managed from profiles that are set up using a single, cloud-based console in Dell Wyse Cloud Client Manager.
                    – Dell OpenManage Essentials. Centralized monitoring of Dell servers, networking, storage, and client systems is available in Dell OpenManage™ Essentials (OME) version 1.1 software—a complimentary download from the Dell Support site. This one-to-many hardware management console helps reduce the complexity of common management tasks.
                  • Defending against advanced persistent threats
                  • Gaining holistic insight into enterprise networks
                  • Boosting virtual desktop performance with compact cloud clients
                  • Business analytics: Gaining a competitive edge from the data deluge
                  • Migrating to Windows 8 for heightened productivity
                  • Accelerating the benefits of Windows Server 2012

                BYOD Reality Check: Focusing on users keeps companies ahead of the game by Tom Kendra Vice President and General Manager, Dell Software Group [Direct2Dell blog, Jan 28, 2013]

                If you are involved in the Systems Management business or follow it, you can’t help thinking about the incredible rate of change going on! Advances in Virtualization, Converged Infrastructures, Cloud Computing and an explosion in end-user devices are driving the need for a new generation of management and operations solutions. At Dell, we intend to lead in defining and delivering on that next generation of solutions.

                It is impossible to discuss all of these trends and what they mean in a single article. Over the next couple of months, we will provide points of view on each. Today, let’s start with the trend that many of us actually participate in—bringing our own laptops, phones and smart devices into our work environments.  This is commonly referred to as Bring Your Own Device, or BYOD. Many companies are actively working on their BYOD strategies and we recently conducted a study to get some insight on their approaches.
                The results of our recent global BYOD survey confirm what we have long suspected: organizations that build their BYOD strategies around the users realize a higher sustainable business benefit than those that focus their strategies solely on devices, or are slow to adopt BYOD at all. Survey responses indicate that three-quarters of organizations deploying a mature, user-centric approach to BYOD have seen improvements in employee productivity, customer response times and work processes, giving them a secure competitive advantage over those that don’t.
                We weren’t surprised by this. We know that early on, our customers’ first reaction to employee requests to use their own devices for work produced a scramble to figure out how to manage all those devices. Security was, and still is, of paramount importance. Over time, though, as their BYOD strategies matured, some IT organizations began to realize that by focusing on the users, they could respond quicker to the changing demands of the organization. They didn’t have to address those changes on every smartphone, tablet, laptop and any other device their employees bring to work, and, by focusing their BYOD strategy on managing user identities, they could resolve their concerns about security and other issues like access rights and data leakage, and still give employees everything they need to do their jobs.
                Our survey polled almost 1,500 IT decision-makers across the United States, United Kingdom, France, Germany, Spain, Italy, Australia, Singapore, India and the Beijing region. The results showed that more than 70 percent of those companies have realized benefits to their corporate bottom lines. Even more significantly, 59 percent say that without BYOD, they would be at a competitive disadvantage. Two-thirds of the companies surveyed said the only way BYOD can deliver significant benefits is if each user’s specific rights and needs are understood. Among respondents that both encourage BYOD and deploy a mature, user-centric strategy, this number jumped to three-quarters. They also reported that BYOD provides their employees the benefits of more flexible working hours, and increases morale and provides better opportunities for teamwork and collaboration. Overall, survey respondents with a user-centric BYOD strategy reported significant, positive improvements in data management and security, in addition to increased employee productivity and customer satisfaction.
                The survey results have confirmed for us, without a doubt, that organizations still trying to address BYOD by managing devices, or that have been slow to adopt BYOD at all, risk competitive disadvantage. The highest competitive edge, in terms of the increased business value gained from greater efficiency, productivity and customer satisfaction, goes to those embracing user-centric BYOD.
                We invite you to explore the key findings of Dell’s survey in our whitepaper, and if you want to “see” how this data reinforces our perspective on the importance of a user-centric management strategy for BYOD, take a look at our new infographic (Note: click on the image below to see a larger version of it, or you can download a copy of the PDF here).

                [Infographic: key findings from Dell’s global BYOD survey]

                Dell Names John Swainson President of New Software Group [Dell press release, Feb 2, 2012]

                • Software Group created to enhance solutions capabilities
                • Expanded software focus will extend Dell ability to improve customers’ productivity
                Dell today announced the appointment of John Swainson to serve as President, Software Group, effective March 5, 2012. Mr. Swainson will report to Michael Dell, chairman and CEO of Dell.

                The Software Group will build on Dell’s software capabilities and provide greater innovation and organizational support to create a more competitive position in delivering end-to-end IT solutions to customers. The organization will add to Dell’s enterprise solutions capability, accelerate profitable growth and further differentiate the company from competitors by increasing its solutions portfolio with Dell-owned intellectual property.

                “John is an outstanding leader with an unparalleled record of achievement,” said Mr. Dell. “He brings to Dell extensive experience in leading and growing software businesses, unique expertise in managing complex software organizations, and a passion for listening to and serving customers. I look forward to working with John as he expands our enterprise solutions and builds on our software capabilities.”
                “This is an exciting time to join Dell,” said Mr. Swainson. “As a leading IT solutions provider, Dell brings key assets and advantages to the software sector, including a strong global brand, a diverse global customer base and customer loyalty that creates opportunities to expand relationships with software.”

                The Software Group will bolster Dell’s ability to execute in several strategic areas critical to its customers. The combination of strong internal development capabilities in hardware, software and services gives Dell the ability to serve the largest possible group of customers within the $3 trillion technology industry.

                “The addition of software, both within the Software Group and across all of Dell, will help catalyze our transformation,” Mr. Dell said. “As software will be a part of all of our products and services, the group’s success will largely be measured by the success of Dell overall.”

                Most recently, Mr. Swainson was senior advisor to Silver Lake, a global private equity firm. Prior to Silver Lake, he was CEO and director of CA, Inc. from early 2005 through 2009. Under his leadership at CA, the company significantly increased customer satisfaction, its operating margins, and revenue.

                Prior to CA, John worked for IBM Corp for more than 26 years, holding various management positions in the U.S. and Canada, including seven years as general manager of the Application Integration Middleware Division, a business he founded in 1997. During that period, he and his team developed the WebSphere family of middleware products and the Eclipse open source tools project. He also led the IBM worldwide software sales organization, and held numerous senior leadership roles in engineering, marketing and sales management.
                Mr. Swainson holds a bachelor’s degree in engineering from the University of British Columbia, Canada.

                John Swainson [Forbes profile, Aug 10, 2010]

                … Mr. Swainson is also a Senior Advisor to Silver Lake Partners, a global private equity firm, which he joined in June, 2010. Mr. Swainson advises Silver Lake’s portfolio companies on value creation activities. …


                The Indian case as a proofpoint of readiness 

                ‘Software’s becoming key to our biz, and so is Bangalore’ [The Times of India, Jan 9, 2013]

                Marius Haas President, enterprise solutions, Dell
                As Dell works to transform itself into an enterprise solutions and services company, Marius Haas has a pivotal role. He heads the $63-billion company’s enterprise solutions business. He joined Dell last year from investment firm Kohlberg Kravis Roberts & Co. Prior to that, he was senior VP in Hewlett-Packard. Haas was recently in India, where Dell has a quarter of its 1.1 lakh (110,000) employees, and spoke exclusively to TOI.
                How important is the India enterprise market for Dell?
                The top ten markets in the world represent 70% of the total spend in the enterprise space for the things that we do. Out of the top ten markets, three markets represent 60% of the incremental spend over the next three years. And those three are India, China, and the US. So the India market is very, very important to us. You can imagine that we are gonna be focused quite a bit on what we can do for this market.
                What segments of industry do you see demand coming from?
                In India I think 80% of the growth comes from customers that are 500 employees or less. So clearly we need a small-business-led market strategy for the solutions we create. You will see us with solutions that bring together server, storage, networking in a very scalable way, so that you buy what you need, at the scale that you need, at the price points that you need. They are pre-integrated, pre-configured, and designed to run specific workloads. For small businesses, it will save a lot of bother in trying to put together systems from different components.
                Several IT vendors today talk of pre-integrated stacks. Do you see customers opting for such stacks?
                The estimate is that 30% of the enterprise purchases in 2016 will be with a systems view (pre-integrated, pre-configured stacks). There will be cannibalization of the traditional silo selling model – of buying servers, storage, networking separately. All of a sudden a big part of how people are thinking is, I want to buy the cloud solution that enables me to run application X, Y and Z. So we recently announced our Active Systems infrastructure family that brings together server, storage, networking all in one chassis with one common management capability. It requires 75% fewer steps from the time you receive it to the time you are actually running workloads. We have optimized all components to work together for specific workloads in such a way that it generates 45% better performance per watt than what’s out there from the competition. Saves money for our customers.
                Is your India R&D contributing to these systems?
                Clearly if you are going to go towards a more systems view, there will be a lot more focus on software. Software provides the value add to servers, storage and networking coming together. Our Bangalore team has capabilities in servers and specifically around software. A big part of the management capabilities built into the system is done by a team here in India. The skill sets and capabilities in India are part of the core competency that we need today. Indeed, one of every four of our servers sold worldwide is sold with work done in Bangalore. And that’s what gives us the confidence to do more here.

                SME Channels : Ajay Kaul, Head, GCC Dell India talking about the company’s growth strategy [smechannels YouTube channel, Feb 6, 2013]

                Watch Ajay Kaul, Head, GCC Dell India talking about the company’s growth strategy … interview conducted by Sanjay Mohapatra, Editor, SME Channels

                + [8:39] I believe Dell is moving to the services business …
                + [10:38] How would you help partners create their own brands?
                + [12:20] How fast are you in integrating all the products and go to market?
                + [13:58] How do you engage your finance arm to enable the partners?
                + [16:30] What is your strategy around cloud computing for the partners?
                + [17:36] What is your investment roadmap in terms of technology for this year?

                Dell’s 7 strategies to stay top of mind for channel partners By Ajay Kaul [The DQ Week, Feb 5, 2013]

                What are the strategies that the companies can adopt to ensure that they keep their channel partner programs alive and thriving?
                Putting together an effective channel partnership program to take the company’s products and services to market can be just as challenging as it is rewarding. A good channel partner program does not end with identifying and enrolling like-minded and trustworthy resellers. It goes on to nurture and nourish these relationships through a host of incentives, training initiatives and many long-term measures.
                Those who recognize the economies of scale that such programs bring are also aware of how vital it is to stay top of mind at all times. In order to leverage the considerable boost that these can bring to revenues and sales, companies need to ensure that their resellers acknowledge them as a priority over the competition. This is easier said than done. Channel partners sell what they know best and in today’s competitive landscape, where resellers have the choice of dozens of brands, it becomes imperative to stay top of mind at all times.
                What strategies can companies adopt to ensure that they keep their channel partner programs alive and thriving? While most dealers and distributors will always be more attracted to methods that help them boost margins, they are also enthusiastic about measures that will help them address their challenges of training and retention of sales staff, competition, product and service expertise or growing consumer loyalty.

                Here are seven strategies from Dell that can help ensure a win-win environment for both reseller and your company:

                Invest in your channel partner’s success: Channel partners need to know that they are an important part of your company strategy and they need to feel the benefits of their association with you, through better margins, training and other initiatives that create success opportunities for them.
                Focus on their profitability and they will focus on yours: The conditions you create for your partners need to be win-win for both sides. Last year, Dell announced a new GCC (Global Commercial Channel) structure, which is a single point of contact for partners, with an aim to increase productivity and improve time cycles and enable more customized programs for partner support. The new structure protects partner profitability by bringing consistent pricing across different Dell commercial businesses and offers the partners growth opportunities with solution centric offerings and a broader end customer base.
                Provide Product Support: The more your partners know of your products and services the easier they will find it to sell. Partners who have access to information and the means to understand your company offerings are more likely to push your products with their customers. Structured programs to boost product knowledge and bring to the forefront product and service USPs will equip partners with the right knowledge to sell your products.
                Continuous education programs for channel partners: Channel partners need to be constantly reminded about your product or service. What better way than through education programs? Dell offers over 100,000 training sessions a year to all partners globally and Dell’s Engineers Club further invests in the development of individual engineers and partners by bringing together technical experts and pre-sales and post-sales engineers across the IT industry to network, exchange ideas, and share industry trends and best practices with the channel partners.
                Listen to your partners: They can keep you in-tune with the pulse of the market. Structured listening programs will give partners a platform to voice recommendations and act as an additional source of market information.
                Incentivise your partners: Create exciting incentives for sales, profits, rewards & recognition. Dell’s PartnerDirect program features a structure which rewards certification and training, including new rebates for premier partners, expanded deal registration terms, financial incentives, and marketing and technical assistance. Dell has 115,000 partners globally, in its highly successful PartnerDirect model. Dell has also doubled its channel sales force and has added more enterprise specialists enabling and supporting the partners to address customer needs and optimally provide solutions within limited IT budgets.
                Make sure your program is high visibility and high impact: Don’t forget that your competition may be wooing your partners away from you. Your partner program needs to be more visible, more impactful and needs to give your partners what they need to sell for you.
                A satisfied channel partner will push your brand with their customers, protect your margins and will also be more accommodating to your needs. Needless to say, a poor channel relations strategy will have just the opposite impact on your company margins and sales.

                Dell GCC Engineers’ Club Now in India [SME Channels, Jan 11, 2013]

                To build on existing GCC initiatives to strengthen and showcase its commitment to its partner community
                Dell’s Global Commercial Channel (GCC) has launched the Dell Engineers Club in India, as part of their long-term commitment to channel partners in the country. The platform will enable technical experts across the IT industry to network, exchange ideas, and share industry trends and best practices.
                This club will also help train channel partners and their engineers to be knowledgeable in Dell’s advanced server, storage, security, networking and cloud solutions, announced the company’s press release.
                The company further announced that Dell’s long term aim is to qualify its partners to become not just the solutions provider but to be considered IT consultants for their end-customers. Dell believes in empowering their customers with the ‘Power to do more’, and therefore aims to create and offer real solutions with the intention of making technology smarter, more effective, and in service of its end-customers.
                Ajay Kaul, Director & GM (Global Commercial Channel), Dell India, said, “Dell’s GCC business is very committed to the Indian market and the Engineers Club aims to strengthen the enterprise knowledge of our partner community, helping them become consultants for their end-customers.”
                Dell offers over 100,000 training sessions a year to all partners globally and the Dell’s Engineers Club will further build on this initiative to invest in the development of individual engineers and partners.
                Dell’s Global Commercial Channel (GCC) division retains around 1700 commercial relationships in India. The division takes care of programs and policies relevant to channels, which cover all types of business entities such as public companies and large-/medium-sized companies.

                See also:
                Dell Global Commercial Channel Launches Dell Engineers Club in India [Dell India press release on BusinessWire India, Jan 10, 2013]
                After China, Dell introduces Engineers Club in India [The DQ Week, Jan 10, 2013] from which the following excerpt adds to the above important information:

                Ajay Kaul, director and GM, global commercial channel, Dell India, informed, “This program has been extended by Dell to the Indian market to cater to the market potential in India and we feel it is important for us to bring the Indian channel partners at par with their global counterparts. As a start, the Dell’s Engineers Club is by invitation only. Partners with a certain level of certification already attained from us through the Partner Direct program will be sent an invitation to join this club. In that invitation, we will include details on where and how to sign up. Once their registration is approved, they will have access to all the programs and activities under this initiative. At the start of the program, we will be looking at a limited number from the top 8 and will expand the program to more partners from the top 11 cities by the end of the month.”
                With the recent acquisitions of companies like Quest Software, SonicWALL and Wyse, Dell has added extensively to its solutions portfolio with leading management, security, virtualization and cloud capabilities. The focus on these enterprise solutions and services creates tremendous opportunity for its channel partners, hence the necessity to ensure that partners receive the training required to understand the extended portfolio of solutions and services and to provide customers with the right solutions and advice. The Dell Engineers Club is designed to provide maximum training about datacenter solutions so that partners are better informed and can rise to becoming IT consultants to end-customers rather than just solutions providers.
                “Our channel partners play a significant role in our business, 25 to 50 percent of our commercial business, depending on country to country. In some countries, it’s 100 percent and we see it growing further. India is a very important market as far as our partner community is concerned. We engage with our partners in this region at the highest level ensuring that the programs and policies designed are favorable to their benefits which leads to their overall growth,” said Kaul.

                See also:
                DELL Partners with HCL Infosystems for Distribution of Enterprise Products [HCL Infosystems Ltd. press release on BusinessWire India, Jan 10, 2013]

                  • DELL enters into a strategic partnership with Digilife Distribution and Marketing Services (DDMS), distribution arm of HCL Infosystems
                  • DELL takes the next leap in enhancing its commercial and enterprise solutions offering through this new distribution partnership, which is a further expansion of Dell’s PartnerDirect program that has developed a significant number of the commercial channel partners in India
                  • Partnership to target Mid-Market customers

                Dell’s Global Commercial Channel (GCC) division retains around 19,000 commercial partners in the Asia Pacific region. The division takes care of programs and policies relevant to channels, which cover all types of business entities such as public companies and large-/medium-sized companies. In India, Dell currently engages with 1700 commercial channel partners, and this agreement will further strengthen the reach of its enterprise solutions to key markets.

                The partnership will enable DDMS to supply the complete range of Dell Enterprise Products and Services. HCL‘s DDMS will help boost Dell’s growth through the distribution providers in the market. HCL Infosystems’ widespread network of distributors will further ensure a robust funnel for Dell products and services.
                In the past two years, Dell has made 15 strategic acquisitions to enhance its capabilities as an end-to-end solutions provider and has carefully aligned its channel program with the acquisitions it makes. To enhance its security capabilities, the company recently acquired SonicWALL, Inc. With an immense focus on the distribution of its products and on its channel partners, Dell has offered SonicWALL’s existing channel partners an opportunity to join the company’s current PartnerDirect program, which will enable them to preserve the investments made with SonicWALL. Also, in order to offer the best to the channel partner community, the company will take the best of SonicWALL’s channel programs and model and combine it with Dell’s PartnerDirect program. This move has not only provided the best for the channel partners, but has also expanded Dell’s own channel team’s customer relationships by further enabling its existing partners to sell SonicWALL solutions.

                Ajay Kaul to head Dell’s Global Commercial Channel biz in India [exchange4media News Service, Nov 8, 2012]

                Dell India has announced that Ajay Kaul, Director & General Manager, will lead the Global Commercial Channel (GCC) business for Dell in India. Kaul’s focus as business leader will be to oversee the expansion of Dell’s partner community and its growth in the upcountry markets. As the GCC Business Head, Kaul will also focus on strengthening the company’s relations with its partner community.
                During his seven-year tenure at Dell India, Kaul was Director – Sales for the Public, Education and Healthcare business from February 2009 to August 2012 in the South & West Region across Central / State Government, PSU, defense and covering all products of Dell for revenue, margin and market share growth. As the Regional Enterprise Manager from 2007 to 2009, he headed the pre-sales team and managed the servers and storage business in North and East region across large enterprises and government segment. Kaul had joined Dell in May 2005 and managed key global accounts to grow revenue and profitability covering all products.
                Dell’s Global Commercial Channel (GCC) division was created in early 2011 with an aim to be a single contact point for its commercial channel partners, thereby leading to higher productivity and improved time cycles and enabling more customised programmes to support the partners in the market. The GCC team is responsible for designing and implementing profitable schemes and policies for Dell’s channel partners and collecting and using channel feedback to execute best structures for its channel partners.
                Dell currently engages with 1,700 commercial channel partners in India, which cover all types of business entities such as public companies and large-/medium-sized companies.

                Dell’s position on the Indian market two years ago, and the approach taken by the company to achieve it, is well described in How Dell conquered India [CNNMoney, Feb 10, 2011], at the end of which the position is summarized as:

                For Dell, India has emerged as a local and global service delivery hub. It is the only market outside the U.S. with all business functions—customer care, financial services, manufacturing, R&D, and analytical services—operational at the local level and giving global support. “We evaluated market trends and growth potential, enabling us to invest ahead of the curve in India, resulting in our phenomenal growth,” says Midha. It is a growth story that resonates around the world.

                Dell India has made big progress not only relative to that position but in the enterprise business as well. See CIO CHOICE 2013 Awards Recognizes Dell for its Outstanding Performance in Server, Storage and Data Center [Dell India press release on BusinessWire India, Feb 4, 2013]

                Dell’s commitment to addressing CIO needs with their best in class technology and customer commitment wins them accolades

                Bangalore, Karnataka, India, Monday, February 04, 2013 (Business Wire India)
                Dell India has been awarded the CIO CHOICE 2013 award for its solutions in the Server, Storage – Hardware, Data Centre Consultant and Data Centre Transformation Services categories. The CIO Choice Awards is a B2B platform positioned to recognize and honour products, services and solutions on the back of stated preferences of CIOs and ICT decision makers. These awards demonstrate Dell’s “best-in-class” ability and commitment to meeting CIOs’ evolving needs in today’s dynamic business environment.
                The selection process for the “CIO CHOICE award” is conducted via an independent advisory panel of eminent CIOs and an independent survey vote of CIOs and ICT decision makers from across the country.
                Sameer Garde, President and MD, India Commercial Business, said “Dell has been investing in its enterprise capabilities and building solutions that address the business goals of customers. Being honoured by the CIO Choice award so early in our transformation into an end-to-end solution provider is truly a cherished achievement and a testimony to the efforts of the Dell India team. It shows that our open, scalable and affordable solutions have resonated well with customers and that we are well on our way to becoming the preferred choice for enterprise solutions.”
                Commenting on Dell’s success in the enterprise space, Venu Reddy, Research Director, IDC India, said, “The infrastructure market has been showing some positive signs in the current marketplace. This is due to some segment-specific traction and focus by key vendors like Dell. In the server market, the stabilization and growth is driven by key industries like Finance & Insurance, Distribution, and Manufacturing, which have driven a 12% growth year-on-year for the first three quarters of 2012. In the storage market, additional momentum has come from mid-size organizations which have started investing in key infrastructure that is helping them drive faster growth and better ROI.”
                With its strongest-ever enterprise product line-up, Dell today is innovating and expanding its enterprise offerings to customers. Moving out of legacy systems is one of the biggest challenges most Indian CIOs are faced with. Dell works closely with customers to help them move from their existing applications to newer platforms without hurting their IT budgets.
                “Dell has been our partner in data centre management and has helped us focus our resources on our business and customers instead of worrying about our IT infrastructure. Dell’s solutions in storage, servers and data centre bring more flexibility and resilience and optimize security and costs while lowering downtime. We would like to congratulate Dell on winning the CIO award, which is a demonstration of Dell’s ability to understand and deliver on CIO needs in these changing markets.” – Rinosh Jacob Kurian, Enterprise Architect, UST Global
                “In today’s always-on marketplace and turbulent business environment, a partner like Dell is truly an asset. Dell helps us manage our datacenter and server and storage requirements to deliver better business results and market success. Over the past years of our association with Dell, they have demonstrated a strategic insight into the emerging global business scenario and have been instrumental in helping our IT department gear up to meet these challenges. Dell is truly deserving of the CIO Choice award, and we extend our congratulations and best wishes to the team at Dell.” – Subodh Dubey, Group CIO, Usha International.

                BYOD trends vs. Mobile enterprise platform trends

                With the literal explosion of mobile computing devices there is a huge challenge both on the enterprise computing vendor and customer sides. The easiest way of looking at those challenges is analyzing the so called BYOD (Bring your own device) and mobile enterprise platforms trends on the market where customers and suppliers meet each other.

                Note as well that these are all parts of a bigger trend, the so-called “consumerization of IT”, which I already covered from an overall leading-vendor point of view in the Pre-Commerce and the Consumerization of IT [Sept 10, 201] post on this site. Please read that before looking at the current trends discussed in the detailed sections below. Then I recommend reading The Changing Of The Enterprise Guard [TechCrunch, Jan 19, 2013] by the CEO of Box.com, the most successful rising star in the enterprise IT vendor space. Even the ex-Microsoft leader Steven Sinofsky recommended it in his Twitter message as:

                Interesting thoughts on enterprise computing http://techcrunch.com/2013/01/19/the-changing-of-the-enterprise-guard/ … from Aaron @levie

                Note that I will present the BYOD trend mostly through the Middle East, the region where solving the BYOD issue properly for the true enterprise space is the most pressing in the world.


                BYOD trends

                Bring your own device [Wikipedia article, started on Jan 1, 2012]

                History

                The term BYOD first entered use in 2009, courtesy of Intel, when it recognized an increasing tendency among its employees to bring their own devices to work and connect them to the corporate network.[5] However, it took until early 2011 before the term achieved any real prominence, when IT services provider Unisys and software vendor Citrix Systems started to share their perceptions of this emergent trend.

                In 2012 the Equal Employment Opportunity Commission adopted a BYOD policy, but many employees continued to use their government-issued BlackBerrys because of concerns about billing, and the lack of alternative devices.[6]

                Issues

                BYOD has resulted in data breaches.[citation needed] For example, if an employee uses a smartphone to access the company network and then loses that phone, any unsecured data stored on the phone could potentially be retrieved by untrusted parties.[7]

                It is important to consider damage liability issues when considering BYOD. If an employee brings their personal device to work and it is physically damaged through no fault of their own, it is unclear whether the company is responsible for repair or replacement.[citation needed]

                Pros

                Business

                A business that adopts a BYOD policy allows itself to save money on high-priced devices that it would normally be required to purchase for its employees. Employees may take better care of devices that they view as their own property.[citation needed] Companies can take advantage of newer technology faster.[citation needed]

                Employees

                Employees who work for a business with a BYOD policy are able to decide on the technology that they wish to use for work rather than being assigned a company device. This is thought to improve morale and productivity.[8] Exclusive control of features is given to the employee.

                Cons

                Business

                Company information will often not be as secure as it would be on a device exclusively controlled by the company.[citation needed] (Security professionals have termed it ‘Bring Your Own Danger‘ and ‘Bring Your Own Disaster‘.[9]) The company may have to pay for employee devices’ phone service, which they use outside company time. BYOD is an extreme case of the end node problem.
                Employees

                Due to security issues, the employees often do not have true full control over their devices[citation needed], as the company they work for would need to ensure that proprietary and private information is secure at all times. It is an out-of-pocket expense for the employees. They would be responsible for repairs if their devices were damaged or broken at work.[citation needed]

                Businesses that fall under compliance rules such as PCI or HIPAA must still comply when using BYOD.[citation needed]

                Prevalence

                The Middle East was reported to have one of the highest adoption rates of the practice worldwide in 2012.[10]

                [10] El Ajou, Nadeen (24 September 2012). “Bring Your Own Device trend is ICT industry’s hottest talking point at GITEX Technology Week”. AMEinfo.com. Retrieved 26 September 2012.

                Frost & Sullivan: Consumerisation of Smart Phones and Bring Your Own Device (BYOD) are the biggest trends driving the Network Security Market in the Middle East [Frost & Sullivan press release, Nov 12, 2012]

                Dubai, the U.A.E., 21 November, 2012 – With an increase in the number of Advanced Persistent Threats (APTs), information security risks are becoming a major concern for organisations globally. Enterprises are swiftly adopting and deploying applications and new services to combat the same. In their quest to obtain high levels of security assurance and develop advanced intelligence technologies, organisations in the Middle East are increasingly adopting methods such as virtualisation and cloud computing. Over the past few years, this has led to increased Government investment in information and communication technology (ICT)-related projects in the Middle East and this is expected to proliferate further in future. To address these threats to enterprise security and brainstorm best-in-class Enterprise Security Solutions and Strategies, Frost & Sullivan convened the best minds in enterprise security at its Middle East Enterprise Security Summit 2012 on November 21, at Habtoor Grand Beach Resort, Dubai, U.A.E.

                Held for the first time in the Middle East, the Summit was attended by CIOs, CISOs, CTOs, Vice Presidents, General Managers, Network Managers, Enterprise Security Architects, Internet Security Architects, Compliance Officers, and Department Heads from across a variety of industry sectors such as Banking, Finance & Insurance (BFSI); Telecom; IT; Manufacturing; Government; Education; Healthcare; Media and Entertainment; Retail; and Automotive and Logistics.

                According to Frost & Sullivan, consumerisation of smart phones and Bring Your Own Device (BYOD) are the biggest trends driving network security issues in the Middle East today. The network security market is in a high-growth stage. Frost & Sullivan anticipates that technology convergence, regulatory compliance, and continuous growth of network infrastructure will continue to drive up sales for security suppliers in the Middle East during the period 2012-2018.

                Frost & Sullivan’s Middle East Enterprise Security Summit 2012 Summit began with an inaugural address by Andy Baul-Lewis, Director, ICT Practice, Frost & Sullivan, describing the prevalent enterprise security landscape in the Middle East. “Building security for electronic assets is one of the most critical tasks facing organisations today. In a converged world, where the threats of each system are multiplied; getting advice, sharing best practice, and talking to partners is a vital part of the construction process. This is what Frost & Sullivan endeavours to provide through this interactive Summit,” stated Mr Baul-Lewis at the Summit.

                The Summit included in-depth discussions and case studies on enterprise security management. The first of these was ‘The Evolving Role of a Chief Information Security Officer’ by Roshan Daluwakgoda, Senior Director – Security Strategy Planning Risk Assessment and DR at Emirates Integrated Telecommunication Company, du, Dubai, the U.A.E. This was followed by a thought-provoking presentation, ‘Information Security Management – When the Going Gets Tough,’ by Kamran Ahsan, Head of Information Security, Injazat Data Systems, the U.A.E. Bashar Bashaireh, Regional Director, the Middle East, Fortinet, gave a presentation, ‘How to Make your Security Aware in a BYOD World’. Thameem Rizvon, IT Director, Kamal Osman Jamjoom Group LLC (KOJ), presented ‘Learn from your Peers: Security Implementation in a Retail Environment’. The session ‘Secure the Cloud,’ by Joe So, VP Business Sales, Huawei, was followed by a panel discussion on ‘Security Convergence and its Impact on Business.’

                Speaking on the occasion, Kamran Ahsan stated, “Information security is increasingly emerging as a critical concern in today’s modern business environment. This trend is very much evident in the Middle East, where enterprises have experienced information-related threats such as infiltration, data leakage, and cyber warfare among others. Injazat Data Systems will highlight how enterprises can proactively address these challenges and mitigate risks associated with business assets and services of enterprises. Moreover, with the best minds in enterprise security attending this Event, we expect to have an in-depth discussion of new trends and developments in information security in the Middle East.”

                Sharing his views on the Summit, Bashar Bashaireh said, “Information Technology has become central in driving the business processes of enterprises. However, as trends such as mobility, cloud computing, and BYOD are fast gaining momentum in the U.A.E., helping drive business profit and innovation; they are also bringing forth new challenges to IT security. Organisations in the U.A.E. should act now to regain control of their IT infrastructure by strongly securing their network and applying granular control over users, devices, and applications. The Summit organised by Frost & Sullivan is a great platform for us to share with end customers our insights on the new approach aimed towards IT security.”

                Talking about Securing the Cloud, Dong Wu, Vice President, Huawei Enterprise Middle East said, “As organisations roll out cloud-based models into their business infrastructure, the issue of security becomes an ever increasing concern.  The Middle East Enterprise Security Summit is a way for Huawei and other industry leaders to come together and discuss how businesses can be better secured and protected from the fast-evolving cyber threats that exist today. At the summit, we look forward to sharing our insights on how organisations can improve their planning processes before making their move into the cloud.”

                The Summit was supported by Injazat as Platinum Partner, while Fortinet and Huawei were the Event’s Silver Partners. Telecom Review, Teknotel and Connect-World Magazine supported the Summit as Media Partners; with Tech Channel MEA as the Online Partner for the event.

                If you are interested to know more about insights shared at the Middle East Enterprise Security Summit 2012 then send an e-mail to Tanu Chopra/Deepshri Iyer, Corporate Communications, at tanu.chopra@frost.com/deepshrii@frost.com, with your full name, company name, title, telephone number, company e-mail address, company website, and country.

                For more information on the Summit, please visit: http://www.frost.com/EnterpriseSecurityMiddleEast

                About Frost & Sullivan

                Frost & Sullivan, the Growth Partnership Company, works in collaboration with clients to leverage visionary innovation that addresses the global challenges and related growth opportunities that will make or break today’s market participants.

                Our “Growth Partnership” supports clients by addressing these opportunities and incorporating two key elements driving visionary innovation: The Integrated Value Proposition and The Partnership Infrastructure.

                • The Integrated Value Proposition provides support to our clients throughout all phases of their journey to visionary innovation including research, analysis, strategy, vision, innovation, and implementation.
                • The Partnership Infrastructure is entirely unique as it constructs the foundation upon which visionary innovation becomes possible. This includes our 360-degree research, comprehensive industry coverage, career best practices, as well as our global footprint of more than 40 offices.

                For more than 50 years, we have been developing growth strategies for the global 1000, emerging businesses, the public sector, and the investment community. Is your organisation prepared for the next profound wave of industry convergence, disruptive technologies, increasing competitive intensity, Mega Trends, breakthrough best practices, changing customer dynamics, and emerging economies?

                Mobile application management [Wikipedia article, started on Oct 17, 2011]

                Mobile Application Management (MAM) describes software and services responsible for provisioning and controlling access to internally developed and commercially available mobile apps used in business settings on both company-provided and “bring your own” smartphones and tablet computers.

                Mobile application management differs from Mobile device management (MDM) in the degree of control that it has over the managed device. As the names suggest, MAM focuses on application management but stops short of managing the entire device. MDM solutions manage down to the device firmware and configuration settings and can include management of all applications and application data.[1]

                History

                Enterprise mobile application management has been driven by the widespread adoption and use of mobile devices in business settings. In 2010 IDC reported that smartphone use in the workplace will double between 2009 and 2014.[2]

                The BYOD (“Bring Your Own Device”) phenomenon is a factor behind mobile application management, with personal PC, smartphone and tablet use in business settings (vs. business-owned devices) rising from 31 percent in 2010 to 41 percent in 2011.[3] When an employee brings a personal device into an enterprise setting, mobile application management enables the corporate IT staff to download required applications, control access to business data, and remove locally cached business data from the device if it is lost, or when its owner no longer works with the company.[4]

                Use of mobile devices in the workplace is also being driven from above. According to Forrester Research, businesses now see mobile as an opportunity to drive innovation across a wide range of business processes.[5] Forrester issued a forecast in August 2011 predicting that the “mobile management services market” would reach $6.6 billion by 2015 – a 69 percent increase over a previous forecast issued six months earlier.[5]

                Citing the plethora of mobile devices in the enterprise – and a growing demand for mobile apps from employees, line-of-business decision-makers, and customers – the report states that organizations are broadening their “mobility strategy” beyond mobile device management to “managing a growing number of mobile applications.”[5]

                MAM system features

                An end-to-end MAM solution provides the ability to: control the provisioning, updating and removal of mobile applications via an enterprise app store, monitor application performance and usage, and remotely wipe data from managed applications. Core features of mobile application management systems include:

                • App delivery (Enterprise App Store)
                • App updating
                • App performance monitoring
                • User authentication
                • Crash log reporting
                • User & group access control
                • App Version management
                • App configuration management
                • Push services
                • Reporting and tracking
                • Usage analytics
                • Event management
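                The core capabilities listed above (app delivery, group-based access control, version management, and remote wipe of managed data) can be sketched as a minimal catalogue model. This is an illustrative sketch only; the class and method names are hypothetical and do not correspond to any specific MAM vendor’s API.

```python
# Minimal sketch of a MAM-style enterprise app store with group-based
# access control and per-device selective wipe of managed apps.
# All names here are hypothetical illustrations, not a vendor API.

class EnterpriseAppStore:
    def __init__(self):
        self.apps = {}        # app name -> published version
        self.access = {}      # app name -> set of entitled user groups
        self.installed = {}   # device id -> set of managed app names

    def publish(self, app, version, groups):
        """App delivery + version management: publish an app for given groups."""
        self.apps[app] = version
        self.access[app] = set(groups)

    def install(self, device_id, app, user_groups):
        """User & group access control: install only if the user is entitled."""
        if app not in self.apps:
            raise KeyError(f"unknown app: {app}")
        if not self.access[app] & set(user_groups):
            return False  # user belongs to no entitled group
        self.installed.setdefault(device_id, set()).add(app)
        return True

    def remote_wipe(self, device_id):
        """Selective wipe: remove managed apps/data, leaving personal data alone."""
        return self.installed.pop(device_id, set())
```

The key design point this sketch mirrors is the MAM/MDM distinction from the article: `remote_wipe` removes only the apps the enterprise provisioned, rather than resetting the whole device as a full MDM solution would.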

                The Middle East angle #1:
                Mitigating the Risks of BYOD with MAM [ITP.net, Nov 14, 2012]

                Organizations need to decide how to manage BYOD, says Johnny Karam, Regional Director, Middle East and French Speaking Africa, Symantec

                According to a recent Symantec survey, 59% of enterprises are making line-of-business applications accessible from mobile devices in an effort to increase efficiency, increase workplace effectiveness and reduce time required to accomplish tasks.

                The average annual cost of mobile incidents for enterprises, including data loss, damage to the brand, productivity loss, and loss of customer trust, was $429,000. The average annual cost of mobile incidents for small businesses was $126,000.

                According to Symantec’s State of Mobility Survey, 67% of companies are concerned with malware attacks spreading from mobile devices to internal networks. In addition, Symantec’s latest Internet Security Threat Report highlighted that mobile vulnerabilities increased by 93% in 2011.

                To manage or not to manage:

                The first question every business must ask around BYOD is: How much management of user-owned devices connecting to corporate resources does the company want? This is critical because the degree to which an enterprise is involved in managing various aspects of user-owned mobile devices has consequences. For example, a key anticipated benefit of implementing BYOD means often no longer having to fully manage employees’ mobile devices. In return, support costs are hopefully reduced.

                However, fully managing user-owned devices often results in intruding on the personal information and activity of those devices. This might include enforcing device-level authentication and encryption policies and complete device remote locking or wiping, including users’ personal content.

                Delivering corporate [apps and] resources

                Securing corporate [apps and] resources once they are delivered

                The Middle East angle #2:
                BYOD is not a new problem
                [Gitex Review 2012 published on ITP.net, Nov 18, 2012]

                Cloud and big data were the big talking points during GITEX Technology Week 2012. Leading UAE and global companies discuss those trends.

                Florian Malecki, head of product marketing at Dell SonicWALL, says that enterprises need to be prepared to allow employees to use their toys.

I like to be a bit controversial over the growing BYOD trend. If you listen to the analysts (IDC, Gartner, Forrester), they are all predicting that the number of smartphones being sold by 2014-2015 will outgrow the number of laptops being sold.

We all say that employees want to use their own device, but if you look at what they want to use, it is either a tablet or a smartphone, so companies and IT managers have to accommodate all users' needs.

                We did a survey and we looked at what devices our customers are supporting or are open to support, and there is no clear winner. If you look at it from a device point of view, there are people who want to use tablets (about 60%), people who want to use smartphones and people who want to use laptops.

                How to start
A good way to start BYOD and minimise risks is by using an SSL VPN gateway. The beauty of an SSL VPN gateway is that you are able to identify the user and the user profile, as well as identifying the device and setting up a profile for the device. You could have a profile for a managed device, or for a personal device that is registered within the corporate ID system. For any organisation, whether an SMB or an enterprise, that doesn't really know where to start the BYOD journey, implementing an SSL VPN solution like the Dell SonicWALL solution will probably meet 90% of employees' requirements when it comes to BYOD.
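The user-plus-device profiling described above can be pictured as a simple access-policy decision. Here is a minimal illustrative sketch (all names and access levels are hypothetical; a real SSL VPN gateway would derive these attributes from certificates, directory lookups and endpoint interrogation):

```python
# Hypothetical sketch of an SSL VPN gateway's access decision:
# combine user authentication with a device profile to pick an access level.

ACCESS_LEVELS = ["none", "webmail_only", "selected_apps", "full_network"]

def access_level(user_authenticated: bool, device_profile: str) -> str:
    """device_profile: 'managed', 'registered_personal', or 'unknown'."""
    if not user_authenticated:
        return "none"
    if device_profile == "managed":
        return "full_network"          # corporate-owned, fully trusted
    if device_profile == "registered_personal":
        return "selected_apps"         # BYOD device known to the corporate ID system
    return "webmail_only"              # unknown personal device

print(access_level(True, "managed"))             # full_network
print(access_level(True, "registered_personal")) # selected_apps
print(access_level(False, "managed"))            # none
```

The point of the sketch is the two-dimensional decision: the same authenticated user gets different levels of access depending on which device profile the gateway matches.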

                How to control BYOD
                The threat of personal devices on a corporate network is a big problem, according to Darren Gross, EMEA senior sales director, Centrify, and companies must be able to control information on those devices.

Security compliance experts Centrify have released mobile device management software, which integrates a single identity for each individual employee within an organisation, so wherever they go the company can control where they are going and what they are doing, through policies and security settings.

                “There is a lot of competition in that space, but we are quite unique because we come from an angle of joining the system to Active Directory, so if I leave my iPad on the train, help desk can go and remotely wipe that device so there is no threat to the enterprise,” says Darren Gross, EMEA senior sales director, Centrify.

                Enterprises also need to look at mobile device configuration to prevent viruses from accessing the corporate network.

                … <LONG>

People who use mobile devices tend to have no passcodes on them. Centrify is able to enforce passwords and encryption on a personal device accessing the corporate network.

                Cloud
The company is also developing authentication for off-premise, cloud software-as-a-service type applications, for example Salesforce and WebEx.

"Users will be able to sign on with one identity within Active Directory, so you control what a user is doing and see where they are going; there is full accountability for what individuals are doing within the organisation," said Gross.

                Disaster recovery in the region
                Yasser Zeineldin, CEO, eHosting DataFort, says the company is offering regional enterprises the opportunity to develop DR sites.

                We offer clients both in UAE and the Middle East region the ability to have a hot disaster recovery site where data is replicated between their production system and the disaster recovery system that is hosted with us. This means that in real time if there is a failure in the primary system they can switch over to the secondary system.

                <BIG DATA PART OF THE REPORT>


                Mobile enterprise computing platform

Hal's (Im)Perfect Vision on a possible (and much needed) further direction by Microsoft:
                There is no ARM in Windows RT [Jan 2, 2013]

Windows RT is the name of Microsoft's version of Windows 8 for ARM processors, right?  It's aimed primarily at Consumers, right?  Its role in business is primarily in the BYOD realm, right?  That's so 2012!  Let's talk about strategy and where I think Microsoft will go with Windows and particularly Windows RT.  And how their strategy may become more obvious in 2013.

The name Windows RT wasn't chosen to convey a message about Windows moving to ARM processors.  Nor was it chosen to convey that it was a Tablet OS.  The name appears to have been chosen primarily for one reason: it is an operating system devoted to running Windows RunTime apps.  It splits the mainstream Windows product into two families: Windows, for running Win32 "desktop" and Windows RunTime applications, and Windows RT, which drops the legacy Win32 application support.  Windows RT is Microsoft's go-forward client operating system, while Windows is the operating system Microsoft will need to keep selling and enhancing for a transition that will last a decade or more, but eventually it will be considered legacy.

I know I just sent a lot of people's blood pressure through the roof because today they either (a) dislike Metro/Modern/whatever-you-call-it, Windows RunTime, or the Start Screen and/or (b) the new environment isn't really suitable for their usage scenario.  But keep in mind I'm talking about where things are going over several releases of the re-imagined Windows.  There will be many refinements, improvements, and changes before Windows RT replaces Windows as Microsoft's primary client operating system offering.

The desktop lives forever, right?  Well, on Windows yes, but not on Windows RT.  Today Windows RT only needs the desktop for two reasons.  First, many traditional utilities from the File Explorer to much of system management are only available as desktop apps.  Second, Microsoft Office is only available as desktop apps.  But in each release going forward this will become less true.  A Metro File Explorer will become standard.  More and more system management will move to the new model.  And eventually Microsoft will remove the desktop from Windows RT.  Then it will be able to remove many pieces of legacy (including Win32), making Windows RT smaller, faster, and more secure (via smaller attack surface) than its Windows sibling.

Microsoft started the ball rolling with Windows RT on ARM because that was the most practical thing to do.  With ARM unable to run existing x86 apps, Microsoft had to decide if it would evangelize conversions of existing applications to ARM or put the energy into getting developers to write new Metro/Modern apps.  And without a library of Modern apps it was unlikely that any of the x86-oriented OEMs would create an x86 Windows RT system.   No rational amount of pricing difference on Microsoft's part would encourage an OEM to use an operating system with no applications when they could just as simply use one with a huge, if aging, library.  ARM thus became the obvious place to introduce Windows RT.

                As the library of applications in the Windows Store grows it becomes more and more likely that Microsoft will introduce Windows RT for x86 systems.  Will that happen in 2013?  By the end of 2013 the Windows Store will likely have in excess of 150,000 Apps.  Perhaps in excess of 200,000.  Assuming that the quality is there (meaning they are the apps people want and are equal to their iPad and Android equivalents) the market for systems with no need to run legacy desktop apps will have grown dramatically.  Microsoft, many of its OEMs, and Intel (of course) will want the option of using Clover Trail (and its follow-ons) in those systems.  So it is quite possible that Microsoft makes Windows RT available for Clover Trail-based systems in 2013, and it seems a certainty for 2014.

                As a side note this is something that Paul Thurrott will probably not be happy about.  Paul has called on Microsoft to use Clover Trail in its next generation of the Surface so that it would have the full Windows experience.  But I expect that if Microsoft did use Clover Trail in a Surface (as opposed to Surface Pro) replacement that system would still run Windows RT.  Sorry Paul :-)

                If Windows RT for x86 is speculative in 2013 here is something I think is a surer bet.  Windows RT will expand into a family that mirrors the editions of Windows.  I expect that in 2013 we will see a Windows RT Enterprise (and perhaps Pro as well) edition.  Why?  Well the current edition of Windows RT is missing some key functionality that would accelerate its adoption within Enterprises.  And I’m not even talking about UI or Windows RunTime changes that would increase the application space it was applicable to.  I’m talking purely about lower level operating system features.

                Being able to participate in a domain is part of Microsoft’s secret sauce for enterprises, and today Windows RT can’t do that.  A Windows RT Enterprise edition would bring the ability to join a domain, use DirectAccess, use BitLocker, fully participate in Microsoft’s management capabilities, etc.  Whereas the solutions introduced in 2012 are acceptable for BYOD situations and some limited application scenarios, an Enterprise edition would allow Windows RT systems to participate as full members of the enterprise computing environment.

Windows RT Enterprise will not allow side-loading of desktop applications, but it may allow side-loading of limited types of system software.  As great as DirectAccess is (and given my involvement in it I'm biased, but then I also lived with it as my "VPN" for a year so know how fantastic the user experience is), most enterprises use Cisco VPNs.  And while Windows RT is certainly adequately protected with Windows Defender, IE SmartScreen, etc., most enterprises will want at least the management capabilities of enterprise-oriented security products and probably the ability to use their corporate standard (i.e., Symantec, McAfee, etc.) products and infrastructure.  Unless Microsoft addresses these, adoption of Windows RT will be much slower than desired.

                And what about requirements for access to desktop applications on Windows RT systems?  Many, perhaps most, enterprises are fine with using VDI to allow users of these systems to access desktop applications.  Some are downright enthusiastic.  But many do not want that access occurring off their corporate network.  Hence the need for the ability to join a domain, and use DirectAccess or VPNs when users need remote access.  You then run VDI over the corporate network.

                Now we get to another wildcard in all of this, Office.  Today’s situation with Office being a desktop Win32 application on Windows RT, and only being available in the Home and Student edition, represents a major drag on Microsoft’s ability to move Windows RT forward.  Microsoft needs to either allow upgrade of the edition of Office on Windows RT to an Enterprise edition (including, for example, making Outlook available) or to move Office fully to Metro/Modern (likely in multiple editions).  They may do both given the time it could take to create a true Office RT.

An Office RT would benefit the entire Windows RT and Windows 8 market and is the logical direction for Office to go.  But I find it hard to believe they can get to full equivalence with the Win32 Office apps in a year, let alone in a traditional longer release cycle.  We'll see some, perhaps substantial, movement in this direction in 2013 but I don't know how far Microsoft will get.  In the meantime they may find it prudent to release Office 2013 Enterprise (standalone and/or as part of Office 365) for Windows RT systems.  However this rolls out, Microsoft will substantially improve the Office for Windows RT situation in 2013.

                Finally, let me reinforce a point I’ve blogged about before.  Microsoft is moving to annual (or more frequent) updates as a (at least unofficial) corporate standard for release cycles.  There may be exceptions from time to time, but I’d expect pretty much every actively developed product to have annual releases.  That means faster evolution in smaller chunks is the norm.  You don’t like how the Start Screen works today?  By the end of the year there will no doubt be improvements that address major complaints.  Windows RunTime missing an API that keeps you from creating a Metro/Modern version of your App?  You might have it later this year.  Can’t stand that the Share contract doesn’t work with Outlook?  Again, a solution may appear faster than Microsoft customers have ever imagined possible.

                2012 was an exciting year for Microsoft and its customers.  2013 may be even more exciting, and delightful.

But there are new contenders for the enterprise IT space that are based on neither of the earlier paradigms: not on the enterprise desktop and notebook (like Microsoft's Professional and especially Enterprise editions of Windows), which evolved from the PC platform, nor on the web-browser-based enterprise thin client, which evolved from the web platform (from the Java-like, Apex-code-programmable Force.com PaaS platform, usable along with standard HTML, JavaScript and CSS in the browser, to a wide range of "enterprise quality" JavaScript frameworks, some of which even have versions for mobile browsers).

A typical new contender, differing from both earlier platforms in that its very nature of cloud-based file sharing can best exploit the power of new mobile computing devices, is Box (service) [Wikipedia article, started on Nov 15, 2006]

Box (formerly Box.net) is an online file sharing and Cloud content management service for enterprise companies. … A mobile version of the service is available for Android, BlackBerry, iPhone, iPad, WebOS, and Windows Phone devices.[4]

                Products

                The core of the service is based around sharing, collaborating, and working with files that are uploaded to Box. Box offers 3 account types: Enterprise, Business and Personal.[12] Depending on the type of account, Box has a number of features such as unlimited storage, custom branding, administrative controls and 3rd party integrations with applications like Google apps, Gmail, NetSuite and Salesforce. The service also has a variety of social features such as discussions, groups and an update feed.

                Sharing

Box is a file sharing network, which saves and stores the information uploaded by the customer to its web site. Box has the full legal right to provide demographic information about its customers, sales and traffic to its partners and advertisers. Although the company does not have the right to give, sell, rent, share or trade any personal information uploaded to its web site by its customers unless consent is given by the user of an account, a third party may be able to view some information. Terms and policies have been set forth to protect the web site and the customers alike, and to establish a fully functioning, informative and well organized sharing network.[22]

With the user's consent, and if they so choose, they can share their private details with other customers, such as:[22]

                Your name, email address, photo and profile information

                Chosen files to share, where comments can be made and others can contact the user by email. People you invite as editors can also edit your shared files, upload documents and photos to your shared files, share those documents outside of Box, and give other users rights to view your shared files.[22]

                On the website its platform services for Enterprise IT are described in the following framework:

                Consolidate File Services: Consolidate All Your Content Services on Box

                Box – the single, secure solution for content access, sharing and collaboration – lets you replace a myriad of file transfer systems and unsecured, consumer-focused tools like YouSendIt and Dropbox. Bottom line: You reduce content silos, lower costs and give users the simplicity and functionality they want with the security IT requires. Learn more

                • Replace NFS, FTP, MFT and consumer file-sharing and sync tools
                • Streamline system administration and reporting
                • Reduce IT resource requirements while effortlessly meeting increasing storage needs

                Enterprise Mobility: Support Mobile Content Management

Box works with any mobile device, giving remote workers access to critical content they need to succeed. Simultaneously, Box features a comprehensive and sophisticated security suite – and its seamless integration with third-party mobile device management tools like Good Technology and MobileIron provides an additional layer of data protection. Learn More

                • Users get anywhere, anytime access to critical content; and that content is synced across all their devices
                • IT enjoys remote device management coupled with auto logout and locking while sanctioning the use of specific mobile devices and apps
                • IT also gains a new level of content visibility, with insight into how content is managed and accessed in the organisation – and beyond

                Cloud Content Management: Discover Content Management in the Cloud

                As a Web-based service, Box is up and running in minutes and deployed in days. There’s no hardware to maintain or software to update and it complements existing content management platforms.
                Learn more

                • Start working in the cloud immediately: no on-premise installation, provisioning, maintenance or DMZ setup
                • Enable employees to access and share enterprise content quickly and securely, both internally and with external partners and vendors
                • Significantly lower hardware and storage costs

                Security and Architecture: Ensure Your Corporate Information is Secure

                It’s true: Box is a leader in content management security and makes ongoing investments in the safety of our data centres and corporate operations. Box has been issued an SSAE 16 Type II report, and our solution also features Safe Harbor certification and provides easy-to-use configuration tools, so you can tailor Box to meet your security requirements. Learn More

                • Global permission controls and detailed audit trails
                • Full data encryption plus data centre backups and redundancy
                • Guaranteed 99.9% uptime

                The Box Platform: Extend Box With Our Platform and Integrations

                Box is more than just a Web application; our comprehensive yet flexible platform lets you easily integrate, extend and customise your cloud deployment. Connect Box to the leading SaaS applications you already use, integrate it into your IT infrastructure or build apps designed to do whatever your business needs. Learn more

                • Easily connect to other business applications like Salesforce, NetSuite and Google Apps
                • Extend Box to meet additional needs with our 120+ Box Apps including eFax, DocuSign, FedEx and mobile Box Apps like Quickoffice
                • Create custom mobile, Web and desktop applications powered by Box

                Professional Services: Deploy Easily With Professional Services

                Our Customer Success team offers a comprehensive range of professional and client support services, from end-user training to systems integration and performance tuning. Learn more

                • Content migration services transfer your existing data to Box quickly and securely
                • Custom implementation road maps streamline deployment across the enterprise
                • A dedicated Customer Success representative gives you the responsive, personalised support you deserve

                The current state was described in Box Platform: Announcing v2 API in GA and Year in Review [on box blog by Chris Yeh, VP of Platform, Dec 14, 2012]

                2012 has been an amazing ride for the Box platform, and I’m excited to announce that we’re ending the year on a high note with the general availability of the Box v2 API. First released back in April in beta, we’ve made tremendous strides to bring our partners, developers and customers a simple, elegant and intuitive API that will power the next generation of business collaboration.

Our v2 API represents a major step forward for Box. It is RESTful, implements the OAuth2 spec to standardize user authentication, has much improved error handling and is well documented. Our Platform Manager Peter Rexer has a deep dive into all the details of the v2 API here. We’re also introducing Box developer accounts, which offer developers access to all of Box’s enterprise features through both the Box web app and the API. In celebration of our new API, we’re offering 25GB of Box free for any developer account created before January 18, 2013.
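As a rough illustration of what "RESTful with OAuth2 authentication" means in practice, here is a minimal sketch of a Box v2 API request built with Python's standard library. The folder ID and access token are placeholders, and the exact endpoints and semantics should be taken from Box's own API documentation, not from this sketch:

```python
import urllib.request

API_BASE = "https://api.box.com/2.0"

def build_box_request(path: str, access_token: str) -> urllib.request.Request:
    """Build (but do not send) a Box v2 API request carrying an OAuth2 bearer token."""
    return urllib.request.Request(
        API_BASE + path,
        headers={"Authorization": "Bearer " + access_token},
    )

# Example: a request for metadata of the root folder (ID 0). The token below
# is a placeholder; a real one is obtained via Box's OAuth2 authorization flow.
req = build_box_request("/folders/0", "YOUR_ACCESS_TOKEN")
print(req.full_url)                     # https://api.box.com/2.0/folders/0
print(req.get_header("Authorization"))  # Bearer YOUR_ACCESS_TOKEN

# Actually sending it would be e.g.:
#   import json; data = json.load(urllib.request.urlopen(req))
```

The RESTful style is visible in the URL itself: the resource (folder 0) is addressed by path, and the OAuth2 access token travels in a standard Authorization header rather than in a proprietary envelope.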

                API Momentum in 2012

                Our new API is being launched at a time of tremendous platform growth for Box. In 2012, every API metric that we tracked grew significantly. Here’s just a sample of some of the massive traction we’re seeing with the Box API:

                • 129%: growth in third party developers using Box
                • 140%: growth in number of third party API calls per month
                • 133%: growth in apps in the Box Apps Marketplace
                • 200%: growth in number of weekly users of third party apps on Box

                Of course, we wouldn’t have seen such strong platform growth and API engagement without the efforts and work driven by the amazing Box platform team and our ecosystem of third party developers. We built industry-first products including Box OneCloud and Box Embed, travelled the world meeting amazing companies along the way and got together as a community to hack some pretty cool projects. Here’s a brief look back at an amazing 2012.

                Box OneCloud

                In April, we introduced Box OneCloud for iOS, the first mobile cloud for the enterprise. OneCloud helps you discover useful productivity apps that are deeply integrated with Box for productivity on common business tasks like document editing, PDF annotation, e-signature, etc. We launched on iOS with 50 apps and shortly thereafter brought this to Android. By year’s end we’ll have nearly 300 OneCloud app integration partners across both iOS and Android. 40% of all Box’s Fortune 500 customers are using Box OneCloud.

                Box Platform on the Road – New York & London

                In New York this spring, we announced our v2 API in beta, 100 new OneCloud apps and partnerships with General Assembly and TechStars. We welcomed over 650 attendees to Skylight West to hear from Box CEO Aaron Levie, Take Two Interactive CEO Strauss Zelnick and former Editor-in-Chief of Wired Chris Anderson. Later, everyone danced to cool tunes spun by Elijah Wood. Our friends in New York include the Bizodo team, which makes a great form-filling app that puts content into Box. We also hung out with the Handshake team, which created a rich order-taking app useful in many business and retail settings. When the Handshake logo appeared on our OneCloud billboard on the 101, they tweeted that it was the startup equivalent of your voice dropping. One of the most interesting things about New York is the concentration of enterprise-focused startups. For example, we’re really pleased to support Jonathan Lehr’s NY Enterprise Technology Meetup and Nick Gavronsky’s New York City Startup Weekend, which just occurred last weekend.

                In late August, we parachuted into the middle of Carnival week in London to talk to analysts, press, London-based startups and supporting government organizations. We hosted a developer meet-up at Shoreditch House and were awestruck by the energy in London, particularly in Tech City. We spent time in Google’s shared space in London, where we first met Ben Wirtz, CEO of Unifyo, which brings together multiple sources of customer data to provide enterprises with a singular view of customers. We wandered down to Chelsea to meet Will Lovelace, CEO of Datownia, a company that allows the easy translation of Excel spreadsheets into APIs for external consumption. And we visited the lofty digs of the Chelsea Apps Factory, a super high quality app consultancy and production company.

                It’s great to meet with so many wonderful people and even better when you can get together and build some really cool things.

                Box Hack Event

                Full disclosure, our first public hack event at Box HQ was not intended to be thematically linked to astronauts shooting each other, but that’s another story. At this event, called “Redefine Work,” 150 hackers stayed overnight creating more than 40 contest entries. Participating technology partners included TokBox, Firebase, Mashery, Twilio, Parse, Iron.io and SendGrid. Our winning hack, called OMGHelp, is an application that improves the technical support experience by allowing a customer to use a smartphone camera to show a technical support person what they’re doing. If you’re interested, here is a really nice recap of our event that was created by Mashery’s Neil Mansilla on Storify.

                We closed out our active year in October with…

                BoxWorks Dev Day

At BoxWorks, we announced a brand new technology that lets you quickly and easily extend the full Box experience anywhere you work. We call it Box Embed, our robust HTML5 framework for adding Box directly into the user interfaces of other applications. We launched with ten partners, including NetSuite, Jive, Concur, Oracle and others and we plan to continue adding to that number. Box Embed is particularly exciting to us because it’s one of the easiest ways for our partners to help make the content you have stored on Box accessible from anywhere.

                We also ran an un-conference-like Developer Day where hundreds of developers joined us to hear about the latest web development technologies and learn about enterprise development. We ran a well-attended startup camp with Boxers from various departments (design, sales, marketing in addition to developer evangelists) providing consulting. And we concluded with one of my favorite reporters/writers, Drew Olanoff of TechCrunch, interviewing one of my favorite “startup” CEOs, Jeff Lawson of Twilio, about the ways that developers should think about using APIs in their apps.

We were fortunate to have many of our platform partners join us at BoxWorks this year. Jesse Miller and the attachments.me team met with Box customers on the main show floor. David Klein and the SlideShark team presented in one of our sessions, as did Milind Gadekar from CloudOn.

                As you can see, we’ve had an amazing year. Thanks to all of our platform partners, big and small, for working with us. We look forward to reaching the next level in the new year.

                2013: Looking Forward

                As 2013 approaches, we’re working on making it even easier for developers to work with Box by focusing on our SDKs and other developer tools. We’re also excited to be building new platform products. On one front, we’re working on new developer-focused metadata tools. On another, we’re looking at allowing developers to hook into workflow products that will allow content to move through Box in various business flows.

                We’re sure that it will be a fun ride. Happy holidays to all and we’ll look forward to working with you in 2013!

Regarding the most demanding enterprise customers of Box.com, here are a few excerpts from Why Box.com is king of enterprise cloud storage [CNET, May 15, 2012]

                It may be known to some as the Dropbox-for-the-enterprise, but Box.com could be forgiven for insisting on its own identity.

                With more than 120,000 customers, including 82 percent of the Fortune 500, the company has made a name for itself as one of the leaders in the enterprise cloud storage and data management space. And though Box.com has Microsoft, and more recently, Google breathing down its neck, CEO Aaron Levie doesn’t appear the least bit nervous.

                That may be because the company has spent seven years building its business and solidifying a technology platform that gets more sophisticated — and cost-effective — every day. And as it has evolved into occupying a sizable Silicon Valley building, and employing more than 400 people, Box is now setting its sights on new businesses, including providing customers with the infrastructure on which to build cloud-based applications.

                Last week, the 27-year-old Levie sat down with CNET in a conference room at Box.com headquarters for an interview about the state of his company, the competitive landscape in the cloud storage and service space, and even the value of wearing a hoodie in a meeting with potential investors.

                How do you pitch Box.com to customers?
Levie: So many different kinds of businesses out there are all going through the exact same challenge and transition. It’s almost counterintuitive how predictable everybody’s situation is. Because whether you’re in construction or finance or real estate or consumer or media tech, every CIO we talk with, and these are companies that are 5,000, or 10,000, or 50,000 employees, they’re going through the same kind of transition and they’re at the same junctures as organizations, where they have decades of legacy technologies that they’re still managing. And it’s: how am I going to build an IT and technology strategy for the next five to ten years? And often, if you look at how vast the change has been in the landscape, the technology strategy they’re going to end up with is very different than the one they just came from.

                So what is Box.com?
                Levie: The vision of Box is to make it easy for customers to share, manage, and access information from anywhere. That means we need lots of different kinds of technologies to make that happen, including technology that will sit on your iPhone, your Mac, your Android device or your Blackberry. And we just announced something with Nokia with their Windows Phones and tablets. We’re a 100 percent enterprise-focused company, and all the technology we’re building goes towards asking how do we make it easier or more scalable, or simpler, and just a better way for businesses to share and manage and access this data.

                Any regrets on being 100 percent enterprise?
                Levie: God, no. Our thesis is basically that if you look at the cost of storage, it goes down roughly about 50 percent every 18 to 24 months. So our hard costs are about a tenth of what they were when we started the company seven years ago. And you can predict that in the next five to ten years, we’ll have another 10x improvement in storage density and performance. Eventually you’ll get to a point where storage is infinite and free, because companies like Google, and Microsoft, and Apple can essentially subsidize the cost of storage for their consumers because it’s so cheap and the value of keeping people locked into their system is so great for them. But in the enterprise, storage is critically important, so we had to give people lots of space, but what you pay for is the security, the platform value, the collaboration, and the integration into your enterprise, and this is where we can build differentiated technology instead of just being measured on how much storage we give you and at what price.
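Levie's "about a tenth of what they were" figure follows directly from the halving rate he cites. A quick check of the arithmetic in Python, over the seven years since Box started:

```python
# Storage cost halves every 18-24 months. After 7 years (84 months),
# what fraction of the original cost remains?
months = 7 * 12

for halving_period in (18, 24):
    fraction = 0.5 ** (months / halving_period)
    print(f"halving every {halving_period} months -> "
          f"{fraction:.3f} of the original cost (about 1/{1 / fraction:.0f})")
```

At the slower, 24-month end of the range the cost falls to roughly 1/11 of its starting value, matching Levie's "about a tenth"; at the 18-month end it would be closer to 1/25.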

                Who poses the biggest threat to your business?
                Levie: I would say Microsoft knows the most about the enterprise of any of these players. Google has a phenomenal brand, but it’s getting to be a broader brand, because it’s everything from your wallet to your car to your TV to your phone. The other thing that gets lost in the entire conversation because Google and Microsoft and Apple are so aggressive about this space, is the big transition companies are going to do from Oracle, IBM, EMC, and a lot of these traditional enterprise infrastructure players. Because as these dollars, and as your computing goes to the cloud, it moves away from implementing on-premise systems. It’s not going to be that Dropbox or Apple or Google loses. It’s going to be a lot of the legacy systems that we were spending lots of money on. As the $290 billion enterprise software market moves to the cloud, an entire new landscape of players and vendors are going to be the beneficiaries of that, unless these legacy vendors really get their act together.