Category Archives: Cloud Computing strategy
Less focus on feature phones while extending the smartphones effort: further readjustments at Nokia
June 25, 2012 12:14 pm
Update as of Aug 9, 2012: … Lumia direction … camera direction …
… Asha positioning vs. Lumia and Android:
[3:19] First of all what we’re working on with Windows Phone is to take it as low end price point as we possibly can. Having said that, the Nokia Asha devices have really been developed with the emerging market consumer in mind. We’ve brought a lot of smartphone like features to the user interface, as well as investing in making access to the Internet possible for consumers who have real affordability constraints, for data compression in our browser etc. We are working to continue to invest there so that Asha is a relevant competitor to the lowest end Android devices. [4:05]
see: The BGR Show – Nokia’s Smartphones Guru [iamOTHER YouTube channel, Aug 9, 2012]
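The "data compression in our browser" mentioned above refers to proxy-style browsers that compress pages server-side before sending them to the handset over a slow, metered link. As a rough illustration only (not Nokia's actual pipeline), even plain DEFLATE compression shrinks repetitive markup dramatically; the sample HTML below is invented for the sketch:

```python
# Illustrative only: how much a server-side proxy can shrink repetitive
# markup with standard DEFLATE compression before sending it to a phone.
import zlib

# Hypothetical, highly repetitive feature-phone page (invented sample).
page = ("<html><body>"
        + "<p>Repeated feature-phone friendly markup.</p>" * 200
        + "</body></html>").encode("utf-8")

compressed = zlib.compress(page, level=9)
ratio = len(compressed) / len(page)

print(f"original: {len(page)} bytes, compressed: {len(compressed)} bytes")
print(f"compressed size is {ratio:.1%} of the original")
```

Real proxy browsers go further (re-encoding images, stripping scripts), but the sketch shows why compression matters so much for "consumers who have real affordability constraints".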
- Speculations about Nokia
- Nokia and the Windows Phone Summit
- Nokia Q&A conference for financial analysts and investors, June 14, 2012
- Nokia announcements, June 14, 2012
- Scalado acquisition
- Asha Touch family of mobile devices
Closely related information:
– Windows Phone 8 software architecture vs. that of Windows Phone 7, 7.5 and the upcoming 7.8 [June 22, 2012]
– The new, high-volume market in China is ready to define the 2012 smartphone war [Jan 6 – Feb 17, 2012]
– Tech investment banking expertise to strengthen the unique value focus of growing the HTC brand and to achieve high growth again [April 18 – 25, 2012]
Other related information:
– The Where Platform from Nokia: a company move to taking data as a raw material to build products [April 7, 2012]
– Nokia under transition (as reported by the company) [March 11-30, 2012]
– Nokia’s strategy for “the next billion” based on software and web optimization with super low-cost 2.5/2.75G SoCs [Feb 14 – April 23, 2012]
– Nokia trying the first Lumia month in China with China Telecom exclusive [March 28, 2012]
– MWC 2012 day 1 news [Feb 27, 2012]: Samsung and Nokia [Feb 28 – March 1, 2012]
– China-based second-tier and white-boxed handset makers targeting the emerging markets [Feb 13, 2012]
– Nokia CEO: salespeople to deliver true WP7 retail experience supported by improved product management, marketing and accelerated global coverage with a full breadth of products [Jan 29, 2012]
– Nokia’s Lumia strategy is capitalizing on platform enhancement opportunities with location-based services, better photographic experience etc. [Jan 12 – April 27, 2012]
– Smarterphone end-to-end software solution for “the next billion” Nokia users [Jan 9, 2012]
– The precursor of 2012 smartphone war: Nokia Lumia vs. Samsung Omnia W in India [Jan 3 – 23, 2012]
– Be aware of ZTE et al. and white-box (Shanzhai) vendors: Wake up call now for Nokia, soon for Microsoft, Intel, RIM and even Apple! [Feb 21 – March 25, 2011]
Speculations about Nokia
A “short-term surprise” technology speculation: Nokia Air [s60betalabs YouTube channel, June 24, 2012]
A “short-term” investors’ speculation: Nokia Takeover Seen As Collapsing Shares Signal Bottom: Real M&A [Bloomberg, June 15, 2012]
… Nokia plunged 18 percent yesterday after forecasting a wider second-quarter operating loss from handsets and saying it will cut as many as 10,000 jobs as it cedes market share to Apple Inc. (AAPL)’s iPhone and Samsung Electronics Co. devices. After wiping out about $100 billion in market value, Espoo, Finland-based Nokia trades at a 38 percent discount to its net assets, the least expensive on record, according to data compiled by Bloomberg dating back to 1995.
…
Nokia started out as a wood-pulp and paper company in 1865 before expanding into rubber, electronics and eventually telecommunications. The company’s market capitalization, which was more than 300 billion euros in 2000, has tumbled more than 90 percent since the iPhone was introduced five years ago, valuing it at 6.8 billion euros ($8.6 billion) as of yesterday.
…
Nokia had $12.4 billion in cash and short-term investments as of March 31, topping its market value of $8.6 billion yesterday, the data show. After accounting for debt, Nokia’s net cash position of $5.9 billion is still the equivalent of 68 percent of its market capitalization.
“Close to half of the market cap is cash — that’s cheap no matter what’s going on,” Falcon Point’s Mahoney said. For private-equity firms, “it’s cheap enough. When you are at this type of level, you don’t even need to cut costs that much to get value out of the transaction.”
…
“Who would buy them at this point?” Lars Soederfjell, a Stockholm-based analyst with Bank of Aaland, said in a telephone interview. “You need to stabilize the business. There’s too much uncertainty. It’s more like buying a lottery ticket than anything else.”
Still, industry analysts at Gartner and Framingham, Massachusetts-based IDC say they expect the Windows Phone platform will make inroads as Microsoft develops it further.
Carolina Milanesi, an analyst at Gartner, said the firm forecasts Windows Phone will be the second-biggest smartphone ecosystem after Android in 2015. IDC expects Windows Phone to pass Apple’s iOS in 2016.
…
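The Bloomberg valuation figures quoted above are easy to cross-check. A quick sketch, using only the USD-billion figures stated in the article:

```python
# Cross-checking the Bloomberg figures quoted above (all in USD billions).
cash_and_short_term = 12.4   # cash and short-term investments, as of March 31
market_cap = 8.6             # Nokia's market value "yesterday"
net_cash = 5.9               # cash position after accounting for debt

implied_debt = cash_and_short_term - net_cash
share = net_cash / market_cap   # net cash as a share of market capitalization

print(f"implied debt: ${implied_debt:.1f}B")
# comes out near the "68 percent of its market capitalization" quoted
print(f"net cash / market cap: {share:.1%}")
```

This is the arithmetic behind Mahoney's "close to half of the market cap is cash" remark: the cash pile alone rivals what the whole company traded for.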
Nokia seen to be running out of money and time [Helsingin Sanomat, June 15, 2012]
A result warning and the decision to cut 10,000 jobs by the end of next year is a stark indication of how badly Nokia’s business activities and financial position have weakened this year. Furthermore, there are no indications of improvement any time soon.
The Finnish mobile telephone manufacturer expects its operating loss to increase in April-June, and the complicated turns of phrase in the result warning suggest that the spiral will not end in the third quarter either.
…
Nokia is cutting 3,700 jobs in Finland. A mobile phone factory in Salo, which employs 850 people, is to be shut down, and the research and development unit in Oulu is to be scaled back. In addition, a research and development unit in the German city of Ulm, which has 730 employees, is to be shut down.
Job cuts are also likely to affect locations in Finland other than Salo and Oulu, but Nokia gave no further details on the matter on Thursday.
The aim is to reduce the operational costs of the mobile phones unit to EUR 3 billion by the end of next year. In 2010 the costs were EUR 5.4 billion.…
By the end of March Nokia’s net assets (liquid assets minus debts with interest) had declined by more than EUR 2.1 billion. In the present quarter this trend has probably accelerated.
The depletion of assets and the significant reduction in sales could eventually lead to insolvency and possibly even bankruptcy. To avoid this, costs need to be reduced to better correspond to shrunken business activities.
After the reductions in jobs, Nokia has about 44,000 employees at work around the world [sans Nokia-Siemens obviously]. The last time that the company had such a small work force was in 1998. The serious problems are attributed to miscalculations by the management and the board of directors that have been made since 2005.
The company lost competitiveness because it held on too long to the antiquated Symbian operating system.
Nokia and the Windows Phone Summit
Closely related information:
– Windows Phone 8 software architecture vs. that of Windows Phone 7, 7.5 and the upcoming 7.8 [June 22, 2012]
Nokia at Windows Phone 8 Microsoft Dev Summit 2012 [Camb078 YouTube channel, June 21, 2012]
Nokia at the Windows Phone 8 unveiling [Nokia Conversations, June 20, 2012]
Today, Nokia’s Kevin Shields – who leads product development for Lumia smartphones – joined Microsoft representatives on stage at the Windows Phone Summit in San Francisco to outline how we’re working together to build a winning Windows Phone ecosystem and bring new experiences to Lumia phones.
Microsoft’s Terry Myerson, Kevin Gallo and Joe Belfiore started the morning by previewing Windows Phone 8, which will come out later this year, and unveiled new hardware specs like NFC, multi-core processors and improved screen resolutions.
With years of experience building NFC experiences on Symbian, the Nokia N9 and most recently the Nokia Lumia 610 NFC with Orange, we’re excited about what we can do with NFC across the whole of Windows Phone. Just take another look at the Nokia Play360 NFC speakers if you have any doubts.
Microsoft also revealed that Nokia would continue bringing unique innovations to Windows Phone 8 through our hardware, services and apps.
Nokia Lumia owners to get new Windows Phone 8 experience
But the news doesn’t stop there. For those who already have Lumia devices with Windows Phone 7.5, you’ll be able to update your phones with some of the new Windows Phone 8 features like the start screen, and download new apps from companies like Zynga, whose Words with Friends and Draw Something will be available in autumn.
Microsoft also announced today that the Windows Phone Marketplace has reached 100,000 apps, and with Windows Phone 8 sharing the Windows 8 core, millions of Windows developers will also be able to develop for the Windows Phone ecosystem. Lumia customers can expect thousands more apps to be introduced across the Windows Phone platform.
The announcements underline the progress Nokia is making with the Lumia family of smartphones and in building a winning ecosystem with Microsoft. They show how the Windows Phone experience continues to evolve at a faster pace than the competition and how Nokia’s continued investments in great location-based services benefit partners, developers and consumers.
New apps and functionality for Lumia owners
Finally, Kevin revealed that Nokia will deliver exclusive new Marketplace apps to existing Lumia customers, like Camera Extras, which brings new possibilities to your Lumia including panorama shots, a self-timer, Action Shot for capturing movement and Smart Group Shot for creating the perfect group shot from several different images; new features for Nokia Drive and Nokia Transport; and a series of updates like WiFi tethering and flip-to-silence.
To learn more about those updates and when you can expect to see the new apps in Marketplace, check out this dedicated post. There’s a lot of exciting new stuff coming to Lumia now and in the future and we’ll keep you posted.
Camera Extras for Nokia Lumia — More Options for Capturing Great Pictures [Nokia YouTube channel, June 20, 2012]
Nokia Drive for all Windows Phone 8 smartphones [Nokia Conversations, June 20, 2012]
Location-based services, as Nokia announced last week, are becoming more and more core to our strategy. We’re focusing on location-based services not just at Nokia, but by extending our services across many industries.
Today, we are making Nokia Drive available to other Windows Phone 8 partners to offer a turn-by-turn navigation experience for people in over 110 countries. Nokia Drive is one of the key experiences on Nokia Lumia smartphones, thanks to its ease of use and the experience that has gone into developing our location-based services. With Nokia Drive on Windows Phone 8, we will make drive navigation effortless.
Nokia Drive is one of the major apps built on Nokia’s location platform. Today, we are also making this platform and its unrivalled quality of data and richness of features available on Windows Phone 8 for all partners. This means that Nokia’s location platform will be central to the Windows Phone 8 experience, with the intention of developing smartphones that bring advanced location experiences. Windows Phone 8 partners and developers will be able to use our location assets to build location-based apps and experiences of superior quality.
Nokia has more quality location data than any smartphone manufacturer in the market. Our platform is the most advanced mobile location platform in the world because it offers true offline functionality (for the past six years), fast client-side map rendering (50 fps) and only requires 10 per cent of the bandwidth when compared to traditional server-side map platforms.
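The bandwidth claim can be made concrete with a toy comparison. All sizes below are invented for illustration; the architectural point is that a client-side renderer downloads compact vector data once and draws it locally, instead of fetching a pre-rendered bitmap tile for every view:

```python
# Toy illustration of client-side (vector) vs server-side (raster tile)
# map delivery. The byte sizes are assumptions chosen to mirror the
# "10 per cent of the bandwidth" figure quoted above, not measured data.
RASTER_TILE_BYTES = 25_000    # assumed size of one pre-rendered 256x256 tile
VECTOR_CHUNK_BYTES = 2_500    # assumed compressed vector data for the same area

def session_bytes(areas_viewed: int, bytes_per_area: int) -> int:
    """Total bytes downloaded over a browsing session of map areas."""
    return areas_viewed * bytes_per_area

raster = session_bytes(40, RASTER_TILE_BYTES)
vector = session_bytes(40, VECTOR_CHUNK_BYTES)

print(f"raster session: {raster} B, vector session: {vector} B")
print(f"vector uses {vector / raster:.0%} of the raster bandwidth")
```

The advantage compounds in practice: the same downloaded vector data can be re-rendered at different zoom levels and styles on the device, while a tile server must ship a fresh bitmap for each, which is also what makes "true offline functionality" tractable.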
The Nokia location platform is the biggest in the world:
- We have map data for more than 190 countries in more than 50 languages and navigation in more than 110 countries
- We collect information from Nokia Drive users and local authorities to provide traffic alerts in 26 countries, and also allow dynamic rerouting
- We have venue maps for over 5,000 shopping malls, train stations, airports, sports venues, etc. in 35 countries
- We support multi-modal routing: by car, on foot (including footpaths, shortcuts, etc. in over 400 cities) and by public transportation (over 100 cities)
Also, Nokia’s location data is not confined to smartphones and computers. Our data already powers four out of five cars with in-car navigation and our customer list includes top brands in the tech and auto industries: Bing, Yahoo!, BMW and Ford.
All of these elements are coming together to form the ultimate Where experience, connecting individuals with the world around them. At Nokia, we are working on constantly improving that experience, and striving to deliver novel and meaningful customer interactions with our location platform, content and apps.
*Image credits: Samsung and HTC respectively. This is a mockup of what Nokia Drive might look like on different Windows Phone 8 devices.
Nokia Q&A conference for financial analysts and investors, June 14, 2012
Nokia’s Elop: Lumia price cuts will help us take on Android in retail war [ZDNet, June 14, 2012]
Consumers do actually like Nokia’s Windows Phone Lumia device, but retailers are proving a harder nut to crack, according to Nokia chief exec Stephen Elop – as he set the scene for a price war with Android.
For the relatively small number of consumers the Lumia has reached in its short existence, the phone has been “well received”, Elop told analysts on a conference call Thursday.
…
With “specific support from Microsoft”, Nokia will aim to increase its appeal by pushing the price of the Lumia line below the entry-level Lumia 610 as part of its “low end price point war” with Android.
The real challenge, Elop said, is convincing retailers to bring the device out of the shadows.
“How do you get a preferred position on a shelf, how do you make sure the lights on your device are brighter than the ones from down the road?” asked Elop.
While the aim is to get more Lumia devices into the hands of consumers, Nokia will in fact narrow its direct sales and marketing efforts to select markets, palming off less significant ones to distributors to be managed through a central hub.
The US, UK, China and “certain” Asian and European nations would remain in focus with more effort placed on carrier partnerships, said Elop.
“We’re deliberately going through a cycle of concentrating on some markets at the expense of others.”
…
While mapping and navigation have become commoditised, Elop said, Nokia’s location-based services would give it an edge over rivals, pointing to Nokia City Lens, its augmented reality application, and its public transport mapping system.
Elop blamed Nokia’s inability to differentiate the Nokia experience on Windows Phone to date on its late entry to the platform, but added that Windows Phone 8 (Apollo) and Windows 8, both expected to be released by the end of summer or thereabouts, will be “key milestones” for Nokia.
Nokia to End “Meltemi” Effort for Low-End Smartphones [AllThingsD, June 14, 2012]
One of the casualties of Nokia’s latest cuts is Meltemi, the company’s effort to create a new Linux-based operating system for low-end smartphones.
…
Nokia never officially confirmed the existence of Meltemi, so it likewise isn’t confirming its demise. However, sources tell AllThingsD that the project has been shelved, though elements of it may live on in other efforts.
…
Asked about Meltemi on a conference call Thursday, Nokia CEO Stephen Elop said that he had never talked publicly about a development project by that name, but noted that Nokia was ending some development projects.
In its press release, Nokia also took pains to note its continued focus on its current low-end smartphone platforms, known as Series 30 and Series 40. Last week, the company announced new all-touch phones in its Series 40-based Asha line.
…
Richard Kerris, who helps lead Nokia’s efforts with developers, said that Thursday’s moves, while difficult, should allow the company to put more resources into its key projects.
“We have awesome products in the pipeline, and our developers are going to love them,” Kerris said.
Nokia is also exploring alternatives for another of its development environments, known as Qt, which today is used largely in embedded devices.
“We’re fans of Qt, and we’ll continue to support it in the near term, but are being open about looking for opportunities which may be best for this developer framework,” Kerris said.
More information:
– Nokia Meltemi survivors suggest axed OS was nearly ready [SlashGear, June 21, 2012]
– No Meltemi, what about Smarterphone? What is there beyond S40? What of Qt? [My Nokia Blog]:
According to The Register, the Smarterphone team will work on S40 instead.
– MediaTek lands 2.5G handset solution orders from Nokia, say sources [DIGITIMES, May 21, 2012]
MediaTek reportedly has landed orders for 2.5G handset solutions from Nokia with shipments to begin in the third quarter of 2012, according to industry sources. MediaTek declined to comment.
Given that global demand for 2.5G handset solutions still reaches one billion units a year, there is room for MediaTek to further expand sales in the segment although the company’s sales of 2.5G solutions have been turning weak recently, indicated the sources. MediaTek shipped 550 million 2.5G solutions in 2011.
With a revised goal of shipping 75 million 3G solutions in 2012, mostly to first-tier handset makers in China, MediaTek is expected to post strong revenue growth in the second half of the year, the sources noted.
Nokia announcements, June 14, 2012
- Nokia sharpens strategy and provides updates to its targets and outlook [Nokia press release, June 14, 2012]
- Nokia announces executive changes; renews Leadership Team [Nokia press release, June 14, 2012]
Nokia’s own diagnosis:
During the second quarter 2012, competitive industry dynamics are negatively affecting the Smart Devices business unit to a somewhat greater extent than previously expected. Furthermore, while visibility remains limited, Nokia expects competitive industry dynamics to continue to negatively impact Devices & Services in the third quarter 2012. Nokia now expects its non-IFRS Devices & Services operating margin in the second quarter 2012 to be below the first quarter 2012 level of negative 3.0%. This compares to the previous outlook of similar to or below the first quarter level of negative 3.0%.
Nokia’s strategic changes:
Nokia plans to:
- Invest strongly in products and experiences that make Lumia smartphones stand out and available to more consumers;
- Invest in location-based services as an area of competitive differentiation for Nokia products and extend its location-based platform to new industries; and
- Improve the competitiveness and profitability of its feature phone business.
Nokia City Lens for Nokia Lumia: Augmented Reality Browser (Beta) [Nokia YouTube channel, May 8, 2012]
In Smart Devices:
Nokia plans to extend its strategy by
- broadening the price range of Lumia; and
- continuing to differentiate with:
– the Windows Phone platform,
– new materials,
– new technologies and
– location-based services, including navigation and visual search applications such as the recently announced Nokia City Lens; and
- acquisition of assets from Sweden-based Scalado, which currently has imaging technology on more than 1 billion devices. This acquisition is aimed at strengthening Nokia’s imaging assets.
Scalado’s Vision [ScaladoInc YouTube channel, June 1, 2012]
In Mobile Phones:
Nokia aims to
- further develop its Series 40 and Series 30 devices, and
- invest in key feature phone technologies like the Nokia Browser, aiming to be the world’s most data efficient mobile browser. Early results of this innovation can be found in Nokia’s latest Asha feature phones which offer a full-touch screen experience at lower prices.
Additional reductions in Devices & Services:
Nokia plans to pursue a range of planned measures including:
- Reductions within certain research and development projects, resulting in the planned closure of its facilities in Ulm, Germany and Burnaby, Canada;
- Consolidation of certain manufacturing operations, resulting in the planned closure of its manufacturing facility in Salo, Finland. Research and Development efforts in Salo to continue;
- Focusing of marketing and sales activities, including prioritizing key markets;
- Streamlining of IT, corporate and support functions; and
- Reductions related to non-core assets, including possible divestments.
Nokia plans to reduce up to 10,000 positions globally by the end of 2013.
Taking into account these planned measures the company now targets to reduce its Devices & Services non-IFRS operating expenses to an annualized run rate of approximately EUR 3.0 billion by the end of 2013. This is an update to Nokia’s target to reduce Devices & Services non-IFRS operating expenses by more than EUR 1.0 billion for the full year 2013, compared to the full year 2010 Devices & Services non-IFRS operating expenses of EUR 5.35 billion. This means that in addition to the already achieved annualized run rate saving of approximately EUR 700 million at the end of first quarter 2012, the company targets to implement approximately EUR 1.6 billion of additional cost reductions by the end of 2013.
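The cost targets in the paragraph above are internally consistent, as a quick check of the EUR-billion figures from the release shows:

```python
# Sanity-checking the cost-reduction arithmetic in the press release
# (all figures in EUR billions, taken from the text above).
opex_2010 = 5.35          # full-year 2010 Devices & Services non-IFRS opex
target_run_rate = 3.0     # targeted annualized run rate by the end of 2013
achieved_saving = 0.7     # annualized run-rate saving achieved by end of Q1 2012

additional_needed = opex_2010 - achieved_saving - target_run_rate
# matches the "approximately EUR 1.6 billion of additional cost
# reductions" stated in the release
print(f"additional reductions still needed: ~EUR {additional_needed:.2f} billion")
```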
Joining the Nokia Leadership Team effective July 1, 2012:
- Juha Putkiranta as executive vice president of operations
In place of Niklas Savander, executive vice president of Markets who led Nokia’s sales, marketing, supply chain, manufacturing operations and information technology teams
Formerly, Putkiranta was senior vice president, supply chain. “Juha has demonstrated exceptionally strong leadership in leading our supply chain operations,” said Stephen Elop. “His breadth of experience at Nokia will help with our focus.”
- Timo Toikkanen as executive vice president of Mobile Phones
In place of Mary McDowell, executive vice president of Mobile Phones, who is leaving to pursue other opportunities outside of Nokia
Formerly, Toikkanen was vice president, business development, programs and special projects. “Timo is well known as an engaging leader with valuable business acumen and keen insights into delivering customer satisfaction,” said Stephen Elop. “These attributes will be key as we progress through our transformation.”
- Chris Weber as executive vice president of sales and marketing
In place of:
– Jerri DeVard, executive vice president and chief marketing officer [joining Nokia just in January 2011] who led Nokia’s marketing and brand management as a member of the Nokia Leadership Team, and who is leaving to pursue other opportunities outside of Nokia
– Niklas Savander, executive vice president of Markets who led Nokia’s sales, marketing, supply chain, manufacturing operations and information technology teams [Note that Savander had sales duties just for the last two months, taking over from the long-term Nokia veteran Colin Giles who decided “to leave the company to be closer to his family”]
Formerly, Weber was senior vice president, Markets, Americas. “Chris has made tremendous strides in kick-starting our re-entry into the US and his track record of driving results will serve Nokia well,” said Stephen Elop.
In addition to that there were two additional new appointments:
Tuula Rytila as senior vice president and chief marketing officer
In place of Jerri DeVard, executive vice president and chief marketing officer [joining Nokia just in January 2011] who is leaving to pursue other opportunities outside of Nokia. Rytila, who will report to Weber, was formerly senior vice president of portfolio and business management.
Susan Sheehan as senior vice president of communications
Sheehan, who reports to Elop, was formerly vice president of communications.
According to Helsingin Sanomat, on the second day of a meeting considering the options the executives each summed up their view in a single word:
Exhausting week for Nokia CEO
Difficult decisions made late Wednesday…
- love – “Love for design and love for future products”: Marko Ahtisaari, Design
- passion: Henri Tirri, Chief Technology Officer
- purpose: Michael Halbherr, Location and Commerce
- consumer: Chris Weber, Sales and Marketing
- consumer: Tuula Rytilä, Chief Marketing Officer
- commitment: Jo Harlow, Smartphones
- result: Timo Toikkanen, Mobile Phones
- result: Juha Äkräs, Human Resources
- success: Stephen Elop
Scalado acquisition
Scalado Remove – Capture a clear view [ScaladoInc YouTube channel, Feb 13, 2012]
Nokia to acquire developers, technologies and intellectual property for imaging from Scalado [Nokia press release, June 14, 2012]
Acquisition aimed at enhancing imaging experiences for Nokia Lumia devices
Espoo, Finland and Lund, Sweden: Nokia is announcing plans to acquire world-class imaging specialists as well as all technologies and intellectual property from Scalado AB.
“Nokia has been working with Scalado for more than ten years and they’ve contributed to many of our leading imaging applications,” said Jo Harlow, executive vice president, Smart Devices at Nokia. “This transaction would enable us to combine our leadership in camera devices with their expertise in imaging, helping people move beyond taking pictures to capturing moments and emotions and then reliving them in many different ways.”
The Lund site is planned to become a key site for Nokia’s imaging software for smartphones, in addition to Nokia’s existing locations in Espoo and Tampere, Finland.
“This is a great opportunity for many of our people to show their leadership in imaging and to continue to build its future,” said Håkan Persson, chief executive officer of Scalado AB. “Doing this as part of Nokia, already a leader in mobile imaging, will reinforce the strength of the technologies and competences developed at Scalado. We are very excited about this opportunity, which is a natural next step in our longstanding relationship with Nokia.”
The transaction, which is subject to customary closing conditions, is expected to close during the third quarter of 2012. The terms of the transaction are confidential.
Scalado Rewind – The perfect group shot 3 [ScaladoInc YouTube channel, July 15, 2011]
Nokia to acquire developers, technologies and intellectual property for imaging from Scalado [Scalado press release, June 14, 2012]
Lund, Sweden: Scalado is announcing plans under which Nokia would acquire world-class imaging specialists as well as all technologies and intellectual property from Scalado.
“This is a great opportunity for many of our people to show their leadership in imaging and to continue to build its future,” said Håkan Persson, chief executive officer of Scalado AB. “Doing this as part of Nokia, already a leader in mobile imaging, will reinforce the strength of the technologies and competences developed at Scalado. We are very excited about this opportunity, which is a natural next step in our longstanding relationship with Nokia.”
The Lund site is planned to become a key site for Nokia’s imaging software for smartphones.
Scalado AB will continue to exist. All present customer agreements and obligations will remain with Scalado AB. The main task of Scalado AB will be to continue working with its existing customers, honoring its delivery and support obligations and fulfilling any and all obligations to them.
“We are very pleased to have signed an agreement with Nokia,” said Anders Lidbeck, Chairman of the Board of Scalado AB. “We believe that this not only creates value for Scalado employees and shareholders but also ensures the future development and use of the Scalado heritage and technology.”
The transaction, which is subject to customary closing conditions, is expected to close during the third quarter of 2012. The terms of the transaction are confidential.
From Scalado’s “About Us” page:
Scalado is a world leader in the mobile imaging industry, with a long history of developing innovative platform-independent imaging solutions. Based on Scalado’s unique Random Access JPEG and more than 50 patented and patent-pending technologies, these innovations are currently being used by the world’s leading global telecom and platform players in over 1 billion mobile devices, a figure that is growing by over 500 million devices each year.
Since the start, the company’s mission has been to help camera and mobile phone manufacturers to significantly improve their imaging solutions in order to deliver an optimal end-user experience. Scalado helps companies shorten their time-to-market and differentiate their products through imaging solutions that offer top of the line advantages in editing, enhancing, viewing and sending images.
Scalado’s technology has gained worldwide recognition by all of the major players in the IT industry. The company already licenses its solutions to the top five tier 1 mobile phone manufacturers, top 10 ISP/Sensor companies, and most leading platform providers. As a result, when someone is using a camera phone, it’s very likely that Scalado’s patented imaging platform is onboard.
Through the years, the company has received several awards for innovation and excellence in export. In 2010, Scalado received the Hermes Export Award from the Swedish Chamber of Commerce on World Trade Day, and was listed as one of the fastest-growing companies in EMEA.
Scalado has offices in Sweden (HQ), Korea, China, Taiwan, Singapore and the United States. The company employs around 110 people, most of them working at its Swedish HQ in Lund. Scalado has been doubling its revenues year on year since 2007.
Introducing PhotoBeamer [ScaladoInc YouTube channel, May 29, 2012]
The first product from Scalado was the imaging web tool Scalado™ ImageZoom™. Today we still work with imaging, but have changed our focus to benefit the telecom industry and handheld devices.
2012
- Scalado™ now included in more than 1 billion mobile devices (over 500 million/year)
- Scalado™ Remove showcased live at Mobile World Congress in Barcelona
- Scalado™ will co-host 6Sight, Future of Imaging Conference, in June in New York City
2011
- Scalado™ now included in more than 900 million camera phones
- Scalado™ releases Camera & Album application
- Scalado™ expands into China and Singapore
- Scalado™ is one of the fastest growing companies in EMEA (Europe, Middle East and Africa) according to Deloitte
2010
- Scalado™ is awarded with the prestigious prize “Export Hermes”
- Scalado™ now included in more than 500 million camera phones
- Scalado™ relocates to new and larger head office
- Scalado™ employs a new CEO, Håkan Persson
- Scalado™ changes logotype and graphic profile
- Scalado™ expands into Japan and Taiwan
- Scalado™ starts collaboration with Qualcomm Incorporated
- Scalado™ opens regional office in USA
- Scalado™ nominated for the Great Export Prize 2010 (Sweden)
- Scalado is nominated at the “Mobile Gala” for “Innovative Technology of the Year”
2009
- Scalado™ shows its financial strength with highly-prized “triple-A” credit rating
- Scalado™ now included in more than 400 million camera phones
- Scalado™ featured in the 100 million club (published by Vision Mobile)
- Scalado™ nominated in GSMA’s Mobile Innovation Grand Prix EMEA Tournament
- Scalado™ expands into North America
- Red Herring names Scalado™ as winner in the Europe 100 awards
- Scalado™ and Kodak join forces to enhance next-generation imaging solutions
- Scalado™ optimizes new imaging solution for Windows Mobile® 6.5
- Scalado™ SpeedTags™ technology in Fujitsu Microelectronics’s products
2008
- Scalado™ launches the Scalado™ PhotoFlow™ application plug-in
- Scalado™ registers several patents for Scalado™ PhotoFlow™
- Scalado™ launches Camera Solution including Scalado™ SpeedTags™
- Scalado™ registers Scalado™ SpeedView™ as a trademark
- Scalado™ signs a Scalado™ CAPS™ licensing agreement with Symbian
2007
- Scalado™ signs major agreement on Scalado™ CAPS™ with Sony Ericsson and Motorola
- Scalado™ launches Scalado™ SpeedView™ and registers patents for its technology
- Scalado™ signs agreements with LG and HTC
2006
- Scalado™ signs a strategic agreement with Teleca Mobile for global sales and customer services
- Scalado™ launches a new release of its Camera Phone Solution, CAPS™ 3.1
- Scalado™ signs a global agreement with a Korean Mobile Phone manufacturer licensing Scalado™ CAPS™
- Scalado™ opens regional office in Korea
- Scalado™ signs an agreement with a leading Taiwan Mobile Phone manufacturer licensing Scalado™ CAPS™
2005
- Scalado™ celebrates its 5th Anniversary
- Scalado™ attracts a new investor: the Danish VC firm, IVS A/S
- Scalado™ delivers imaging solutions to several tier1 manufacturers of camera phones
- Anders Cedervall becomes chairman of the board of directors
- Scalado™ wins the Series 60 Challenge Awards with PhotoTwister™
2004
- Scalado™ licenses mobile imaging technology to Sony Ericsson
- Launch of imaging application Scalado™ PhotoTwister™
- Scalado™ attracts new investors
- Scalado™ gains access to advanced Nokia technical and marketing support
- Scalado™ changes strategy, and focuses only on imaging solutions for the telecom industry
- Scalado™ launches Scalado™ CAPS™: an enhanced imaging software platform for mobile phones
2003
- Launch of Scalado™ ImageZoom™ Generator
- Scalado™ signs OEM agreement with Axis Communications
- Scalado™ signs a license agreement with CNN.com
- Scalado™ signs an exclusive distribution agreement with Macromedia for ImagePilot™
- Launch of first camera phone application, AutoRama™
- Scalado™ enters partnership with Symbian
- Scalado™ signs a license agreement with Sony Ericsson
- Awarded a prize for Scalado ImageZoom™ by the European IST Prize evaluation group
2002
- Launch of ImageZoom™ 1.0
- Registration of the Scalado™ ImageZoom™ patent
2001
- Development of the web tool Scalado™ ImageZoom™
2000
- Scalado™ is founded by Fadi Abbas, Maziar Jahanshahi, Sami Niemi and Pierre Elzouki
Scalado SpeedTags Camera Solution Latest Version [scal99 YouTube channel, May 29, 2009]
Home /ABOUT US /Innovations
Over the years Scalado has developed a wide range of imaging technology solutions, all with the aim of making imaging fun, fast, and efficient. A selection of Scalado’s innovative technology is presented below.
Scalado Remove solves a common photographic problem: unwanted objects in captured images, such as people getting in the way of a shot. Remove detects and selects unwanted objects and removes them, either automatically or when the user touches the selections on the screen after the image is captured.
Scalado has continued its research around effective image capturing, and has recently released the Scalado Camera Framework, which makes it both effective and simple to achieve the same performance as SpeedTags on any platform.
This technology deals with a problem that photographers have been struggling with since the first group picture was taken: to get everyone to look their best at the same time. With Rewind, users can take the best facial expressions from several shots and combine them in the same picture.
TimeWarp takes a burst of images, where capturing starts even before the user presses the button. The user can then decide which picture to use by browsing back and forth in time.
2008–1st zero shutter lag capturing device
Zero Shutter Lag is the only software-based innovation that enables the user to capture photos instantly. It is the first of many innovations based on the SpeedTags technology; its follow-on products include Rewind, Shot-to-shot, Burst, HDR and several others.
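Pre-capture buffering of this kind can be sketched as a fixed-size ring buffer that the camera pipeline fills continuously, so that a "capture" simply reads frames that already exist. The sketch below is only illustrative, not Scalado's implementation; the class and method names are invented for the example.

```python
from collections import deque

class FrameRingBuffer:
    """Keeps the most recent `capacity` preview frames so that a shutter
    press can return frames captured before the press (zero shutter lag)."""

    def __init__(self, capacity=8):
        # A deque with maxlen drops the oldest frame automatically.
        self.frames = deque(maxlen=capacity)

    def on_new_frame(self, frame):
        # Called by the camera pipeline for every preview frame.
        self.frames.append(frame)

    def capture(self):
        # Shutter press: no sensor round-trip needed; the frames are
        # already buffered. Oldest first, newest last.
        return list(self.frames)

# Simulate a preview stream of 12 frames with an 8-frame buffer:
buf = FrameRingBuffer(capacity=8)
for i in range(12):
    buf.on_new_frame(f"frame{i}")

shots = buf.capture()  # holds frame4 .. frame11
```

A TimeWarp-style interface would then let the user browse `shots` back and forth in time and keep the frame they prefer.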
2007–SpeedTags: Revolutionizing capturing
SpeedTags is one of the innovations that really made a difference in the mobile camera industry. Its introduction changed the way images are captured on a mobile device and greatly enhanced the user experience and quality of digital imaging. The top 10 world players in the sensor/ISP industries already integrate SpeedTags in their devices, and the technology is now available in 450 million new sensors and SoCs (systems-on-a-chip) every year.
2005–The 1st super-fast album viewing for mobile phones
Viewing images on a phone has never been easy, but in 2005 Scalado introduced a technology that made it faster and easier to view, search and organize images. The market was impressed and asked the question that Scalado still hears frequently: how can you do this on a mobile device? This early technology has been developed further, and more advanced solutions have been released over the years.
2004–The 1st advanced mobile editor
This product is unique; not only because it was the first advanced image editor on a mobile phone, but also because it was the first time that a mobile handset manufacturer used imaging in its marketing campaigns. During the campaign for N90, Nokia marketed the phone and the image editor using the sentence: “Editing on-the-fly? I could not believe it either”.
2004–CAPS™–Making imaging faster and more efficient
CAPS™ is Scalado’s flagship product and with its unique features, it has made imaging in mobile devices faster and more efficient since its launch in 2004. CAPS™ is a software development kit that enables developers to produce imaging solutions that are extremely CPU and memory efficient and that drastically decrease processing time when managing multi-megapixel images.
2003–The 1st panoramic images on a phone
Scalado created the world’s first imaging technology for capturing and viewing panoramic images on a mobile phone. It is based on random access technology, and Scalado integrated this tool into Sony Ericsson’s T610.
Scalado has several patents for random access technology. The best known is Random Access JPEG (RAJPEG), which is a very effective way to process images. RAJPEG saves a huge amount of working memory and increases performance significantly. It enables an astonishing user experience on any device, independent of the existing memory and CPU performance.
This was Scalado’s first innovation. Regardless of the size of the image, ImageZoom makes it possible for the user to access – using any internet connection – and zoom into a specific part (or parts) of a megapixel image, without having to download the entire image.
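The core idea behind ImageZoom, fetching only the part of a large image that the user is actually looking at, can be illustrated with a simple tile calculation. This is a hypothetical sketch (the 256-pixel tile size and function name are assumptions, not details of ImageZoom): the client computes which tiles cover the current viewport and requests only those from the server, instead of downloading the whole megapixel image.

```python
TILE = 256  # assumed tile edge length in pixels

def tiles_for_viewport(x, y, w, h, tile=TILE):
    """Return (col, row) indices of the tiles covering a viewport of
    size w x h whose top-left corner is at pixel (x, y)."""
    cols = range(x // tile, (x + w - 1) // tile + 1)
    rows = range(y // tile, (y + h - 1) // tile + 1)
    return [(c, r) for r in rows for c in cols]

# Zooming into a 300x200 region at (1000, 500) of a large image
# requires only 6 tiles, regardless of the full image's size:
needed = tiles_for_viewport(1000, 500, 300, 200)
```

The amount of data transferred then depends on the viewport size, not on the size of the source image, which is what makes zooming over any internet connection practical.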
Asha Touch family of mobile devices
Nokia Asha 311: fun, fast and always connected [Nokia YouTube channel, June 5, 2012]
Nokia accelerates the journey to mobile internet with the introduction of Asha Touch device range [Nokia press release, June 6, 2012] [3″ WQVGA, i.e. 240×400]
Fun, colorful range of touch screen phones will bring fast mobile web browsing, social networks and gaming to millions
Bangkok, Thailand – Nokia has today taken another step towards connecting the next billion consumers by unveiling the Asha Touch family of mobile devices, taking the full touch experience to new price points. The three new phone models – the Nokia Asha 305, Nokia Asha 306 and Nokia Asha 311 – further expand the successful Asha family, first introduced in October 2011. Today, there are 10 Asha devices available in more than 130 markets, providing young, social consumers with a choice of phones to match their own lifestyle.
These latest phones have been designed to provide an incredibly rich, smartphone-like experience to consumers who want to be set free from excessive data consumption costs and short battery life. The Nokia Asha 305, Asha 306 and Asha 311 offer a new, fully re-designed touch user interface, combining the proven ease-of-use from Nokia’s heritage with digital design innovations specifically fit for the purpose.
The beautifully crafted Nokia Asha 311 is a fast and fluid 3.5G capacitive touchscreen [3″ WQVGA, i.e. 240×400] device, powered by a 1GHz processor to provide a great internet experience. The bright and edgy Nokia Asha 305 is a fun and affordable phone, featuring the exclusive Easy Swap dual SIM. Its sister, the Nokia Asha 306, is a single SIM model, and becomes Nokia’s most affordable Wi-Fi handset to date.
“By introducing the Asha Touch phones to the market, we’re accelerating our commitment to connect the next billion consumers,” said Mary T. McDowell, Nokia’s executive vice president for Mobile Phones. “These phones deliver on what young, urban people value most — a great-looking device; and an intuitive and affordable experience for connecting to the internet, to their friends, and to a world of entertainment, web apps and content.”
Great for fast, affordable mobile internet and gaming entertainment
The new devices take full advantage of the Nokia Browser 2.0, a major recent update which uses Nokia’s cloud technology to reduce data consumption by up to 90%, meaning that consumers can enjoy faster and cheaper internet access. Web sites load up to three times faster in comparison to devices without cloud-accelerated browsing, making it simple for users to find and select from more than 10,000 web apps available for download. These web apps deliver a richer and more interactive consumer experience whilst using less data than a stand-alone internet-connected app.
Consumers can easily stay connected with friends and family at the touch of a button as well as share files and links across their social networks. Furthermore, the Nokia Browser’s Download Manager feature helps consumers to manage external content easily, saving music, video or pictures on a memory card, while surfing the internet.
The Asha family is also getting positive support from developers and consumers. Nokia Store has just broken the 5 billion downloads landmark. From January to April, 42% of all content downloaded from Nokia Store was delivered to Asha and other Nokia devices based on the Java ecosystem. Just one year ago, that number was 10%. Also, there are 410 Nokia developers with apps which have achieved more than 1 million downloads. India Games and Pico Brothers just passed 100 million.
As well as providing a great, social online experience, the Nokia Asha 305, Asha 306 and Asha 311 have been created with entertainment in mind. All users will receive an exclusive gift of 40 EA games to download for free* and keep forever. These games range across action, arcade and sports, and include titles such as Tetris®, Bejeweled®, Need for Speed™ The Run and EA SPORTS™ FIFA 12. The Nokia Asha 311 also comes with 15 levels of Angry Birds pre-loaded onto the phone, perfect for making the most of the touchscreen and 1GHz processor.
“Nokia is taking another interesting step forward in connecting consumers to the Internet, seeking to improve their experience through a new touch user-interface that is allowing the company to compete in new mass-market price bands. The mass-market is a competitive segment, but we believe Nokia’s upgraded Asha portfolio has included an attractive package that can enable consumers to have lower running costs, taking advantage of things like its compressed browser and a long-life battery”, says Neil Mawston, Executive Director of devices research at Strategy Analytics. “It is also interesting to see how Nokia is promoting its Asha strategy with global launches taking place in important high-growth markets such as Asia. Nokia resonates well there and the response from local consumers is likely to be positive”.
Product details
The Nokia Asha 311 is a colourful, compact touch screen device that comes with all the features you’d expect for a fun and easy mobile experience. It boasts a bright and colourful, scratch-resistant capacitive glass screen with polarization filters, ensuring users get the best experience from the unique and visually entertaining user interface. The Nokia Asha 311 also features a 3.2MP camera and pre-installed Nokia Maps, in addition to the 15-level pre-bundled version of Angry Birds.
The pre-loaded social client makes accessing Facebook, Twitter and many other global social networks simple, while Nokia Browser makes using mobile internet fast and affordable. It also includes the most popular messaging services. “WhatsApp has a clear vision of creating a reliable and easy to use cross-platform messaging application that enables people to stay in touch with their family and friends from all around the world,” said Brian Acton, Co-Founder of WhatsApp Inc. “By partnering with Nokia, whose worldwide reach in mobile is well established, WhatsApp becoming available for the Asha Touch devices will enable us to further realize our core mission”.
The Nokia Asha 305 is a fun and entertaining Easy Swap dual SIM phone, helping users make the most of their phone while retaining control of their costs. The phone features a bright and colorful 3″ WQVGA resistive touch screen along with Bluetooth and Dual Band connectivity. Forty EA games are available for download with every phone as well as a 2MP camera, Nokia Maps and the revolutionary Nokia Browser which helps significantly lower data costs.
The Nokia Asha 306 is the sister device to the Nokia Asha 305. Along with all the great features that come with its sister, such as the bright and colorful 3″ WQVGA resistive touch screen and 40 EA games for download – a Nokia exclusive offer – the Nokia Asha 306 also provides WLAN, enabling users to stay connected while on the move. It also supports video streaming through both GPRS and WLAN, meaning this handset truly is a fun way to stay in touch.
The estimated retail price for Nokia Asha 305 is EUR 63 [US$ 79] and it’s expected to start shipping in the second quarter of 2012. The estimated retail price for Nokia Asha 306 is EUR 68 [US$ 85]. The Nokia Asha 311 has an estimated retail price of EUR 92 [US$ 115]. Both devices are expected to start shipping in the third quarter of 2012. Above mentioned prices exclude taxes and subsidies.
The new devices images are available at Nokia.com/press.
*Data costs may apply.
Nokia Asha 306: discover a fun way to stay in touch [Nokia YouTube channel, June 5, 2012]
Have a Touch: Nokia’s new Asha phones [Nokia Conversations blog, June 6, 2012]
A great new touch screen experience, fast web browsing, games and social networks lie at the heart of three new mobile phones being launched today by Nokia.
The Nokia Asha 305, Nokia Asha 306 and Nokia Asha 311 are a colourful range of mobile phones designed for young, urban and social people to get online faster, better and cheaper.
The devices will run on Asha Touch, which is a new, fun and playful touch screen interface that builds on Nokia’s swipe heritage.
Asha Touch will provide aspirational young people a first ‘smartphone-like’ experience, for example, through the notification bar and the 10,000 web applications that are available. This is on top of more than 25,000 regular apps already in the Nokia Store.
Nokia Browser 2.0
All the phones also feature a major update of the Nokia Browser, which uses cloud technology to reduce data usage by up to 90%. The benefits are numerous: web pages will load faster, battery life is improved and mobile Internet access becomes much more affordable.
Keeping in touch with friends on social networks is also central to the three new members of Nokia’s Asha family. They all come pre-loaded with applications for Facebook and Twitter, and there is also email and instant messaging.
People who want great games will not be disappointed either. Each phone will come with a gift of 40 free games from EA to download and keep forever. The games will include Tetris®, Bejeweled®, Need for Speed™ The Run and EA SPORTS™ FIFA 12.
So, now we know that the Nokia Asha 305, Nokia Asha 306 and Nokia Asha 311 come with great features and dozens of free games.
What are the differences between the phones themselves?
The Nokia Asha 305 and the Nokia Asha 306:
- 3.0” WQVGA resistive touch screen
- 2 MP camera
- Music player and FM radio
- Built-in speaker
- Plug and Play, easy PC connection and file transfer
- GPRS/EDGE connectivity
- Nokia Maps and Nokia Life (in selected markets)
- Colours: Silver White, Red, Mid Blue and Dark Grey (varies by market)
In addition, the Nokia Asha 305 will benefit from Nokia’s Easy Swap Dual SIM technology, which allows SIM cards to be swapped without opening up or turning off the phone. This is useful for storing different numbers on your SIMs and for taking advantage of different operator rates.
The unique feature of the Nokia Asha 306 is its Wi-Fi capability. Indeed, it is set to be Nokia’s most affordable Wi-Fi handset.
The Nokia Asha 311
- 3.0” scratch resistant, capacitive glass screen
- Polarised display filters for better usability in direct sunlight
- WLAN
- 3.2 MP camera
- 1GHz processor
- Music Player, FM Radio and Internet Radio
- Plug and Play, easy PC connection and file transfer
- HSPA connectivity
- Nokia Maps and Nokia Life (in selected markets)
- Colours: Dark Grey, Rose Red, Blue, Brown and Sand White (colours will vary by market)
The Nokia Asha 311 … is a 3.5G mobile phone powered by a 1GHz processor to make for a speedy online and gaming experience. As well as the 40 free EA games, the Nokia Asha 311 will also include 15 preloaded levels of Angry Birds for you to enjoy.
Taken together, these new mobile phones – the Nokia Asha 305, Nokia Asha 306 and the Nokia Asha 311 – are another big step in the quest to connect the next billion.
For approximate costs, before local taxes or operator subsidies:
Nokia Asha 305 – 63 euros / 85 USD, available in Q2 2012
Nokia Asha 306 – 68 euros / 93 USD, available in Q3 2012
Nokia Asha 311 – 92 euros / 121 USD, available in Q3 2012
Nokia Asha 305 will be available in the second quarter of 2012. Nokia Asha 306 and Nokia Asha 311 are arriving in the third quarter of 2012.
Infographic: How Nokia Browser saves you time and money [Nokia Conversations blog, June 7, 2012]
Searching for content on the Internet from your mobile phone is now faster and cheaper, with the Nokia Browser.
It uses Nokia’s unique compression technology, which means the pages that you visit are reduced by up to 90%, making surfing the Web faster, and more importantly, cheaper.
Check out our Nokia Browser infographic for more reasons as to why you should all be using Nokia Browser.
Initially, Nokia Browser was created to help the next billion internet users connect for the first time to the Net without compromises.
While it’s still vitally important for people in emerging markets to have a great, affordable browsing experience, Nokia Browser is also great for everybody else too, proving to be up to three times faster than other browsers.
As you can see by the infographic, this means that when you use your Nokia Browser, you would be saving a day every year. This has to be good news for everyone.
To save yourself some money and browsing time, be sure to check out the latest Nokia Browser 2.0, which comes preloaded on Asha phones such as the recently launched Nokia Asha 305, Asha 306 and Asha 311. Nokia Browser is available in 87 languages and in 250 countries.
If you have a Series 40 phone you can update to the latest version of the Browser. Check it out!
Nokia Browser 2.0 update available now [Nokia Conversations blog, April 23, 2012]
Getting online fast, and affordably, is crucial for Internet users everywhere. Now that experience is about to get even faster and easier with an update for all existing Nokia Browser users, covering phones across the Nokia Asha range and Series 40 devices.
We know Nokia Browser is often the first, and main, way of accessing the Internet for millions of you in dynamic, fast-growing parts of the world. Since the Browser launched last year you’ve been able to access all the information you need, without the headache of worrying about your data bill.
Nokia Browser condenses data by up to 90%. That makes loading web sites faster, and cheaper – in fact, our cloud-accelerated browsing makes loading web sites up to three times faster. If you’re on a pay-per-use contract you’ll enjoy cheaper browsing, or if you’re on an operator data plan you’ll be able to do more web surfing without exceeding your monthly usage limits.
Download Manager
As well as needing less data to show the same web pages, you also want to do different things at the same time. We know you are busy. Using the new Download Manager you can save music, video or pictures on a memory card while you’re surfing the Internet.
The update also includes a host of new features to make searching, discovering and sharing content even easier…
Better searching and sharing
The Browser now has a new, more intuitive, user interface with one-click access to top local sites from the start page. A new feature enables multitasking while browsing, meaning that you can switch between text messages and the web.
Nokia Browser 2.0 makes it even easier to share content across social media: You can post any page URL via Facebook or Twitter from within the browser, including a comment directly from the options menu. If you’re in China, you’ll be able to do the same for Sina Weibo and RenRen.
The Browser makes it simple to find, install and use Web Apps, which provide you with a more desktop-like internet experience. Launched in mid-2011, the Nokia Browser is the first browser of its kind to support Web Apps, and now boasts a catalogue of more than 10,000 apps. Nokia Series 40 users have downloaded more than 35 million Web Apps in total, with the most-downloaded app – ‘Free Wi-Fi Locator’ – having been downloaded more than 2 million times alone.
The update supports all forms of Nokia Series 40: Touch, QWERTY and Non-Touch, including the Nokia Asha range, as well as popular devices such as the Nokia C3-00, Nokia C2-03 and Nokia X3-02. The update will be pre-loaded on some current and all future Nokia Series 40 devices, while for existing users the update arrives as a free, optional over-the-air download. New users can download it from the Nokia Store.
image credit: webwizzard
Nokia makes internet access faster and easier with new browser for Series 40 devices [Nokia press release, April 23, 2012]
– Nokia Browser 2.0 delivers enhanced speeds and a new user interface for a faster, better way to explore the web
– Powered by cloud-based servers, it delivers accelerated browsing and reduces data consumption by up to 90%, without compromising the internet experience
– Web apps from the expanding catalog are easier than ever to explore and install right in the browser
Espoo, Finland – Nokia has today announced the availability of Nokia Browser 2.0, a major update dedicated to Nokia Series 40 devices. The new version reduces data consumption by up to 90%, meaning that consumers can enjoy faster and cheaper internet access. Web sites load up to three times faster in comparison to devices without cloud-accelerated browsing, and consumers will also benefit from a number of other enhanced capabilities.
From the first look, consumers are easily able to discover new web content and enjoy one-click access to top, local sites via the Nokia Browser’s inviting and intuitive start page. We have optimized the browser to enable users to easily stay connected with friends and family at the touch of a button as well as to share files and links across social networks. The new and improved Download Manager helps consumers to manage external content easily, saving music, video or pictures on a memory card, while surfing the internet.
Free Wi-Fi Locator – a smartphone-like web app on an Asha device – consumers don’t have to compromise
An app that showcases many features of the platform is this Movie Review app.
The browser includes a revamped, modern user experience that makes it simple to find, install and use interesting web apps that offer a richer, more desktop-like internet experience. Launched in mid-2011, the Nokia Browser is the first browser of its kind to support web apps, and now boasts a catalogue of more than 10,000 of the latest apps. Several publishers have experienced over a million downloads in a matter of months, demonstrating strong consumer demand.
With this update, developers will find new monetization capabilities, more extensive user interface options for their web apps and productivity improvements for Nokia Web Tools so they can continue delivering engaging, connected experiences to the ‘Next Billion’ consumers.
The update supports all forms of Series 40: Touch, QWERTY and Non-Touch, including the Nokia Asha range, as well as popular devices such as the Nokia C3-00, Nokia C2-03 and Nokia X3-02. The update will be pre-loaded on some current and all future Nokia Series 40 devices, while for existing users the update arrives as a free, optional over-the-air download. New users can download it from the Nokia Store. The browser is available in 87 languages in over 200 countries and territories.
Nokia Browser 2.0 makes use of cloud-based servers which adapt standard web pages so that they perform better on Nokia Series 40 devices. Since web pages are compressed and cached in the cloud, end users can access web sites in a manner which is faster and requires significantly less data to be sent over their mobile network. For pay-per-use contracts this will result in more cost-effective browsing, while users on an operator data plan will be able to do more web surfing without exceeding their monthly usage limits.
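The proxy pattern described above can be sketched in a few lines: the cloud server fetches a page once, compresses it, caches the compressed copy, and serves that much smaller payload to the handset. This is a minimal illustration using plain zlib compression and invented function names, not Nokia's actual pipeline (which also adapts pages for Series 40 rendering).

```python
import zlib

cache = {}  # compressed pages, keyed by URL, held in the "cloud"

def fetch_compressed(url, fetch_origin):
    """Return the compressed payload for `url`, fetching and compressing
    the origin page only on a cache miss."""
    if url not in cache:
        page = fetch_origin(url)  # full page from the origin web server
        cache[url] = zlib.compress(page.encode(), level=9)
    return cache[url]

def origin(url):
    # Stand-in for an origin fetch; repetitive markup compresses well.
    return "<div class='headline'>breaking news</div>" * 500

payload = fetch_compressed("http://example.com/news", origin)
full_size = len(origin("http://example.com/news").encode())
saving = 1 - len(payload) / full_size  # fraction of data never sent to the phone
```

Because the compressed copy is cached server-side, repeat visits by any handset skip the origin fetch entirely, which is where both the speed-up and the battery savings come from in this model.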
“With our new version, we’ve created a newer, faster, better browsing experience. As many consumers around the world will experience the internet for the first time through a mobile phone, this is a great step towards our goal to connect the ‘Next Billion’,” explains Dieter May, senior vice president of mobile phones services, Nokia.
New in the Nokia Browser 2.0
- Faster browsing with speed improvements throughout the experience.
- Easier access to new and popular Web apps to enable a richer and more engaging internet experience.
- New, intuitive user interface offers one-click access to search, the most popular content and the most valuable features.
- Media handling enhancements provide an easier way to enjoy video, audio and images. Users can download in background mode while continuing to browse the web or queue downloads for later when performance or rates are better. Downloads can be saved to memory cards or phone memory for later offline viewing or listening.
- One-click share on Social Networks by remembering Facebook and/or Twitter login to easily share any page URL and comments from your browser.
Developers can find out more about how the updated browser will enable them to build rich standards-based web apps at: http://www.developer.nokia.com/Develop/Series_40/Series_40_web_apps/.
Consumers can download the Nokia Browser 2.0 at: http://store.nokia.com/content/51924
Giving up the total OEM reliance strategy: the Microsoft Surface tablet
June 19, 2012 3:09 pm / 11 Comments on Giving up the total OEM reliance strategy: the Microsoft Surface tablet
Follow ups:
– Microsoft Surface: its premium quality/price vs. even iPad3 [Oct 26, 2012]
– Microsoft Surface: First media reflections after the New-York press launch [Oct 26, 2012]
Updates #2: As a result of this sudden change of direction 9 months ago, the OEMs that previously cooperated closely with Microsoft are now (March ’13) working with the company only in the most cautious way:
Brand vendors cautious about Microsoft when it comes to hardware design [DIGITIMES, March 25, 2013]
Notebook brand vendors have turned cautious about revealing their new products’ industrial designs for next-generation Windows as they are concerned that Microsoft may use their designs for the benefit of its new Surface products, according to sources from the upstream supply chain.
The sources noted that the brand vendors have already lost their trust in Microsoft and the software giant’s strategy of pushing Surface tablets is starting to impact itself.
Although Microsoft has only sold about 1.5 million Surface tablets so far, the company continues to expand into the retail channel with its branded products and has even established an online store for ordering the devices.
To avoid design leakage, many brand vendors have hidden their important designs and will only showcase prototypes of their new mobile devices during Computex 2013 to minimize the risk.
China market: Microsoft to launch Surface Pro, say Taiwan makers [DIGITIMES, March 29, 2013]
Microsoft, following the launch of the 10.6-inch Surface RT in the China market, will launch the 10.6-inch Windows 8 Surface Pro there on April 2 at a retail price of CNY6,500 (US$1,045) for the 64GB version and CNY7,300 for 128GB, according to sources with Taiwan’s supply chain.
Surface RT is priced at CNY3,688-4,488 plus CNY800 for a touch cover, the source indicated.
According to previous estimates by market observers, Surface RT and Surface Pro shipments to the global market should have reached one million units and 500,000 units respectively since their launch, but the actual volume for the two models so far is estimated at about one million units in total, the sources said.
Viewing that Microsoft has not placed additional orders for Surface RT, an estimated one million units of Surface RT remain in the inventory, the sources indicated.
Microsoft has talked with partners about developing second-generation Surface models, but those partners have generally been conservative, the sources noted, adding that Microsoft is inviting notebook and chip vendors to co-develop tablets based on Windows-ARM platform but those vendors have been reluctant.
Updates #1:
Microsoft Surface : Assembly in China [NIDA ISM YouTube channel, June 23, 2012]
Annual Report for the Fiscal Year Ended June 30, 2012 [Microsoft Corporation, July 19, 2012]
ITEM 1A. RISK FACTORS [p. 14]… our Surface devices will compete with products made by our OEM partners, which may affect their commitment to our platform. …
Microsoft’s radical new business plan is hidden in plain sight [Ed Bott on ZDNet, July 30, 2012]
Microsoft is reimagining its entire business model, and they’ve laid out the details for anyone to inspect. You just have to read between the boilerplate sections in the company’s most recent 10-K.
…
In the Sinofsky regime, Microsoft isn’t interested in hobbies or side projects. The company’s motto is “Go big or go home.” Earn a billion dollars. Get a billion users. Don’t think small.
I expect a massive marketing push behind Surface, and I would be shocked if we don’t see more PC hardware from Microsoft in the next 12 months.
Deal with it, OEMs.
Microsoft plans to pick up the pace. Dramatically.
Microsoft has a reputation for being too slow to respond. This year’s 10-K contains a new section that suggests that’s all about to change:
Many of the areas in which we compete evolve rapidly with changing and disruptive technologies, shifting user needs, and frequent introductions of new products and services. Our ability to remain competitive depends on our success in making innovative products that appeal to businesses and consumers. [emphasis added]
…
Microsoft unveils Windows 8 OEM licensing charges [DIGITIMES, July 11, 2012]
Microsoft has released licensing rates for OEM Windows 8, including US$60-80 for Windows 8, US$80-100 for Windows 8 Pro (with Office) and US$50-65 for Windows RT (with Office), according to Taiwan-based notebook supply chain makers.
Microsoft also confirmed that Windows 8 will launch at the end of October, with the RTM version of Windows 8 to be released in the first week of August for testing.
Sources from notebook players pointed out that the supply chain is placing high hopes on Windows 8 and expect the operating system to help resurrect consumer demand for traditional notebooks; however, due to remaining uncertainties, most players are still taking a conservative attitude about the launch.
Sources also noted that Windows 8 is unlikely to significantly boost PC demand before 2013: the new operating system raises hardware costs because some components, such as touchscreens, need additional functions for the operating system to perform fully, and together with the operating system’s licensing costs, the added expenses are expected to push Windows 8-based products’ end prices to a rather unfriendly level.
However, as the notebook supply chain gradually shifts production to touchscreen models and costs start to drop, the sources expect demand for Windows 8-based products to rise noticeably starting in mid-second-quarter 2013.
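To make the pricing concern above concrete, here is a back-of-the-envelope sketch of how the quoted license fees feed into end prices. Only the license ranges come from the DIGITIMES report; the hardware bill-of-materials figure and retail markup below are my own illustrative assumptions, not reported numbers.

```python
# License fee ranges (US$) as quoted by DIGITIMES, July 11, 2012.
license_fees = {
    "Windows 8": (60, 80),
    "Windows 8 Pro (with Office)": (80, 100),
    "Windows RT (with Office)": (50, 65),
}

assumed_bom = 350        # assumed hardware bill of materials, US$ (not a reported figure)
assumed_markup = 1.25    # assumed channel/retail markup (not a reported figure)

for edition, (low, high) in license_fees.items():
    lo_price = (assumed_bom + low) * assumed_markup
    hi_price = (assumed_bom + high) * assumed_markup
    print(f"{edition}: roughly US${lo_price:.0f}-{hi_price:.0f} at retail")
```

Even with modest assumptions, a US$60-100 license is a sizable fraction of the end price, which is the "unfriendly level" the supply-chain sources were pointing at.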
Steve Ballmer, Jon Roskill, Kurt DelBene, and Tami Reller: Worldwide Partner Conference 2012 Day 1 Keynote [Microsoft, July 9, 2012]
Steve Ballmer: …
… there’s over 1.3 billion Windows systems on the planet. We’ve sold over 630 million Windows 7 licenses. … In the next 12 months, most forecasts would be for 375 million — 375 million new Windows PCs to be sold. That’s bigger than any phone or any other single device ecosystem. It is a stunning number. And all of those represent new opportunities as they move to Windows 8. …
But Surface is just a design point. It will have a distinct place in what’s a broad Windows ecosystem. And the importance of the thousands of partners that we have that design and produce Windows computers will not diminish. We have a mutual goal with our OEM partners to bring a diversity of solutions, Windows PCs, phones, tablets, servers, to market. And what we seek to have is a spectrum of stunning devices, stunning Windows devices. So, every consumer, every business customer can say, “I have the perfect PC for me.”
And we’re excited about the work from our OEMs. We may sell a few million, I don’t know how many, of the 375 million, but we need partners to have that diversity of devices. We’re excited about the work our OEM partners are doing on Windows 8, and we’d really like to show more of that today to you and everybody collected here, Rich. …
Tami Reller, Chief Financial Officer and Chief Marketing Officer, Windows and Windows Live Division: …
Today, as we sit here, more than 50 percent of enterprise desktops are running Windows 7. …
Windows 8 is on track to RTM, or release to manufacturing, the first week of August. (Applause.) And Windows 8 will reach general availability at the end of October. (Applause.)
General availability means that new Windows 8 PCs will be available to buy and upgrades will also be available starting late October. …
Microsoft OEM head change related to Surface, say Taiwan makers [DIGITIMES, July 4, 2012]
Microsoft has announced that Nick Parker, previously vice president of OEM Sales and Marketing, will replace Steven Guggenheimer as corporate vice president of the OEM Division. The personnel shuffle is related to Microsoft’s plans to launch Surface tablet PCs, representing Microsoft’s long-term business model of stepping into hardware, Taiwan-based supply chain makers have speculated.
The personnel change has caused worries among Taiwan-based PC vendors and ODMs, because it signals that Microsoft’s launch of Surface is not a short-term promotion for Windows 8 but marks a new “software + hardware” business model which is expected to bring troubles for hardware partners, the sources analyzed.
As Microsoft will step into the hardware business, it is naturally no longer concerned about the long-term close relations established by Guggenheimer with hardware partners and therefore has decided to change his position, the sources claimed.
Microsoft Surface chassis suffers low yields [DIGITIMES, July 9, 2012]
Microsoft reportedly planned originally to adopt a unibody magnesium-aluminum chassis for its Surface tablet PCs, but constrained by chassis makers’ limited capacity, the company has instead turned to a magnesium chassis, using VaporMg technology for surface treatment to give the device an exterior similar to a traditional metal chassis. However, because the method has a rather low yield rate, it has greatly affected Microsoft’s attempts to mass produce its new tablet PCs, according to sources from the upstream supply chain.
Microsoft has not confirmed the rumors.
The sources pointed out that before Microsoft launched Surface, the company inquired with several metal chassis makers about their available capacity, revealing that its orders for Surface tablet PCs would reach as high as five million units before the end of 2012; however, the chassis makers had to decline for lack of capacity.
Although Microsoft’s current chassis design for Surface gives the device an exterior and sturdiness similar to traditional magnesium-aluminum, along with several color choices, the drawback of the design is that the device will be heavier.
The sources also pointed out that the chassis is supplied by a China-based supplier, but since the company is a second-tier maker, its low yield rates are causing Microsoft to monitor the supplier’s manufacturing process closely in hope of improvements.
Samsung Said To Plan Windows RT Tablet For October Debut [Bloomberg, July 7, 2012]
… The decision to support Windows RT follows Samsung’s earlier announcement that it will back another version of Windows. … Samsung’s Windows RT tablet will feature Qualcomm Inc. (QCOM)’s Snapdragon processor …
Apple led the tablet market at the end of the first quarter, with 11.8 million units shipped, or a 58 percent share, according to researcher IHS iSuppli Inc. Samsung was second, with 11 percent, followed by Amazon.com Inc., which had 5.8 percent. …
HP, Dell to launch 10.1-inch Windows RT tablet PCs in 4Q12 [DIGITIMES, July 6, 2012]
Hewlett-Packard (HP) and Dell will launch 10.1-inch Windows RT tablet PCs equipped with processors developed by Texas Instruments and Qualcomm respectively in the fourth quarter of 2012, according to supply chain makers.
In addition to the two US-based brand vendors, Lenovo, Toshiba and Asustek Computer are all preparing to release Windows RT-based tablet PCs.
Meanwhile, although Acer is preparing to release Windows 8-based tablet PCs, the company currently has no plans to launch Windows RT-based models in 2012, while Sony and Samsung Electronics are turning conservative about developing Windows RT-based tablet PCs, according to the two firms’ current component supply status.
The sources pointed out that both Windows 8- and Windows RT-based tablet PCs are expected to be priced starting from US$599 and could go as high as US$1,000, and that the machines’ major competition will be Apple; however, the sources hope that tablet PC competition will no longer revolve around price and will instead attract demand from enterprise users and consumers who are used to the Windows operating system and its strong software compatibility.
End of updates
Surface by Microsoft [surface YouTube channel, June 19, 2012]
#1 excerpt:
Microsoft Announces Surface: New Family of PCs for Windows [Microsoft press release, June 18, 2012]
Two models of Surface will be available: one running an ARM processor featuring Windows RT, and one with a third-generation Intel Core processor featuring Windows 8 Pro. From the fast and fluid interface, to the ease of connecting you to the people, information and apps that users care about most, Surface will be a premium way to experience all that Windows has to offer. Surface for Windows RT will release with the general availability of Windows 8, and the Windows 8 Pro model will be available about 90 days later. Both will be sold in the Microsoft Store locations in the U.S. and available through select online Microsoft Stores.
…
Contributing to an Expanded Ecosystem
One of the strengths of Windows is its extensive ecosystem of software and hardware partners, delivering selection and choice that makes a customer’s Windows experience uniquely their own. This continues with Surface. Microsoft is delivering a unique contribution to an already strong and growing ecosystem of functional and stylish devices delivered by original equipment manufacturers (OEMs) to bring the experience of Windows to consumers and businesses around the globe.
…
Suggested retail pricing will be announced closer to availability and is expected to be competitive with a comparable ARM tablet or Intel Ultrabook-class PC. OEMs will have cost and feature parity on Windows 8 and Windows RT.
Microsoft’s unique contribution to an already strong and growing ecosystem is well demonstrated by the following images provided by Microsoft (the accompanying text was also provided by Microsoft):

Conceived, designed and engineered entirely by Microsoft employees, and building on the company’s 30-year history manufacturing hardware, Surface is designed to seamlessly transition between consumption and creation, without compromise.
Surface features a built-in kickstand that lets you transition Surface from active use to passive consumption.
The 3 mm Touch Cover represents a step forward in human-computer interface. Using a unique pressure-sensitive technology, the Touch Cover senses keystrokes as gestures, enabling you to touch type significantly faster than with an on-screen keyboard. It will be available in a selection of vibrant colors.
#2 excerpt:
Microsoft Announces Surface: New Family of PCs for Windows [Microsoft press release, June 18, 2012] (data highlights are mine to denote the essential differences)
Surface for Windows RT vs. Surface for Windows 8 Pro:
- OS: Windows RT vs. Windows 8 Pro
- Light(1): 676 g vs. 903 g
- Thin(2): 9.3 mm vs. 13.5 mm
- Clear: 10.6” ClearType HD Display vs. 10.6” ClearType Full HD Display
- Energized: 31.5 W-h vs. 42 W-h
- Connected: microSD, USB 2.0, Micro HD Video, 2×2 MIMO antennae vs. microSDXC, USB 3.0, Mini DisplayPort Video, 2×2 MIMO antennae
- Productive: Office ‘15’ Apps, Touch Cover, Type Cover vs. Touch Cover, Type Cover, Pen with Palm Block
- Practical: VaporMg Case & Stand (both)
- Configurable: 32 GB, 64 GB vs. 64 GB, 128 GB
(1), (2). Actual size and weight of the device may vary due to configuration and manufacturing process.
The product introduction/overview part of the event keynote:
Steven Sinofsky
[President, Windows and Windows Live Division] Today when you have your tablet and you want to be entertained, you have to hold it. You’re always sitting in an awkward position, or perhaps you have to choose from a seemingly endless variety of add-on stands and cases that solve a relatively simple problem, but only by adding weight and adding thickness.
What if I just want to watch a movie or listen to music and do something else? We think that this should be an integral part of the design. We think that a stand should be integral. So we built a stand into the device.
This stand is made of the same VaporMg as the rest of the case. And it’s completely integrated into the device. The hinge design is like that of the finest luxury car and when not in use it just fades away. No extra weight, no extra thickness, no separate add on. It’s integrated just like the software and the hardware integrated into Surface.
And then once you have this kickstand you can sit back and enjoy a truly hands-free experience. You could just put the Surface on a table, lay back and watch a movie. And that’s really what entertainment should be about with Surface. But you know Surface is designed to be mobile. We designed Surface to be rugged and to move around, and with VaporMg and Corning Gorilla Glass 2.0 you do not need to worry at all; but we know many people prefer to have some sort of cover, a cover that can at least act like an easy on/off switch.
So Surface has a cover. We designed the cover to be an integral element of the PC. We built a magnetic connector into the device to hold it very securely.
So let me attach the cover, click — you heard that it’s solid — click, close the cover, it’s integrated into the device. It’s made from a fine northwest pola? tech. It feels great in your hand, like a book; it just fits there. And when we looked at the whole Surface and the cover, we challenged ourselves to do more. This cover is just 3 mm. Combined with Surface they are just over 12 millimeters, that’s less than 0.5 inch. And we said, why not do something more with this surface? Why shouldn’t we just take this surface and make it a full multi-touch keyboard?
This Touch Cover is not just a full multi-touch keyboard; it’s also a modern trackpad with left and right buttons. It even has the keys for the Windows 8 Metro Style UI. This keyboard combined with the kickstand forms the hallmark of hands-on creativity. On average, typing on it is twice as efficient as typing on glass. And it’s certainly more comfortable. Now of course the innovative on-screen keyboard in Windows is still there and you can mix and match; the choice is really going to be yours. Just put them on the table and you’ve got a great stand.
Let me go over here and show you a different Surface. This Surface is connected to external HDMI; that’s built into the device. I’m going to go here, and now I’ve got the Touch Cover connected. Now with front- and rear-facing cameras on this device, I can record videos. I’m going to start the camera application. So now I can go here and I can tilt this around and angle it so I can see it. This camera is angled at 22 degrees, and by angling it at 22 degrees, everybody at the table has their head perfectly framed in the picture, or when I’m sitting at my seat, I can do a Skype call and I am perfectly framed. But this device also has Windows on it and Office on it. So I go into the desktop and I see Word running here.
Now what is really neat is that, using the multitasking capabilities, I can dock the camera out there, and now I can record a video or an interview and take notes; I could record myself and read from my notes. And that integration is really cool. In fact I could even use the USB port and plug in an external speaker and microphone, even though it has dual array mics and dual speakers built in, and I could get super high quality recording. And so that’s a quick look at Surface.
Now there is so much more to show you today. Imagine, if you will, that we took all of those capabilities of Surface and we built them so that you could use all the applications that you’re familiar with. You could use Photoshop or you could use other applications. Those applications would run using the latest Intel Core processor. So, in addition to the Surface that we’re releasing today for Windows RT, which runs on an NVIDIA ARM processor, we’re also working on a Surface for Windows 8 Professional. I would like to introduce Mike Angiulo now, who’s going to come up on stage and show us a little bit of the next generation of Surface.
Mike Angiulo
[corporate vice president of Windows Planning, Hardware and PC Ecosystem]Thank you very much Steven. I’m proud to introduce you to another member of the Surface family. This is Surface for Windows 8 Pro. The Windows ecosystem has always been about choice. And for the millions of professional desktop users out there, people who use their PC everyday to design and to create things, this is a great choice for you. It shares the same design principles that Steven was talking about. It’s a stage for Windows. It shows the same pride in craftsmanship. It’s less than 2 pounds and less than 14 millimeters, it’s a full PC.
Now this also has a ClearType display. Steven’s PC had ClearType HD. This is a ClearType full HD display, and what that means is three things: a combination of a very specific pixel geometry, rendering, and an optical bonding process that together create the effect that your eye can’t distinguish the individual pixels at normal viewing distances, in this case 17 inches, less than arm’s length.
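The "can't distinguish pixels at 17 inches" claim can be sanity-checked with a short calculation. The 10.6-inch diagonal and the 1080 ("full HD") resolution are from the keynote and the spec sheet; treating the panel as 1920×1080 and comparing against the rough one-arcminute limit of normal visual acuity are my assumptions.

```python
import math

diagonal_in = 10.6            # 10.6-inch panel, per the spec sheet
w_px, h_px = 1920, 1080       # full HD resolution (assumed 1920x1080)

ppi = math.hypot(w_px, h_px) / diagonal_in   # ~208 pixels per inch
pixel_pitch_in = 1.0 / ppi                   # ~0.0048 inch per pixel

viewing_distance_in = 17.0
# Angle subtended by one pixel at 17 inches, converted to arcminutes.
arcmin_per_pixel = math.degrees(pixel_pitch_in / viewing_distance_in) * 60

print(f"{ppi:.0f} PPI, {arcmin_per_pixel:.2f} arcmin per pixel")
```

The result is just under one arcminute per pixel, right at the commonly cited limit of normal visual acuity, so the claim is plausible at that viewing distance.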
This ClearType display also reduces Z-height [the thickness of the display stack] and conserves battery power. It has some of the other high performance features you saw too. It’s got that 2×2 antenna technology, a first in tablets: it has dual high performance antennas and receivers so that you get the best Wi-Fi performance possible no matter how you hold it. It also has a chassis that’s built out of that same durable and elegant VaporMg, which enables features like the 0.7-millimeter-thin kickstand, less than a millimeter. It’s got the same compatible accessory spine that Steven had, so if you take a Touch Cover like he had, it just clicks in, it clicks in the same; it has that same design and feeling because the entire Surface family of products was designed together. Even closed like this, this is still less than 17 millimeters, and this PC has specs that rival those of the finest Ultrabooks that have ever been announced. And it delivers the power and the flexibility that you would expect of a high end PC. This PC is powered by Intel’s third generation Core i5 processor, the Ivy Bridge processor.
This is their 22 nanometer process that results in a CPU that’s faster and a GPU that has double the 3D graphics throughput, all while using less power than today’s Core i5s. With that power comes a unique design challenge: how do you design a PC that you might be holding in any number of ways, or that may have a cover on the front and the back, and still integrate active cooling? There is no obvious place to put a vent, so here is our solution. This is called perimeter venting. You see this groove that goes all the way around the outside of the case; there is a good shot of it up on the screen. This allows air to be uniformly distributed across the entire PC when necessary, in a way that you never block it with your hands. In fact you never even feel it, which makes the PC really comfortable to hold, which is really helpful in doing things like flipping back your keyboard and taking notes with digital ink.
Surface for Windows 8 Pro supports digital inking. Windows apps of all kinds can support inking. So here’s what I’ve done: I can go back to the desktop and show you what I launched. I launched the Windows Reader, and this is a PDF file of one of Steven’s blog posts. So you can see I can pan and zoom. What I can really do here is ink. I’m going to come here and write: this is great.
Now what you’ll notice when I ink and then zoom in is that the ink stays smooth. That’s because it’s being sampled at 600 DPI; that’s sub-pixel accuracy for ink. That keeps your handwriting very smooth, and hopefully yours is a little better than mine.
One of the neat things about this too is, as I’m inking here, the tip of the pen almost feels like it’s writing exactly on the screen. Since this screen is optically bonded, we eliminated the layers between the thin cover glass and the screen. So it feels like you’re inking right on the page. The distance between the stylus and where I see the ink is only 0.7 millimeters. That’s the thinnest and closest distance of any tablet PC, any inking tablet ever.
Now one of the other things that’s going on here is as I am moving my hand, you see the page is not moving underneath my hand. That’s because Windows has palm block technology. This Surface has two digitizers. It has one for touch and a separate one for digital ink.
And what happens is, when I bring the pen close to the screen, Windows sees the proximity of the pen and stops taking touch input, so my hand doesn’t mess up what I’m writing. And when I’m done with the pen, you can see the little magnetic charging connector there; it just clicks in. So that’s one of the cool things about Surface for Windows 8 Pro and inking.
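The "sub-pixel accuracy" wording above can also be checked numerically: a 600 DPI pen digitizer over a roughly 208 PPI display gives about three ink samples per display pixel. The 600 DPI figure and 10.6-inch diagonal are from the keynote; the 1920×1080 panel resolution is my assumption based on the "full HD" description.

```python
import math

digitizer_dpi = 600                            # pen sampling rate, per the keynote
display_ppi = math.hypot(1920, 1080) / 10.6    # ~208 PPI (1920x1080 assumed)

samples_per_pixel = digitizer_dpi / display_ppi
print(f"{samples_per_pixel:.1f} ink samples per display pixel")
# ~2.9 samples per pixel: the pen really is sampled at sub-pixel
# resolution, which is why zoomed-in ink strokes stay smooth.
```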
The apps that I’ve been showing you look really great in the native resolution of the screen, the 1080 resolution. But if you want to unlock the highest possible resolutions that Ivy Bridge supports, even higher resolutions than are possible via HDMI out, we have DisplayPort. So now with DisplayPort, I can take this PC, I can dock it, and I basically have a full professional workstation with the power of a desktop PC.
I have one here that’s plugged in and synced up to the show monitor, and this kind of PC is powerful enough to run big applications: applications like Photoshop, Autodesk, SolidWorks, enterprise applications that require a TPM [Trusted Platform Module] chip. In this case, I’m going to copy some higher-res photos onto the PC and edit them in Adobe’s Lightroom. So I’m copying onto the desktop, and what you’ll see here is a five-second copy. That’s a whole gigabyte. That’s a whole gigabyte of pictures. They just copied in five seconds.
Surface has support for really fast USB 3.0 and the new USB SuperSpeed drives; a gigabyte file copy in five seconds is five times faster than USB 2.0, which makes sense with this PC because people will be using it to do big jobs, whether you’re editing big photos like this, or dealing with big video files, or, in Steven’s case, a big job might be typing a super-long blog post that you may have read. Surface is up for the task.
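The throughput arithmetic behind that demo checks out. A decimal gigabyte is assumed; the per-drive USB 2.0 figures in the comments are typical real-world values, not numbers from the keynote.

```python
# "1 GB in five seconds is five times faster than USB 2.0"
gigabyte = 1_000_000_000            # bytes (decimal GB assumed)
copy_time_s = 5.0                   # as demonstrated on stage

usb3_mb_s = gigabyte / copy_time_s / 1e6      # sustained rate of the demo, in MB/s
implied_usb2_mb_s = usb3_mb_s / 5             # what "five times faster" implies

# USB 2.0's 480 Mbit/s bus tops out around 60 MB/s in theory, and real
# drives typically sustain 30-40 MB/s, so the implied figure is plausible.
print(usb3_mb_s, implied_usb2_mb_s)
```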
Now let’s say you are in fact doing one of those big typing jobs. You’ve seen already, Steven talked a little bit about Touch Cover and the improvements it makes for typing. Let’s say you’re really fast touch typist or maybe you just prefer the feel of tactile keys.
Well, we’ve got another Surface choice for you. This is Surface Type Cover. It shares the same full-pitch layout as Touch Cover. But what we’ve done is we’ve taken a key switch that has a 1.5-millimeter travel and we built it into the thinnest possible package. So you can touch type – I can touch type on this as fast as I can touch type on any keyboard. Fully compatible with Windows; you see the shortcut keys here. It has a full modern trackpad with clicking buttons and this completes the Surface family of products. I’d like to pull all the Surface family together, all at one point.
Panos, would you join us with the Surface for Windows RT, the Surface for Windows 8 Pro, and a handful of the Touch Cover colors that we’re going to have at launch? That’s the complete Surface family.
Thanks, Steven. Now that’s how we feel too, Panos especially. Panos Panay is the leader of the team that created Surface and has some great stories and some more detail about the product and how it came to be. It’s all yours.
Panos Panay
[General Manager, Microsoft Surface] Thank you.
Super cool – super cool. Thank you. Thank you for having me. I’m unbelievably humbled right now and flattered to be up here. But truthfully I’m recognizing an entire team that’s back in Redmond right now waiting to see your blog posts, to see what you have to say. We have a team full of designers, development engineers, manufacturing engineers, hardware testers, all working on these products right now as we speak.
Before I get into what I’m going to talk about today, I’m just going to show you a little [bit more about the design,] …
#3 excerpt:
Microsoft Announces Surface: New Family of PCs for Windows [Microsoft press release, June 18, 2012]: the 1st image was provided by Microsoft, the next two are from the Microsoft-provided video record
Advances in Industrial Design
Conceived, designed and engineered entirely by Microsoft employees, and building on the company’s 30-year history manufacturing hardware, Surface represents a unique vision for the seamless expression of entertainment and creativity. Extensive investment in industrial design and real user experience includes the following highlights:
- Software takes center stage: Surface sports a full-sized USB port and a 16:9 aspect ratio – the industry standard for HD. It has edges angled at 22 degrees, a natural position for the PC at rest or in active use, letting the hardware fade into the background and the software stand out.
- VaporMg: The casing of Surface is created using a unique approach called VaporMg (pronounced Vapor-Mag), a combination of material selection and process to mold metal and deposit particles that creates a finish akin to a luxury watch. Starting with magnesium, parts can be molded as thin as .65 mm, thinner than the typical credit card, to create a product that is thin, light and rigid/strong.
- Integrated Kickstand: The unique VaporMg approach also enables a built-in kickstand that lets you transition Surface from active use to passive consumption – watching a movie or even using the HD front- or rear-facing video cameras. The kickstand is there when needed, and disappears when not in use, with no extra weight or thickness.
- Touch Cover: The 3 mm Touch Cover represents a step forward in human-computer interface. Using a unique pressure-sensitive technology, Touch Cover senses keystrokes as gestures, enabling you to touch type significantly faster than with an on-screen keyboard. It will be available in a selection of vibrant colors. Touch Cover clicks into Surface via a built-in magnetic connector, forming a natural spine like you find on a book, and works as a protective cover. You can also click in a 5 mm-thin Type Cover that adds moving keys for a more traditional typing feel.
The product design part of the event keynote:
Panos Panay
[General Manager, Microsoft Surface] … [I’m just going to] show you a little bit more about the design, show you a little bit more about the culture of how these products were built. So I think it might be interesting for you to hear that. I really want to share with you more of our team. So just watch this video really quick and I’ll be right back.
[Video Playback]
You’re going to get to meet a lot of the people you just saw on the video in just a few minutes. They’re actually backstage right now, preparing to show you more details of the product and give you a few minutes to put your hands on it, talk a little bit about the design.
Let me start by doing that, to just give you a quick preview of what you might see backstage in just a few minutes. You’ve heard Steven and Mike both say this was built as the stage for Windows 8. That was part of our core vision for the product. It was very important for us that the hardware fade into the background for this product, so the Windows software could rise to the surface and give you the best experience possible. When the hardware fades away, what comes to the surface is that entertainment experience when you’re using the device. Note the chamfered angles on the side of this product, each chamfered at 22 degrees. That’s two things. One, it’s a physical manifestation of the actual stage itself; you can see how it falls away, just as we intended the hardware to do. But two, it actually sits perfectly comfortably in your hands.
And let me call out something: I say “perfectly” a lot, I say “perfect” a lot. As part of our team culture, because so many parts of the design had to be detailed, simple and right, we always strove for perfection on every sub-component of this product, including this chamfered angle.
What it does is sit in your hand very comfortably, in a way that, when you hold it, it feels airy. Most importantly, you can use it all day in comfort. It’s really important, when you talk about the hardware fading into the background, that the hardware is not in the way of accomplishing what you want to do. It’s meant to move you forward, which we think this product does.
Now when we talk about hardware fading into the background, another thing that’s super important is seamless lines throughout the product. When you look at this product, you’ll see lines going throughout it, every line calculated, every line built and formed perfectly on the device.
But there is one challenge. Our vision for the product beyond being a stage for Windows was also that we had to bring creativity and productivity to folks such as yourselves.
The opportunity to transform this device, to transition it into a state of getting things done: putting this kickstand in the product flies right in the face of seamless lines and getting it perfect. But we really spent a lot of time here. We knew that if we did not get the kickstand perfect, this device would not work. We could not take any chances. Take a look at the three hinges that you see within this device. This is a really simple example of the details of the product. These are three custom-made hinges; mind you, there are over 200 custom parts, built from the inside out, that make this product come to life.
But these hinges, they were spec’d, just as Steven told you, to feel and sound like a high-end car door. When you close the device, the kickstand just goes away. It’s not in your way. When you need it, it’s there, just in time. You want to get something done, just open it and it feels great.
The spec we created was around sound. We iterated over and over again in our anechoic chamber. This is a critical point: we really wanted to get the sound right, so you get that full feeling, that emotional attachment to your product, when you open this kickstand and close it. It makes it yours; it goes away when you don’t need it and it’s there when you do.
Now, we’ve talked about VaporMg a few times; let me bring VaporMg to life just a little bit here, so you can understand a little bit more about what we did. VaporMg is essentially what lets us take our product design and create life out of it. You can see the breakdown behind me; let me just explain a few things that we have going on.
I’m holding up my room key; it feels weird to hold up my room key. But if you look at this quickly, what you’ll see is 0.77 millimeters of thickness. This is an important point. If you can’t see it, that’s all right: it’s the same as a credit card. Pull one out; your credit card is likely somewhere between 0.75 and 0.85 millimeters thick. It’s just an illustrative point. VaporMg is a process where we start with an ingot of magnesium and we melt it down to a molten state. We then injection-mold the magnesium in tools, and we’re able to actually mold the intricate details that are needed for Surface. We mold down to 0.65 millimeters of thickness in any given part. A credit card is about 0.75 millimeters; we mold to 0.65. This is important to understand, because for us to get to the design we needed for this product, to get the kickstand integrated seamlessly and hold this line throughout the product, we had to be able to mold to those tolerances.
Every micron matters within Microsoft Surface. We’ve actually stacked up every part, designed from the inside out, so tightly in the product and so cleanly that if you stuck a piece of tape in the middle of the device, it would bulge out. That tells you how tightly built this product is, how much strength comes with it, how light it feels in your hands; all those parts play into each other.
The best part about VaporMg is not just that we can mold at 0.65 millimeters and get intricate details like the radii that go around the product. The best part is the smoothness of the finish that comes out of the tools. After approximately 152 steps to get the VaporMg looking just like you see now, you find that the surface finish on this product, and, as Mike says, the pride in craftsmanship, is perfect, it’s seamless. It screams watch-quality finish, and when you put it in your hands it feels elegant; when you touch it, you’re going to want to hold it, I promise you.
Now I’m proud of VaporMg and I’m proud of the team for the product that they’ve done, but nothing, nothing stirs me more, nothing gets me more excited than Touch Cover. I really want to walk you through Touch Cover for just a few moments. This is an important technology that came out of our group. I’m going to walk you through it in two ways, the first way is through the experience and the second way I’m going to talk about is the technology.
Let’s do the experience first. I explained what we tried to do with Touch Cover from the get-go; you’ll notice I’m going to connect it now, my blue Touch Cover. So I just click it in, as you would expect. The Surface turns blue along with my Touch Cover, and you have a beautiful integration of hardware and software. My Surface knows what is connected to it. I can now bring to life the vision that is Touch Cover for this product: the vision that lets you produce content when you want it, how you want it, as fast as you’ve always done it. That’s what this product was designed for.
Let me give you one more second on this, on a little bit of the experience. The thing that was so critical for us in creating Touch Cover was that it had to be 3 millimeters thin, which is essentially at odds with any other keyboard you’ve used, and still give a great typing experience. It also had to be a cover you wanted to connect, something you always had with you, something that gave you confidence, just like the kickstand, to bring this product to life.
We designed flex magnets into this product; that’s a combination of alignment and clamping magnets. You could actually never miss connecting this device; you can’t miss, we force you to not miss. We do that to give you confidence. You close it, it feels like a book; we designed this organically, like a book; we wanted it to feel just like that. What has more covers on it than books themselves? This spine feels like a book. When you put it in your hand and you walk away with your product, you’ll hold it like a book. When you carry it against your books, it will feel like just another book; it’s just light enough and it feels just perfect.
Now that said, I think you’re going to fall in love with Touch Cover. I know I have. I mean I’m seriously in love with it outside of my wife, Touch Cover is number two. It’s very important to me. Now, I never want to take Touch Cover off, and I’d argue that you don’t need to and you never have to.
You saw Mike move his Touch Cover to the back. Now when he did that, I’m sure every single one of you thought, wait a minute, how do you move it to the back? Well, Touch Cover is pretty smart; it has an accelerometer built into it. The moment you fold it back, we know you’ve folded it back, we know you’re not using it, and it’s turned off for you.
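The fold-back behavior described here, an accelerometer in the cover telling the tablet when it has been rotated out of typing position, can be sketched roughly in Python. This is a minimal illustration, assuming each side reports a gravity vector in its own frame; the function names and the 150-degree threshold are hypothetical assumptions, not Microsoft's implementation:

```python
import math

# Hypothetical sketch (not Microsoft's actual logic): disable a cover keyboard
# when its accelerometer shows it folded flat against the back of the tablet.
# Gravity vectors are (x, y, z) tuples in each device's local frame.

def cover_angle(tablet_gravity, cover_gravity):
    """Angle in degrees between the tablet's and the cover's sensed gravity vectors."""
    dot = sum(a * b for a, b in zip(tablet_gravity, cover_gravity))
    mag = math.sqrt(sum(a * a for a in tablet_gravity)) * \
          math.sqrt(sum(b * b for b in cover_gravity))
    # Clamp to the acos domain to guard against floating-point drift.
    return math.degrees(math.acos(max(-1.0, min(1.0, dot / mag))))

def keyboard_enabled(tablet_gravity, cover_gravity, folded_threshold=150.0):
    """Keyboard stays live while the cover is open; off when folded flat behind."""
    return cover_angle(tablet_gravity, cover_gravity) < folded_threshold
```

With the cover open for typing, the two gravity readings roughly agree and the keyboard stays on; folded behind the tablet (or closed over the screen) they point in opposite directions in each device's local frame, and the keyboard switches off.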
So you never have to take it off, and underneath your fingertips it feels great. So now you’ve got a comfortable device with Touch Cover that’s yours, personalized to you. You saw the beautiful colors that we have coming to market, and essentially what’s brought to you is an experience like none other, with Touch Cover and Surface together.
Now I showed you the experience, but I want to show you the technology, because it really is important that you understand it. And quite frankly, we have a bit of a mad scientist, who many of you know, named Stevie Bathiche. Stevie actually invented Touch Cover. The fact that we have 30 years of input experience making mice and 15 years creating keyboards means we really understand how to create a great typing experience. We also knew that if we brought you Touch Cover and Touch Cover wasn’t any good, boy, what a breaking moment. But we’ve actually evolved this technology, through Stevie and his work, to a place where we’ve brought you an experience that’s amazing for typing. There are actually seven layers squeezed in, pressed right into Touch Cover, to keep it 3 millimeters thin. Now that’s super thin, but critical for you to have a great experience when folding it back.
Let me explain to you how the technology works, just ever so slightly and quickly. What you’re going to see is me putting my hands down on this machine here; this is Surface for Windows RT, and my hands are down on Touch Cover. You’ll notice that my hands are lying flat on Touch Cover right now, yet nothing is happening. If this were in fact a capacitive surface, like the phone you might have in your pocket or some other device you might have, the keyboard would fire the moment you put your fingers down, and it would look something like that.
Now that’s me actually pressing on Touch Cover, and it knows the grams of force coming off my fingertips onto Touch Cover. Why is this critical? When you type at touch-typing speed, you have to find your home position and rest your hands. To do that, your keyboard can’t fire when you put your hands down; it’s comfortable, you can rest your hands. And note, as I put pressure on the J key, how the pressure goes up as I push harder, and as I release, the pressure comes off.
It’s actually measuring every gram of force coming off my fingertips, and as I start to type, it knows how many keys I’ve hit. This keyboard actually scans its keyboard matrix 10 times faster than any keyboard you use today, I guarantee. It is super fast and brings a great, great opportunity for you to be productive and get stuff done.
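The rest-versus-press distinction demonstrated here, firing a key only when the measured force crosses a deliberate-press threshold, can be sketched as simple thresholding with hysteresis. The gram values below are illustrative assumptions, not Touch Cover's actual calibration:

```python
# Hypothetical sketch of pressure-threshold key detection: resting fingers
# produce low force and must not fire a key; a deliberate press crosses a
# higher threshold. Both thresholds are illustrative assumptions.

PRESS_GRAMS = 70.0    # force needed to register a keystroke (assumed)
RELEASE_GRAMS = 30.0  # force below which the key counts as released (assumed)

class Key:
    def __init__(self):
        self.pressed = False

    def update(self, grams):
        """Feed one force sample; return True exactly when a keystroke fires."""
        if not self.pressed and grams >= PRESS_GRAMS:
            self.pressed = True
            return True           # rising edge: a deliberate press
        if self.pressed and grams <= RELEASE_GRAMS:
            self.pressed = False  # hysteresis prevents chatter near the threshold
        return False

key = Key()
samples = [5, 10, 12, 80, 85, 40, 10, 75]  # resting, press, release, press again
strokes = [s for s in samples if key.update(s)]  # -> [80, 75]
```

The gap between the press and release thresholds is what lets you rest your hands: light resting force never reaches the press threshold, and a finger easing off a key does not retrigger it.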
Obviously, I have a lot of pride in this product. I hope you’ll love it. I can’t wait for you to get your hands on it back there, and I really mean that. Steven, thanks for having me up here today.
Steven Sinofsky
That was a moment for our team, for sure. I do want to talk a little bit about availability and pricing information and the things like that I know people want to know. For Surface for Windows RT, I will say that there will be much more information available on the web shortly. Surface for Windows RT will be available in both a 32- and a 64-gigabyte model and will be priced like comparable tablets that are based on ARM. Surface for Windows 8 Pro will come in 64-gigabyte and 128-gigabyte storage models and will have a retail price comparable with competitive Ultrabook-class PCs. Additional specifics on pricing and packaging will be announced as we get closer to retail availability.
Now, of course, retail availability: the Windows RT model will arrive at the time of Windows 8 general availability, and the Windows 8 Pro model about three months later. Surface will be available through Microsoft’s physical stores here in the U.S. and through select online outlets of the Microsoft Store as well.
So welcome everybody to Surface. I just want to invite Steve Ballmer back up on stage one more time and thank you, thank you very much.
Steve Ballmer
I want to thank Steven and Mike and Panos and their team. This has been an unbelievable journey. We’ve invested significantly, as you can see, in talent, in time, in capital to bring the Surface to market. I was asked in the last few days, why now, why now? We took the time to really get Surface and Windows 8 right, to do something that was really different and really special.
We’re very proud; very, very proud of the Surface just like we’re very proud of Windows 8. Because of Windows 8, because of Windows 8 the Surface is a PC, the Surface is a tablet, and the Surface is something new that we think people will absolutely love. We really want those of you here to have a chance to see and touch the Surface and talk with some of the people who are involved in designing the product.
We have several stations set up next door where you can see the work that went into the creation of the Surface, and we hope you’ll stay and join us for that. Today has been fun for us to put on for you, very, very exciting, and I want to thank you all for being part of today’s event. Thanks.
The justification part of the event keynote (the general introduction, i.e. the first part of the event): how and why Microsoft’s decades-long hardware innovation history has now been extended to PC/tablet-level innovation, and why, after the Windows 8 software innovation, Microsoft needed matching innovation in hardware as well.
Steve Ballmer
Well, good afternoon and welcome. I certainly want to thank everybody for joining us for today’s event. The past several years have seen great change in the industry and great innovations coming from Microsoft. We’ve helped usher in the new era of cloud computing, we’ve embraced mobility, we are redefining communications and attempting to transform entertainment. In all that we have done, Windows is the heart and soul of Microsoft, from Windows PCs to Windows Servers to Windows Phones and Windows Azure. Windows has proven to be the most flexible general-purpose software ever created, spurring on an ecosystem of unrivaled success.
When Microsoft was founded, our vision was bold and broad: a computer on every desk and in every home. And while we are certainly optimists to the core, Windows has exceeded even our most optimistic predictions. It now powers well over 1 billion PCs, from desktops to laptops to ATMs to NASA workstations and more: in homes, in businesses, in schools and in governments literally around the world.
With Windows 8 we’ve re-imagined the Windows product. We re-imagined Windows from the chipset to the user experience, to power a new generation of PCs that enable new capabilities and new scenarios. We approached the Windows 8 product design in a forward-looking way. We designed Windows 8 for the world we know, in which most PCs are mobile and people want access to information and the ability to create content from anywhere, anytime.
People want to do all of that without compromising the productivity that PCs are uniquely known for: from personal productivity applications, to technical applications, business software and literally millions of other applications written for Windows that work perfectly on Windows 8. We are incredibly gratified by the enthusiastic response to Windows 8 from our partners, our OEM partners, thousands of developers and the literally millions of consumers who’ve downloaded our previews.
Excitement is high, with the new x86 and ARM SoC support, the new Metro user interface and the new Store all getting very broad interest.
Today, we want to add another piece, another bit of excitement and another piece to that Windows 8 story.
At our foundation, Bill Gates and Paul Allen made a bet, a bet on software. At the same time, it was always clear that our unique view of what software could do would require us to push hardware, sometimes in ways that even the makers of the hardware themselves had yet to envision. That’s the nature of the dynamic between hardware and software, pushing each other and pulling each other forward. In fact, our number one revenue product the year I joined Microsoft, 1980, was actually a hardware product, something known as the SoftCard. Let’s just take a little bit of a look back at the role of hardware at Microsoft.
[Video Playback]
We believe that any intersection between human and machine can be made better when all aspects of the experience, hardware and software, are considered as working together. Let’s just take the mouse as an example.
To be successful, Windows 1.0 really needed a mouse, so we built one. Early reviews of mice were not very positive, as people struggled to understand the real value. In fact, it was actually so new that Canadian Customs quarantined the Microsoft mouse at the border for four weeks, thinking that it was alive.
Our most successful hardware product has been the Xbox, and with Kinect we’ve created a whole new user experience. And now developers are pushing Kinect, doing more exciting and even cooler things for both the game console and for Windows PCs. This combination of hardware, software and peripherals in the Xbox case works together to deliver an absolutely amazing experience.
We see that sort of combination working also today in our PC ecosystem. We believe in the strength of that ecosystem, of software and hardware companies that work together to deliver selection and choice that makes your Windows experience uniquely your own. Those partnerships are essential to the re-imagination of Windows. We’ve worked with the component companies, Intel, AMD, NVIDIA, Qualcomm and Texas Instruments.
Of course the ultimate landing point of this PC experience is through our partnerships with OEMs: HP, Dell, Asus, Acer, Samsung, Sony, Lenovo, Toshiba and many, many more. They will deliver more PCs to market in the year 2013 than in any previous year. IDC estimates that number at over 375 million Windows PCs. That will ensure that software developers and content creators have a larger number of new systems to target with their Windows 8 applications than any other non-phone platform.
However, with Windows 8 we did not want to leave any scenario uncovered. Much like Windows 1.0 needed the mouse to complete the experience, we wanted to give Windows 8 its own companion hardware innovation. What is this innovation? It’s something new, it’s something different, it’s a whole new family of computing devices from Microsoft.
[Video Playback]
This is the new Microsoft Surface. It embodies the notion of hardware and software really pushing each other. People do want to create and consume, they want to work and they want to play, they want to be on their couch, they want to be at their desk and they want to be on the go. Surface fulfills that dream. It is a tool to surface your passion, to surface your ideas, to surface your creativity and to surface your enjoyment. I really want you to take the time today to get to know Microsoft Surface. So let’s now learn more from Steven Sinofsky and the Microsoft Surface team.
Steven Sinofsky
Just as we’ve re-imagined Windows we also have a vision for re-imagining the tablet.
We see a tablet that is designed the way that Windows has been designed. We see a tablet that represents a unique vision with a seamless expression of entertainment and creativity. A tablet that works and plays the way that you want to, a tablet that’s a great PC, a PC that’s a great tablet, a new type of computing, Surface.
Surface is a stage for Windows. Surface is designed for the software experience to take centre stage. Surface is super thin at 9.3 millimeters, yet just thick enough for this full-size USB port, for peripherals or just charging your phone while you are at the hotel. The edges are bevelled away at 22 degrees, so the PC itself fades into the background. It feels natural in your hands.
Surface is the first PC with a full magnesium case. Through a unique process, the liquid metal is formed into an ultra-rigid yet ultra-light frame. It is incredibly strong and it’s airy at under 1.5 pounds, just 676 grams, and it’s finely balanced. We didn’t stop there; the case is one of a kind. It’s finished using a physical vapor deposition process, which results in permanent scratch and wear resistance for Surface. This VaporMg case is a first of its kind, and it accentuates the unique feel of Surface.
Surface is of course great for entertainment. It has access to all of the Windows apps for music, for video, for Xbox and gaming. You can see here I’m running Internet Explorer; I can browse smoothly, see great pages using ClearType, and have a great experience just with browsing. Its 10.6-inch optically bonded wide-screen display is custom designed for Surface. And of course people play games. I can go and play any of the interesting games in the Windows Store, and games can use all the sensors available to Windows as well. Surface works for all of those games.
Movies and entertainment look great as well. Excuse me just a second. Surface looks great for entertainment as well. In fact, I’m going to show here for the first time a very exciting new application. This is the Netflix application designed specifically for Windows 8. Now, with the wide screen you get 30% more viewing area and no banding or letterboxing like you traditionally see.
I’m happy to show this new Netflix application … [, and give you an early look at how it’s designed specifically for Windows 8 with semantic zoom. Netflix will have this ready at the Windows 8 launch. I can go here and start a movie and see it stream straight to my Surface PC, just like you would expect.
Now, to stream so well, Surface needs great Wi-Fi. Surface is the first tablet to incorporate dual 2×2 MIMO antennas. That means it provides the very best Wi-Fi reception of any tablet today. Surface is an incredibly great Windows entertainment PC. And we are just getting started.] …
More information:
– Surface Website
– On-Demand Keynote: Microsoft Surface Event
– Broll: Product imagery of Microsoft Surface
– Broll: Images from Microsoft Surface Event
– Product & Event Images
– See it in Action
And remember, this Microsoft Surface family is leading edge even against Apple’s market-leading offerings, so this product is definitely just the tip of the iceberg. Consider this Channel 4 report, which shows the kind of future that could come from Microsoft, as seen back at the beginning of last year:
– Touching, waving at and talking to the future with Microsoft [Channel 4 News YouTube channel, Feb 8, 2011]
(Note that towards the end of the video, Panos Panay appears as being simply from Microsoft Surface.) Additional information:
– Benjamin Cohen, the reporter in the video, had this detailed blog post about that visit
– Steve Clayton, the storyteller of Microsoft’s not-so-long-ago initiated ‘Next at Microsoft’ blog, also had this detailed blog post about that visit
Note that Microsoft shares started to rise already last Friday (obviously based on expectations from when the invitations to a ‘mystery event’ were sent out). Nevertheless, from $29.34 to this Tuesday’s closing price of $30.71, that was only a 1% growth. Interestingly, during the same period Apple’s share price had a 1% growth as well, although Apple made its series of announcements a week earlier, on Monday of last week (June 11th, 2012):
– Apple Introduces All New MacBook Pro with Retina Display
– Apple Updates MacBook Air and Current Generation MacBook Pro with Latest Processors and New Graphics
– Mountain Lion Available in July From Mac App Store
– Apple Previews iOS 6 With All New Maps, Siri Features, Facebook Integration, Shared Photo Streams & New Passbook App
which resulted in 0.5% growth only.
So the stock market evaluated the Microsoft Surface against the above Apple introductions and found them on an equal level from a business growth perspective, although Apple’s closing price yesterday was $587.31, i.e. 19x higher. In terms of market capitalisation Microsoft remains at the same 47% of Apple’s, so from a business competition point of view the announcement of Microsoft Surface is not changing the positions as far as the opinion of the overall business world is concerned. INTERESTING!
Meanwhile the earlier Microsoft Surface product has been renamed Microsoft PixelSense in order to avoid confusion:
About Microsoft PixelSense [Microsoft page for the press on the PixelSense microsite, June 18, 2012]
The Samsung SUR40 with Microsoft PixelSense is an innovative product that responds to touch, natural hand gestures and real world objects placed on the display, providing effortless interaction with information and digital content in a simple and intuitive way. With a large, 360-degree, 4-inch thin horizontal user interface, the Samsung SUR40 offers a unique gathering place where multiple users can collaboratively and simultaneously interact with content and each other. In addition, the SUR40 provides businesses with unique value in delivering information and services in a more friendly way allowing better engagement with their customers. The Samsung SUR40 with Microsoft PixelSense is targeted for companies across a variety of industries including retail, hospitality, health care, and public sector.
The Samsung SUR40 with Microsoft PixelSense is a major advancement in computing that moves beyond the traditional user interface to a more natural way of interacting with information. The four key attributes that make this experience unique are:
Multiuser experience. The horizontal form factor makes it easy for several people to gather together, providing a collaborative, face-to-face computing experience.
Massive multitouch contact. The Samsung SUR40 with Microsoft PixelSense recognizes many points of contact simultaneously, not just from one finger, as with a typical touch screen, but up to dozens and dozens of items at once.
Direct interaction. Users can actually “grab” digital information with their hands and interact with content through touch and gesture, without the use of a mouse or keyboard.
Object recognition. Users can place physical objects on the display to trigger different types of digital responses, including the transfer of digital content.
At CES 2011, Microsoft unveiled the designed for touch experience featuring Microsoft PixelSense technology, which gives LCD panels the power to see without the use of cameras.
This experience comes to life in the Samsung SUR40 with Microsoft PixelSense, which incorporates significant technological advancements designed to enhance the user experience.
The Samsung SUR40 with Microsoft PixelSense features key hardware and software technology advancements informed by feedback from users around the world.
Microsoft PixelSense™. Microsoft PixelSense allows a display to recognize fingers, hands, and objects placed on the screen, enabling vision-based interaction without the use of cameras. The individual pixels in the display see what’s touching the screen, and that information is immediately processed and interpreted.
PixelSense technology delivers an innovative user experience built on the principles of direct interaction using touch and objects. The Microsoft Surface 2.0 SDK allows application developers to take advantage of the capabilities of PixelSense technology.
Thin form factor with multiple configuration options. The Samsung SUR40 is four inches thin, which makes it easy to use as a table, hang on the wall with the VESA mount, or embed in walls or custom enclosures. There are standard leg supports available, or customers can design and attach their own leg supports.
High definition large format display. The 40-inch, stunning high-definition screen (1920 x 1080 resolution) enables enhanced multiuser and multitouch experiences.
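The “pixels that see” description above implies that the sensor delivers a per-pixel image from which touch contacts must be extracted. A minimal sketch of one common extraction approach, thresholding the image and grouping adjacent bright pixels into contacts via flood fill; the frame data, threshold, and function names are illustrative assumptions, not the SUR40’s actual processing:

```python
# Hypothetical sketch: a PixelSense-style sensor yields a per-pixel intensity
# image; touch contacts can be recovered by thresholding and grouping adjacent
# bright pixels into connected components. All values are illustrative.

def find_contacts(image, threshold=128):
    """Return one (row, col) centroid per contiguous bright region."""
    rows, cols = len(image), len(image[0])
    seen = [[False] * cols for _ in range(rows)]
    contacts = []
    for r in range(rows):
        for c in range(cols):
            if image[r][c] >= threshold and not seen[r][c]:
                stack, blob = [(r, c)], []
                seen[r][c] = True
                while stack:  # flood fill one blob of bright pixels
                    y, x = stack.pop()
                    blob.append((y, x))
                    for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        ny, nx = y + dy, x + dx
                        if 0 <= ny < rows and 0 <= nx < cols \
                           and image[ny][nx] >= threshold and not seen[ny][nx]:
                            seen[ny][nx] = True
                            stack.append((ny, nx))
                cy = sum(p[0] for p in blob) / len(blob)
                cx = sum(p[1] for p in blob) / len(blob)
                contacts.append((cy, cx))
    return contacts

# Two separated bright blobs in a tiny frame -> two distinct contacts.
frame = [
    [0, 200, 0, 0, 0],
    [0, 200, 0, 0, 255],
    [0, 0, 0, 0, 255],
]
touches = find_contacts(frame)
```

Recognizing tagged physical objects would build on the same idea, matching the shape or code of a bright region rather than just its centroid.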
Microsoft PixelSense activities are available on the Microsoft PixelSense blog and Microsoft PixelSense on Twitter.
For more information, press only: PixelSense PR team
Also these two videos appeared on a new Microsoft® PixelSense™ YouTube channel [June 18, 2012]:
The Power of Microsoft® PixelSense™
Samsung SUR40 with Microsoft® PixelSense™
Now some first reactions from the event attendees:
Microsoft Surface: a closer look [TheVerge YouTube channel, June 18, 2012]
See also this article: Microsoft Surface with Windows RT hands-on pictures and video [Joshua Topolsky from The Verge, June 18, 2012]
Microsoft Surface tablet demo June 18, 2012 Event in SF [SlashGear YouTube channel, June 18, 2012]
See also these articles, same date, on SlashGear (the first ones are kind of liveblogging):
– Microsoft Surface Tablet Hands-on by Vincent Nguyen
– Microsoft Surface re-introduced as a handheld tablet by Chris Burns
– Microsoft Surface cover doubles as built-in keyboard by Cory Gunther
– Microsoft Surface for Windows 8 Pro revealed by Chris Burns:
This tablet introduced its own “Perimeter Venting” so as not to get too hot [in fact to solve the problem of cooling with a tablet which can be used in both portrait and landscape modes], works with Pen input (with digital ink, explained in a different post), and has a display that’s just 0.7mm from the glass that covers it. The Microsoft Surface for Windows 8 Pro has two digitizers, one for ink, one for touch, and has a bit of magnetization for its pen so no holes or clips are needed.
– Microsoft Surface to feature digital ink stylus support by Cory Gunther
At the event live they said it best by stating, “This surface has two digitizers. One for touch, one for digital ink.” All stylus or pen input is converted into digital ink and the new Surface tablet is extremely responsive and accurate.
The distance between the screen (digitizer) and the stylus is only 0.7 mm, which allows it to be highly accurate, making you feel like the ballpoint of a pen is actually writing on the “surface”. Surface will see the proximity of a stylus and stop recognizing hand inputs.
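The behavior in that last sentence, suppressing touch input while the stylus is within hover range of the pen digitizer, is classic palm rejection. A minimal sketch, with the hover range and function names as illustrative assumptions rather than Surface’s actual implementation:

```python
# Hypothetical sketch of palm rejection: while the pen digitizer reports a
# stylus in hover range, contacts from the touch digitizer are ignored.
# The 10 mm hover range is an illustrative assumption.

PEN_HOVER_MM = 10.0  # hover distance at which the pen digitizer sees the stylus

def active_contacts(touch_points, pen_distance_mm):
    """Return the touch points to act on; drop them all when the pen is near.

    pen_distance_mm is None when no stylus is detected at all.
    """
    if pen_distance_mm is not None and pen_distance_mm <= PEN_HOVER_MM:
        return []  # stylus in proximity: suppress hand/palm input
    return list(touch_points)
```

So a palm resting on the glass while writing produces touch contacts, but because the pen hovers within range, those contacts never reach the application; lift the pen away and normal multitouch passes through again.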
– Microsoft Surface Windows RT confirmed with NVIDIA’s Tegra processor by Cory Gunther
NVIDIA has just issued a rather short note confirming that their Tegra processor will be under the hood and powering the smooth and fluid Windows 8 RT model. They didn’t specify which Tegra processor as expected, but we are speculating it will be the quad-core Tegra 3 KAI platform, or the Tegra 3+ that was detailed as coming soon [… an upgraded Tegra 3 called T3+, with code-names Wayne and Grey splitting off in the third quarter of 2012 with LTE. Grey specifically will have access to LTE data speeds, with Tegra and Icera hardware being part of this sector for NVIDIA] in a lot more than just Android devices.
Microsoft Surface Touch Cover vs Type Cover hands-on by Chris Burns
These keyboards bring on a fair stab at what 3rd party manufacturers have been attempting for the iPad and a host of Android tablets now for several years. The keyboards on both units aren’t going to bring you a perfect replacement for a notebook computer if you’re attempting to match the laptop-bit of the equation, but if you’re the sort of person to work on a desk, you might be in business.
Microsoft Surface could debut MagSafe-data hybrid hook-up by Chris Davies
The four-pin port is on the right lower edge of the new tablets, and seemingly matches up with the MagSafe-like connector detailed in a patent application from the company. If so, that could mean a single hook-up for recharging the Surface and synchronizing it with other devices.
Microsoft’s patent application followed in the footsteps of Apple’s magnetic charger system – which allows the cord to break away easily if someone trips over it, rather than yanking your laptop off the desk – but added in a data connection. With just one port, the Surface could be hooked up to both a charger and other external hardware, with an optical data link used for maximum speed potential.
…
The potential for such a connection is vast. Microsoft has been coy about external device support for Surface, only mentioning the USB and video-output ports, but with this proprietary port it could be used with a docking station to add in an optical drive, wired network connection and more.
We’ve been waiting for just such a strategy from Apple for some time, and indeed the Cupertino company has an optical data MagSafe patent application of its own. More on Microsoft Surface in our hands-on here.
Microsoft Surface Tablet: Hands-on [laptopmag YouTube channel, June 18, 2012]
See also these articles:
– Microsoft Surface Tablet Hands-on: The Future of Windows is Here [Video] [Michael A. Prospero from LAPTOP Magazine, June 18, 2012]
– iPad vs Microsoft Surface: Tablet Specs Compared [Kenneth Butler from LAPTOP Magazine, June 18, 2012] (data highlights are mine, to denote the essential differences)
Microsoft to Unveil a New Tablet – Good or Bad Idea? [The Wall Street Journal YouTube channel, June 18, 2012]
See also this article: Microsoft Unveils Surface Tablet to Rival iPad [Shira Ovide from The Wall Street Journal, June 18, 2012]
… Al Hilwa, an analyst at IDC, said the combination of PC and tablet features makes Surface a “true converged” device. “A Swiss Army knife of a tablet?” …
The computer makers’ business is dependent on Microsoft, so they may not express annoyance publicly at Microsoft’s treading on the hardware makers’ turf. But at least some hardware executives are fuming privately at Microsoft’s decision.
Microsoft’s move to make its own tablet “comes with consequences, which is complicating choices for consumers and complicating relations with third-party manufacturers,” said Sarah Rotman Epps, an analyst with Forrester Research Inc.
Microsoft “Surface” Tablet Announced, Powered by Windows 8 [Eric Savitz for ForbesVideo YouTube channel, June 18, 2012]
See also these articles:
– Microsoft: Live From Hollywood! Introducing Microsoft Surface Tablet (Updated) [Eric Savitz from Forbes, June 18, 2012]: a live blog of the event
– Microsoft Announces Surface, Its New Windows 8 Tablet [Kelly Clay from Forbes, June 18, 2012]
…
As no one does keyboards better than Microsoft, yet another keyboard is also available for Surface that features a full trackpad with clicking buttons. Though Surface is slightly heavier than the iPad and has 25% less battery life (31.5 Watt hours compared to the iPad’s 42.5 Watt hours), Surface is truly one of the most powerful and lightweight mobile PCs we have seen.
It’s clear that Surface is designed for current Windows users, and according to NetMarketShare, Windows XP, Vista, and 7 combine for 93% of all desktops. For these users – especially those in the corporate environment – there is a hesitation to switch to another platform, even just for mobile use. As a result, Surface could be a game-changer in the tablet industry. Not only does it feature key capabilities that Apple has yet to integrate (such as a keyboard), but Surface will undoubtedly make it easier for current Windows users to transition from home to office and in-between. While a price has yet to be set, it’s expected to be extremely competitive compared to other tablets, ensuring that Surface is a device that many current Windows users will want to own.
Other notable first reports:
1. WIRED magazine [June 18, 2012]:
– Liveblog: Meet ‘Surface,’ Microsoft’s New Windows 8 Tablet
– Microsoft May Be Late to Tablet Fight, But Has the Cash to Keep Sparring
– Microsoft Dives Head-First Into Mobile Hardware With Two 10.6-Inch Tablets
…
Surface is much, much more than a new tablet platform. It’s also Microsoft’s first fully branded computing device — an ambitious new development direction after years of making only simple computer peripherals. And Surface is also a challenge to every hardware partner in Microsoft’s OEM stable.
“It’s a bold move on the part of Microsoft,” says Gartner analyst Michael Gartenberg. “This is a real change in strategy for them, and it’s certainly a vote of no confidence for their partners. This shows how high the stakes are. There is competitive pressure from Apple that is clearly a threat to their business. Steve Ballmer seemed to be channeling Steve Jobs on stage, saying hardware and software have to be designed together.”
…
As for pricing, Microsoft isn’t saying, but Gartenberg weighs in:
“I’m guessing somewhere between $600 and $1000 — Microsoft was very vague. This is the problem you encounter when you launch something so far ahead of delivery,” he said. “For a launch like this, it’s all about the details. Everything about this event, the mysterious invitations, the presentation — Microsoft is trying to be Apple. But the only company that has successfully been like Apple, is Apple.”
2. engadget [June 18, 2012]:
– Live from Microsoft’s mystery press conference in Los Angeles! by Dana Wollman
– Hands-on with Microsoft Surface for Windows RT, Touch Cover and Type Cover (update: video!) by Dana Wollman:
… (Microsoft has only said that the ARM chip is made by NVIDIA. No one ever said it’s a Tegra 3 SoC, but that is naturally our best bet.) …
Based on remarks by Steve Ballmer and others during the presentation, it sounds like a lot of thought went into the two keyboards, so we wouldn’t be surprised if a large focus group of touch typists were able to prove Redmond’s engineers right. But having played with both, we don’t imagine this being like settling in with a new laptop or Transformer-style dock. You might have to re-learn how to type (or at least teach your brain to fuhgeddaboutit and trust your fingers to land where they’re supposed to.) …
Even after some brief handling, we feel impressed, almost sobered by what Microsoft’s managed to produce after vowing to take the Windows 8 hardware-software package into its own hands. Surface for Windows RT is well-made, polished, durable and carefully engineered. And yes, that’s sobering news: Microsoft’s own OEM partners, everyone from ASUS to Acer to HP, should feel a tinge of defensiveness. If Redmond’s mission until now has been to showcase all the possible form factors for Windows 8, it may have just taken a step in the opposite direction by upstaging everybody else.
– Microsoft reveals its own Windows 8 tablet: meet the new Surface for Windows RT by Jon Fingas:
Not unlike Apple’s last two generations, there’s a magnetically attached cover, but it’s more than just a protector: here, it includes a full multi-touch keyboard and trackpad.
– Microsoft one ups other tablet ‘smart’ covers with Surface’s Touch Cover and Type Cover by Terrence O’Brien:
… right now we’re pretty enamored with Microsoft’s Touch Cover for the newly announced Surface. See, it works almost exactly like that other “smart” tablet shield, but this one actually earns its smart moniker. When you peel the plastic shroud back it turns into a fully functional keyboard and touchpad. Obviously, being a thin plastic sheet, the cover is relying on touch for key presses, not the actual depression of mechanical switches. …
Perhaps one of the more interesting features, though, is their ability to force Win 8 to color coordinate with your chosen shade of folio. Click the blue Touch Cover on to the Surface and the background switches to a soothing shade of azure. There’s even an accelerometer inside those 3mm-thin softer covers — which is an impressive feat of engineering. The Touch Covers can easily distinguish between you simply resting your hands on the keyboard and actually typing, which should help minimize accidental key presses.
– Microsoft announces Surface for Windows 8 Pro: Intel inside, optional pen input by Donald Melanson
– Microsoft Surface tablets: the differences between Windows RT and Windows 8 Pro models by Darren Murph
3. CNET [June 18, 2012]:
– Microsoft breaks tradition with Microsoft Surface tablets
– Surface touches the right keys, but not a complete picture
– Who is the Microsoft Surface for, exactly?
– Five key takeaways from Microsoft’s Surface event:
… 1. Don’t confuse this with the table thing [i.e. the old Surface now called Microsoft PixelSense]. … 2. This isn’t just aimed at the iPad and Android tablets [as it can work like a PC, complete with a full version of Windows]. … 3. This thing is high-tech. … 5. This is just the start [as Microsoft is positioning Surface as the beginning of a family]. …
– Why Microsoft built its own tablet — think Apple and Xbox
…
The tablet and ultraportable form factors are especially fertile ground in terms of growth and innovation. A recent Online Publishers Association study found that 31 percent of the U.S. Internet population (74.1 million users) own tablets, up from 12 percent in 2011. By 2013, the study projected that 47 percent of the U.S. Internet population (117.4 million users) would own tablets.
At this juncture, Google’s Android platform (including Amazon’s Kindle) and Apple’s iOS are splitting the market. Apple’s continuation of its firm grip on hardware and software integration is working exceedingly well, as evidenced by the company’s incredible financial success.
…
Google gives its Android platform to partners for free, which leads to some fragmentation and a fraction of the profits Apple is generating. Like Microsoft, Google plans to introduce its own branded tablet this month. Microsoft expects that it can generate some buzz and give Windows users a legitimate alternative to Apple’s iOS and Google’s Android, as well as incent its developer community to build native apps for its platform.
Note: In the above argument CNET relied on the “A Portrait of Today’s Tablet User – Wave II” U.S. findings from the Online Publishers Association (OPA), released the same day, particularly the one represented on the following slide:
for which the accompanying OPA press release stated the following:
… Tablet adoption has significantly increased in the past year; 2012 saw 31% of the U.S. internet population owning tablets (74.1M users), up from 12% (28.3M users) in 2011. Furthermore, by the year 2013 this figure is expected to increase with a projected 47% of the U.S. internet population (117.4M users) owning tablets.
Of these tablet users, the Android platform has drawn level with iOS, in large part because of the strong sales of the Kindle. 52% of tablet owners have an iOS operating system, while 51% use an Android powered tablet (percentages do not add up to 100% because tablet owners own/use more than one type of tablet). This is a drastic change from 2011, which saw 72% of tablet owners use some form of the iPad while only 32% used an Android tablet. …
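The OPA note that the platform shares exceed 100% because of multi-tablet ownership can be made concrete with a little inclusion–exclusion arithmetic. A minimal sketch (the 52%/51% figures are taken from the OPA release quoted above):

```python
# Inclusion-exclusion sketch for overlapping tablet platform ownership.
# Shares are from the OPA "Wave II" release quoted above.
ios_share = 0.52      # tablet owners with an iOS tablet
android_share = 0.51  # tablet owners with an Android-powered tablet

# Owners of at least one of the two platforms cannot exceed 100%,
# so the minimum overlap (owning both) is the excess over 1.0:
min_overlap = max(0.0, ios_share + android_share - 1.0)
print(f"At least {min_overlap:.0%} of tablet owners own both platforms")
# → At least 3% of tablet owners own both platforms
```

The true overlap is likely larger; the calculation only gives the floor that the published shares logically imply.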
4. AllThingsD [June 18, 2012]:
– Microsoft’s Surface Tablet Takes On Apple’s iPad liveblog by Ina Fried
– Microsoft Launches New Surface Tablets With Windows 8 by Bonnie Cha
– Microsoft CEO Steve Ballmer on Where Microsoft’s New Surface Tablet Fits in PC Ecosystem by Ina Fried
In a brief chat after the event, Microsoft CEO Steve Ballmer said that PC makers have known for an unspecified period of time that Microsoft would be doing its own hardware.
Ballmer noted that there will be a lot of PCs sold that will be made by companies other than Microsoft.
“If you look at the bulk of the 375 million machines that get sold (next year), they probably aren’t going to be Surfaces,” Ballmer said. “On the other hand, we could have a sizeable business.”
“It’s an important companion to the whole Windows 8 story,” Ballmer said. “It’s an important piece. It’s not the only piece.”
While Microsoft kept the details of Surface tightly limited to a small group of Microsoft employees working on the project, Ballmer said PC makers weren’t totally taken by surprise.
“Our PC partners knew in advance we were announcing something today in this space,” Ballmer said.
So how did they feel about it? “No comment.”
Ballmer said Microsoft’s goal is that Surface “gives people a full range of things to think about, sort of primes the pump for more innovation around Windows 8, (and) brings new technology to the Windows PC platform.”
Just how closely to the vest has Microsoft been keeping Surface? Ballmer said he has not personally been using a prototype on a regular basis.
“We wanted to keep things under wraps,” Ballmer said. “I’m out in public a lot.”
5. Boy Genius Report (BGR) [June 18, 2012]:
– Live from Microsoft’s tablet event! by Brad Reed
– Microsoft unveils Surface tablet by Zach Epstein
– Microsoft Surface tablet hands-on by Brad Reed
I have to admit that the Touch Cover felt somewhat alien to me at first when I was playing around with it, but that could be due to the fact that I didn’t have a lot of time to play around with it — Microsoft was really herding reporters quickly through the line. The Type Cover did feel quite natural as a keyboard should, however, so at the very least, there should be one strong option for people who prefer traditional keyboards.
The tablet’s 10.6-inch display screen looked gorgeous, although Microsoft was being weirdly evasive when asked what the exact screen resolution was. The tablet’s “VaporMg” casing is extremely solid, and the tablet feels very strong in your hands. Despite being 9.3 millimeters thick, the Windows RT version of the Surface is in no danger of bending under pressure.
In terms of software, Windows RT brings some cool new capabilities to the tablet form factor, including the ability to run two apps on the same screen simultaneously. One Microsoft rep, for instance, demonstrated how to have Outlook email on one half of the screen while having sports scores on the other half. And of course, the home screen on both versions of the Surface tablet features Windows 8’s Metro UI that is significantly more intuitive, colorful and user-friendly than past editions of Windows.
Thin/Zero Client and Virtual Desktop Futures
May 30, 2012 5:36 pm / 1 Comment on Thin/Zero Client and Virtual Desktop Futures
26 years of Wyse and Citrix collaboration have resulted in an advanced infrastructure solution that brings the Windows desktop into a virtualised cloud environment, accessible from any cloud computing client device, including thin client and zero client devices, or devices offering only HTML5 browser functionality. The infrastructure is gaining a universal device management capability as well. The most important hallmark of this infrastructure solution is complete security, meaning immunity from viruses and other malware. In addition to Windows desktop applications, the next wave of web applications as well as SaaS applications (such as those provided by Salesforce.com) are made easily accessible and usable from any of those devices and access points. The hallmark here is the ability to continue usage at the point where it was left off on another device or access point. True flexibility from the user’s point of view.
For more introductory information please watch these two videos:
Jeff McNaught Interview One [CitrixTV YouTube channel, May 24, 2012]
Zenith2 – The Product that Changes Everything [CitrixTV YouTube channel, May 24, 2012]
The detailed elaboration of the “Thin/Zero Client and Virtual Desktop Futures” topic will go through the following sections of the post:
- Wyse entry-level solution for education
- A glimpse into the Wyse portfolio and their large public / enterprise markets
- Essential technology and market information
A highly important preview from it:
XenDesktop and Metro Receiver [CitrixTV YouTube channel, May 9, 2012]
- Note: after that video it is absolutely important to watch the SYN229: What’s new with Citrix Receiver for desktop users video next to it, in order to understand the Virtual Desktop Future assured by the upcoming Citrix Receiver universal client, as represented best by the following image:

delivered in the [18:53 – 23:05] timeframe of the video.
Finally to understand the whole picture from/through a very practical demonstration of the whole range of possibilities watch these videos:
– The Future is Now (17 minutes – part 1 of 2)
– The Future is Now (28 minutes – part 2 of 2)
– Citrix Receiver on the Wyse Xenith, connecting to a XenDesktop virtual desktop
- Wyse Product/Technology Details
- Dell Wyse (i.e. the Dell acquisition of Wyse)
– for introduction to that see: Dell Completes Acquisition of Cloud Client Computing Leader Wyse Technology [Dell press release, May 25, 2012]
Before going into those detailed sections here is a highly important introduction as well (in order to understand the future potential of this advanced infrastructure solution):
Wyse Technology’s President and CEO Tarkan Maner speaks with Edie Lush at Hub Davos [hubculture YouTube channel, Jan 26, 2012]
Notes:
– [00:40] Presumably the entry-level zero client (which has no OS – see much below), the Wyse E01, is shown as “working on only 2 watts” (the spec much below says up to 3 watts) and “costing less than $50, start at $35” (the current single-unit retail price of the E01 is $76, however, while the list price is $99 – see much below).
– The device is even presented as needing only the data center. Currently, however, entry-level zero client devices such as the E01 (and the latest E02) require Microsoft MultiPoint Server (see much below). So he is definitely pointing to an upcoming solution.
– [03:00] He mentions South Africa with “10 million devices this year” as an educational example. So that kind of upcoming solution could definitely be in the works already. The power consumption difference might also indicate such a new entry-level device.
Management team [Wyse webpage, April 2, 2012]:
President, CEO and Chief Customer Advocate, Tarkan Maner
Tarkan Maner is the President and CEO at Wyse Technology, the global leader in Cloud Client Computing. Cloud Client Computing is the ultimate end user computing solution for our time, replacing the outdated, unsecure, unreliable, un-green and expensive client/server-centric systems. Cloud Client Computing delivers the security, manageability, availability, reliability, scalability, flexibility, and user experience with the lowest energy usage and total cost of ownership. Cloud Client Computing simply connects all the dots: Cloud client software, hardware and services.
Wyse provides its customers and partners with the broadest and deepest portfolio of Cloud Clients, including Thin, Zero and Cloud PC clients, supported by the leading cloud-centric firmware, virtualization, management and mobility software in the industry. Wyse independently partners with the leading data center, networking and collaboration solution providers within its global partner ecosystem to help organizations and people reach the clouds – in a private, public, government or, even in a personal cloud. Wyse’s mission is to enable any user, anywhere, to connect to any content via any app in any work environment without constraints, conflicts or compromises.
Tarkan believes that Cloud Client Computing not only drives better economic and productivity results for organizations, but, also drives societal change throughout the world. Cloud Client Computing reduces the cost, eliminates the complexity and enables the reach of computing to the next six billion users via billions of devices pervasive in every aspect of our lives.
…
Tarkan in the news
- Forbes OpEd – Cloud Computing for Public Sector
- Top Five Cloud Myths, Trends, and Recommendations
- How to Succeed at Innovation and Differentiation
- Opinion: Seeking ‘game changers’ that will create jobs
- Tarkan at WEF 2011 in Davos – Future of Manufacturing
- Tarkan at WEF 2012 in Davos – Cloud Client Computing for a New World
- Voices from the New Generation: The Explorer
…
Wyse entry-level solution for education
Post-PC Era Expands as Wyse and Serbian Government Partner for Nation-wide Cloud Client Computing Deployment in Education [Wyse press release, Sept 28, 2011]
More than 30,000 Students Gain Access to Latest Learning Technology with Wyse and Microsoft Solutions in Schools across Serbia
LONDON, UK and SAN JOSE, Calif. – 09/28/2011 – Wyse Technology, the global leader in cloud client computing, today announced a major implementation of its zero client technology in the Digital School project to transform classroom teaching in Serbia. In one of the largest projects of its kind in Europe, all elementary schools in Serbia will be outfitted with a new IT infrastructure based on Wyse zero clients and Microsoft Windows MultiPoint Server 2010, enabling every student to have access to the latest computing software, educational applications and online resources.
Committed to modernizing the country’s educational system, among other reforms, the Ministry of Telecommunications and Information Society identified the need for better information technology and communications infrastructure to support teaching and learning in classrooms.
Developed with Wyse’s technology partner ComTrade, the solution is based on Windows MultiPoint Server 2010 and enables multiple users to simultaneously share a single computer while each uses their own monitor, keyboard and mouse. This is an ideal solution for educational customers that want to extend IT access to more students, easily and affordably. The solution is designed for simple implementation and ease of use for teachers, provides the familiar Windows 7 desktop experience, and requires no advanced IT expertise.
The ministry selected Wyse E01 zero clients because they maximize the advantages of Windows MultiPoint Server. The zero clients simply plug into the host computer, which automatically configures them and enables a student to start work immediately. Unlike comparable devices for Windows MultiPoint Server, the Wyse E01 zero client supports USB peripherals such as webcams and USB flash drives, allowing a more flexible computer-based teaching and learning experience.
Jasna Matic, State Secretary for the Digital Agenda and former Minister for Telecommunications and Information Society, said, “Enhancing ICT for education is a major goal of the Government with this programme delivering on our promise to give every student access to their own computer at school. With cutting edge technology from Microsoft and Wyse, our schools have a solid foundation for delivering education to the highest standards.”
…
Deployment of the Microsoft and Wyse education solution started in December 2010 and will be completed this year.
For more information about Wyse E01 zero clients, please visit, http://www.wyse.com/products/hardware/zeroclients/E01/index.asp
Windows MultiPoint Server 2011 Overview [msmultipoint YouTube channel, May 25, 2011]
Windows MultiPoint Server 2011 is a low-cost computing solution that creates a 1:1 user to computer experience built on Windows Server. With MultiPoint Server 2011, one PC can provide up to 20 computing sessions at a fraction of the cost.
Wyse® E class™ – Affordable computing for education [Wyse brochure, Jan 23, 2012]
…
1. One Windows Multipoint server shares its operating system and applications with up to 20 users at a time.
2. Features Wyse E class zero clients, one per desktop and each one linked by a USB [E01] or Ethernet [E02] cable.
3. Low-cost, fast and simple-to-set-up delivery of Windows desktops.
Windows MultiPoint Server 2011 Quick configuration guide
For 4 ~ 6 users / for 8 ~ 20 users:
- CPU: Intel CPU i5/i7 (both)
- Memory: 4 GB / 8 GB
- Hard drive: 250 GB / 500 GB
- Graphics/1 (on board): Intel HD Graphics 2000 or similar (both)
- Graphics/2 (PCI-Express card): ATI Radeon™ HD 4600 / 4770 / 5750 or nVidia GeForce 8x, 9x Series / GT220, GT240 (both)

Software: Microsoft Windows MultiPoint Server 2011
Zero client: Wyse E01 [retail: $76+] and E02 [$99]
Zero client licenses (Microsoft Academic VL): Microsoft MultiPoint Server License [$115]; Microsoft MultiPoint CAL License per device [$29]
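The prices quoted in the brochure excerpt above make it easy to sketch the per-seat economics of such a shared classroom. A rough calculation, using the $76 E01, $115 MultiPoint Server license and $29 per-device CAL figures from the brochure; the $700 host PC price is a made-up assumption for illustration only:

```python
# Rough per-seat cost sketch for one Windows MultiPoint Server 2011 classroom,
# using the list/retail prices quoted in the Wyse brochure excerpt above.
E01_PRICE = 76          # Wyse E01 zero client, retail (USD)
SERVER_LICENSE = 115    # Microsoft MultiPoint Server License (Academic VL)
CAL_PER_DEVICE = 29     # Microsoft MultiPoint CAL License per device
HOST_PC_PRICE = 700     # assumed price of the shared i5/i7 host PC (hypothetical)

def classroom_cost(seats: int) -> int:
    """Total hardware + license cost for one MultiPoint classroom."""
    return (HOST_PC_PRICE + SERVER_LICENSE
            + seats * (E01_PRICE + CAL_PER_DEVICE))

for seats in (6, 20):
    total = classroom_cost(seats)
    print(f"{seats} seats: ${total} total, ${total / seats:.0f} per seat")
```

The fixed host-PC and server-license costs amortize across seats, which is why the 20-seat configuration comes out markedly cheaper per seat than the 6-seat one.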
…
Technical specifications Wyse E01
[E02 differences: Ethernet networking; 2 USB 2.0 ports instead of the E01’s 4; 98 x 98 x 20 mm dimensions; 128 g; standing position]
Server OS: Windows MultiPoint Server 2011
I/O peripheral support: One VGA (DB-15); four USB 2.0 ports (1 on left side, 3 on right side); one Mic In / one Line Out; USB keyboard (not included); USB mouse (not included)
Networking: One USB in to connect to the host computer (cable included); maximum distance between each Wyse E01 zero client and the host computer is 5 meters (16 feet 5 inches)
Display: Up to 1680 x 1050 @ 60 Hz / 32 bits or 1600 x 1200 @ 60 Hz / 32 bits
Audio: Output: 1/8-inch mini jack, full 16-bit stereo; Input: 1/8-inch mini jack, 8-bit microphone
Physical characteristics: Height 21.5 mm (0.85 inches); Width 132 mm (5.20 inches); Depth 87 mm (3.43 inches)
Shipping weight: 145 g (0.32 lbs)
Power: Worldwide auto-sensing 100-240 VAC, 50/60 Hz power supply; average power usage with device connected to 1 keyboard, 1 mouse and 1 monitor: less than 3 watts
Temperature range: Vertical position: 50° to 104° F (10° to 40° C)
Humidity: 20% to 80% condensing; 10% to 95% non-condensing
…
Announcement information:
- Wyse Extends Client Virtualization Leadership in Education Market with the Introduction of a New Zero Client for Schools [Wyse press release, Feb 24, 2010]
$99 Wyse E01 Zero Client and Windows MultiPoint Server 2010 Optimize IT and Financial Resources for Schools in Tough Economy
…
“We’re happy to be launching with strong support from Wyse, which has committed to developing innovative and effective solutions like the Wyse E01 Zero Client for the MultiPoint platform,” said Ira Snyder, general manager, Windows MultiPoint Server at Microsoft Corp. “MultiPoint Server can deliver a familiar Windows computing experience to educational institutions around the world, helping them get the best value out of technology investments while providing the very best education for their students.”
…
- The New $99 Wyse Zero Client Provides Simple and Cost-Effective Computing Access for Education and SMBs Worldwide [Wyse press release, Jan 11, 2012]
Wyse Expands E Class Zero Client Offering for Windows MultiPoint Server
Wyse Technology … today announced the introduction of the Wyse E02 zero client in support of Microsoft’s Shape the Future program
…
The Wyse E01 zero client and the Wyse E02 zero client work with Windows MultiPoint Server 2011 to enable multiple students or SMB users to share a single server. The E02 is easy for teachers to set up and use in the classroom, providing an excellent Windows 7 desktop experience for their students. While the Wyse E01 zero client provides students access to the shared server via USB cabling up to 5 meters, the E02 goes a step further to provide access via Ethernet, at a distance of up to 100 meters from the Windows MultiPoint Server.
“Providing students with affordable access to technology is one way Microsoft is helping to ultimately create greater opportunities and more enriched lives for youth around the world. The Wyse E02 zero client, combined with Windows MultiPoint Server, is an excellent example of how we are working to deliver on this mission,” said Microsoft’s Shape the Future Senior Director, Joice Fernandes.
…
Appropriate and sustainable technology solutions for education in Africa [in The eLearning Africa 2012 Report (p. 17), May 23, 2012]
Widening access to reliable information technology is key to how we can help our children develop educationally. This is especially true in the fast developing economies of Africa where the expectation for access to ICT in the school has increased as more citizens use information technology like mobile phones in their everyday lives.
However, in our view, the ambitious eLearning goals in Africa can only be achieved with classroom technology that is intrinsically sustainable. But, in the African context, what do I mean by sustainability? First of all this is not about ticking the box of some green IT policy set by a government. The reality of extending digital classrooms into urban or rural Africa is that IT provision must take account of the absence of reliable power supplies. Any interruptions can be managed with novel solutions around battery back-ups or solar energy to power a classroom in a remote setting.
Even when reliable power supplies are available, low power consumption is going to remain important in how schools manage their budgets. This makes thin or zero client computers very attractive as they typically only use between 3 and 15 watts of power.
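The budget impact of the 3 to 15 watt figure cited above can be sketched with simple arithmetic. In this illustration the 3 W value comes from the Wyse E01 spec; the 100 W desktop figure and the 6 h × 190 school days usage pattern are assumptions chosen only to show the scale of the difference:

```python
# Energy sketch: annual consumption of a zero client vs. a desktop PC.
# 3 W is the Wyse E01 spec figure cited above; the 100 W desktop figure
# and the 6 h/day x 190 school days usage pattern are assumptions.
HOURS_PER_YEAR = 6 * 190          # assumed school usage: 6 h/day, 190 days

def annual_kwh(watts: float) -> float:
    """Energy drawn over a school year, in kilowatt-hours."""
    return watts * HOURS_PER_YEAR / 1000.0

zero_client = annual_kwh(3)       # ~3.4 kWh per year
desktop_pc = annual_kwh(100)      # ~114 kWh per year
print(f"zero client: {zero_client:.1f} kWh, desktop: {desktop_pc:.1f} kWh")
print(f"savings per seat: {desktop_pc - zero_client:.1f} kWh/year")
```

Multiplied across a 20-seat classroom lab, a saving of this order per seat is exactly why low power consumption matters to school budgets even where the grid is reliable.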
Sustainability in African eLearning is much more than about energy efficiency. It also refers to how IT in schools needs to be easy to set up and manage because it is unrealistic to expect a school to always have access to IT management skills on the ground. As African educators plan their expansion of eLearning, they need to ensure the classroom technology is largely self-sufficient and simple to set up, manage and use in the classroom. The centralised management and robust plug-and-play functionality of classroom labs that use virtualisation technology answers this requirement, ensuring that investments in school classroom labs deliver the maximum educational benefit over a long period of time.
In investing in digital classrooms African educators are demonstrating incredible foresight in what new generations of Africans need to improve their lives. They need to guard against making ICT decisions that trap them in the past. While budgets are always going to be tight, African educators must be ambitious about ICT in education and take advantage of the latest 21st century thinking on virtualised and cloud computing.
Another important dimension of sustainability is the degree to which the ICT is future-proofed in how it can keep pace with future developments in applications and data. Educators are already using solutions like this to transform ICT in their schools and colleges. In South Africa more than 1.5 million students already have ICT access thanks to classroom labs that utilise Wyse cloud computing technology.
Sustainability in African eLearning is vitally important in making ICT widely accessible to students across the Continent. Indeed, African countries look set to trail-blaze other economies in their innovative use of cloud client computing on a massive scale.
David Angwin is Vice President, Field Marketing for Wyse Technology,
and based in the United Kingdom
Wyse Cloud Client Computing Highlights Sustainable E-Learning for Students at eLearning Africa 2012 [Wyse press release, May 23, 2012]
Showcases Latest Digital Classroom Solutions to Widen Availability of School Labs and One-to-One Computing for High Quality IT Enhanced Teaching and Learning in African Schools and Colleges
SAN JOSE, CA and COTONOU, Benin – 05/23/2012 – Wyse, the global leader in cloud client computing, today announced its participation in the eLearning Africa conference and exhibition. As the event’s platinum sponsor for the second year running, Wyse will discuss how advanced cloud client computing can help African educators meet their goals for widening access to technology-enhanced education, development and training. eLearning Africa runs from 23rd – 25th May 2012 in Cotonou, Benin, under the patronage of the Government of Benin.
Working across the continent with its local technology partners, Wyse has developed and deployed a range of solutions that are ideally suited to widening access to IT-enhanced education and training in Africa. The technologies involved are tailored to the continent’s requirements for classroom ICT that is exceptionally reliable, affordable and energy efficient while not compromising on access to the latest applications and data for teaching and learning.
Delegates to eLearning Africa will have the opportunity to see the latest in digital classroom solutions co-developed by Wyse and Microsoft. This includes an entry level shared computing solution for school IT labs that combines Wyse E01 and Wyse E02 zero clients with Microsoft Windows MultiPoint Server 2011; and the Wyse WSM cloud software solution, which offers a centrally-managed, scalable one-to-one computing environment for students that scales across classrooms, labs and schools. Both solutions address the requirement for classroom IT that is secure and easy to set up and run, while delivering a great desktop experience for the students.
Mark Jordan, vice president and general manager, EMEA Sales, Wyse Technology will be delivering a keynote in the opening plenary session on 23rd May 2012. He will address how cloud solutions can play a pivotal role in helping IT enhanced education transform the prospects of African students. He will be speaking alongside S.E. Max Ahouêkê, Ministère de la Communication et des Technologies de l’Information et de la Communication (MCTIC), Benin; and Prof Sugata Mitra, Professor of Educational Technology, Newcastle University, UK and Visiting Professor, MIT Media Lab, Cambridge, USA.
The event will be an ideal opportunity to get updated on how African customers are advancing their e-learning strategies with Wyse cloud client computing solutions. For example, in South Africa more than 1.5 million students already have ICT access thanks to classroom labs that utilize Wyse cloud computing technology. In Nigeria, a new network of examination centers relies on a Wyse cloud client computing infrastructure to enable examinations to be delivered, taken and scored entirely electronically, saving time and money while also improving reliability and service with accurate results delivered in hours rather than months.
Education is Wyse’s second largest market, with ten of the world’s top fifteen universities using Wyse solutions to reduce costs and improve learning. They and other educational institutions benefit from Wyse’s position as the only cloud vendor to offer desktop virtualization solutions for every budget and scale of implementation, ranging from ten to upwards of ten thousand units.
A glimpse into the Wyse portfolio
and their large public / enterprise markets
Health care with Citrix and Wyse Xenith next-generation zero-client devices at Seattle Children’s Hospital [WyseTechnology YouTube Channel, May 23, 2011]
Microsoft HIMSS 2011 – Interview with Andre Beuchat of Wyse Technology [WyseTechnology YouTube Channel, May 10, 2011]
Japan’s Largest Bank Turns to Wyse for VDI and Mobility [Wyse blog, April 10, 2012]
Today, Wyse announced that Bank of Tokyo-Mitsubishi is deploying 50,000 Wyse devices. The combination of Wyse’s desktop and mobile hardware, virtualization software and overall Wyse domain expertise in cloud and virtualization is the reason why the Bank of Tokyo-Mitsubishi selected Wyse for its VDI implementation. Bank of Tokyo executive Mizuhiko Tokunaga commented that “… the deciding points were the technological edge of their unique software, Wyse ThinOS, their specialization in VDI, and the sense of trust we felt toward Wyse as a company. Wyse has been a global market leader for a long time, and it shows.”
The Bank of Tokyo-Mitsubishi, the largest bank in Japan and eighth largest in the world, began in 2008 what was considered the largest systems integration project in the world, an ambitious effort to strengthen security across all 773 branches in Japan and 73 abroad. For more information on this initiative and how Bank of Tokyo is using Wyse, visit: http://www.wyse.com/about/press/release/1917
Cloud Computing involves using information technology as a service over the network.
- Services with an API accessible over the Internet
- Using compute and storage resources as a service
- Built on the notion of efficiency above all
- Using your own datacenter servers, or renting someone else’s in granular increments, or a combination
We at Wyse believe cloud computing has the potential to change how we invent, develop, deploy, scale, update, maintain, and pay for applications and the infrastructure on which they run.
Essential technology and market information
XenDesktop and Metro Receiver [CitrixTV YouTube channel, May 9, 2012]
SYN229: What’s new with Citrix Receiver for desktop users [CitrixTV YouTube channel, May 10, 2012] — absolutely important to watch in order to understand how the virtual desktop future would be assured by the upcoming Citrix Receiver universal client experience across different end-user access points (PC, Mac, tablets, smartphones, thin clients and web browsers) for Windows, web and SaaS applications (at least go forward to the [18:53 – 23:05] timeframe in the video) !!!
Wyse, Marvell, and the Citrix System-on-Chip Initiative [Wyse blog, May 10, 2012]
Yesterday Marvell announced participation in the Citrix System-on-Chip (SoC) initiative with the Marvell® ARMADA® 510 SoC for seamless integration with Citrix HDX in a complete silicon solution. The ARMADA 510 combines a high-performance, low-power CPU with a hardware graphics processing unit and video decoding acceleration hardware. The end result is excellent processing power for high-end apps like HD multimedia in a very efficient, cost-effective footprint.
Wyse already uses the Marvell ARM SoC in our industry-leading T class thin clients. Combining Marvell’s high performance SoC with software optimized for Citrix HDX enables Wyse to offer compact, efficient, and powerful thin clients like the Linux-based T50 thin client and the super-secure T10 thin client based on Wyse ThinOS. In addition, our newly announced Xenith 2 zero client for Citrix XenDesktop and HDX is also based on the ARM SoC, and sets a new price/performance standard for Citrix zero clients in its class.
Zenith2 – The Product that Changes Everything [CitrixTV YouTube channel, May 24, 2012]
Wyse Zero [Engine] and Wyse ThinOS [Wyse webpage, Feb 24, 2012]
- Built for VDI: Optimized for Citrix XenApp, Citrix XenDesktop, Microsoft Terminal Server and VMware View virtual desktop environments
- Lightning fast: Super-fast start-up provides access to virtual desktops in under 20 seconds
- Super secure: No attack surface provides immunity to viruses and malware
- Easy to manage: Hands-off, scalable device management with Wyse Device Manager; easy FTP-based configuration and automatic updates
- Smart card support: Seamless smart-card roaming ideal for workstation-based environments
- Rich user experience: Integrated Wyse TCX Suite for enhanced audio, video and multimedia
Overview
Wyse ThinOS
Wyse ThinOS is the most optimized, management-free solution for Citrix XenApp, Citrix XenDesktop, Microsoft Terminal Server and VMware View virtual desktop environments. With an unpublished API and no attack surface, Wyse ThinOS is immune to malware and viruses that make other operating systems vulnerable to attack. This super-fast, purpose-built thin computing OS boots up in seconds, updates itself automatically and delivers simple, scalable administration to eliminate time-consuming maintenance tasks related to configuration, management and updates. With full support for Wyse Virtual Desktop Accelerator (VDA), ThinOS neutralizes the effects of network latency and packet loss, even in remote-branch and field-based applications.
Related link
- What’s new in Wyse ThinOS with David Angwin, Wyse Technology Watch video »
Wyse Zero [Engine]
Already used in millions of thin clients, zero clients, and handheld smart devices, Wyse Zero [Engine] simplifies the development of cloud-connected smart devices, enabling seamless user access to cloud computing services and virtual desktops. Wyse Zero [Engine] addresses limitations with current embedded options, such as the typical security vulnerabilities of Windows and Linux-based operating systems, and slow initialization due to their large size. With a rich array of networking, management and protocol technology packaged in an engine less than 4MB in size, Wyse Zero reduces costs and simplifies management and updates. With no underlying OS to slow it down, it starts up instantly for a more satisfying user experience. And unlike Windows or Linux-based embedded products that require extensive protection, Wyse Zero [Engine] is original technology and therefore virtually immune to malware, viruses and hackers.
Wyse Stratus Overview [WyseTechnology YouTube channel, Feb 24, 2012]
Wyse Announces Private Beta of Cloud-Based Service to Secure and Simplify Corporate Access for Users Across All Devices [Wyse press release, May 8, 2012]
Project Stratus Directly Tackles Consumerization of IT Challenges with Intelligent, Integrated and Cross-Platform User and Device Management
05/08/2012 – Wyse Technology, the global leader in cloud client computing, today announced the Project Stratus private beta program. Project Stratus provides IT administrators with an intelligent and dynamic cloud-based console to securely manage and enable corporate access to any device, regardless of whether the device is owned by the company or by the individual. Initial support will focus on securing and provisioning corporate access to smartphones, tablets, thin clients, and zero clients, with plans to quickly expand support to additional devices used in the workplace.
Project Stratus delivers a unified console that goes beyond standard device management solutions by providing a complete view of the IT infrastructure serving end-users. The console provides visibility not only into users and their devices, but also into their relationship with the IT ecosystem. The result for IT is valuable insight into usage models, trends, and the means to identify areas of investment to more securely and effectively provide corporate services to end users.
“The biggest challenge to IT in a BYOD world has to do with securing corporate access across all the devices being used by employees. With Project Stratus, our goal is to eliminate the need for a separate, siloed console for each device type and instead allow IT admins to set an access policy for a user that applies regardless of what device they are using—providing for the first time a one-stop shop for device and access management,” said Hector Angulo, Product Manager at Wyse.
“For a company such as ours that relies on a distributed and mobile workforce, the means to simplify and secure our mobile devices is very appealing,” according to Adam Bari, Managing Director at IPM. “We are very much looking forward to deploying Project Stratus to better manage our mobile computing infrastructure.”
Wyse will be showcasing Project Stratus at Citrix Synergy™ 2012 in San Francisco, May 9th – 11th, in Wyse Booth #206 at the Moscone Center. Companies interested in taking part in the private beta can sign up at http://www.wyse.com/stratus
Key features of Project Stratus include:
• Simplicity. Streamlined, discoverable interface with user-centric policy management to help automate user access regardless of what device they are using, including easy exception handling – natural and intuitive management for today’s dynamic IT world
• TCO Reduction. Cloud-hosted service eliminates costly on-premise servers and enables instant deployment and scaling — drastically reduces the total cost of operations and ownership
• Real-time Analytics. Dynamic and instantly personalized data feeds always present admins with the most relevant insight to help expedite the task at hand – powerful analytic engine exposes most important activities, events, and trends
• Actionable. Proactive alerts notify admins about compliance violations and other potential issues, with the option to take contextual actions in place (e.g. warn user, block, ignore) or automate future mitigation (e.g. automatically approve roaming exception requests for all members of the ‘executive’ group)
• Time-Saving. User and device pages that provide instant visibility into any managed asset, including who is using the device, what it is interacting with, and any potential performance or security issues in order to expedite issue identification and resolution
• Unified Console. Visibility and management of all devices used in the enterprise, with support for smartphones, tablets, thin clients, and zero clients — one-stop shop for all devices, no more hassle of dealing with many consoles
• Security. Enterprise-ready, multi-tenant architecture with fully encrypted communication ensures only you have access to your data
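The user-centric policy model described above (one access policy per user that follows them across device types, with exception handling and contextual actions) can be sketched roughly as follows. This is purely an illustrative model, not Wyse's implementation; every class, field, and rule here is invented for the example.

```python
# Hypothetical sketch of a user-centric access policy: one policy per user,
# applied to whatever device they bring, with a roaming-exception rule and
# contextual actions ("allow", "warn", "block"). All names are illustrative.
from dataclasses import dataclass, field


@dataclass
class Device:
    kind: str                  # e.g. "smartphone", "tablet", "thin_client", "zero_client"
    encrypted: bool = True
    roaming: bool = False


@dataclass
class AccessPolicy:
    allowed_kinds: set
    require_encryption: bool = True
    roaming_exception_groups: set = field(default_factory=set)


def evaluate(policy, user_groups, device):
    """Return one contextual action for this user/device pair."""
    if device.kind not in policy.allowed_kinds:
        return "block"
    if policy.require_encryption and not device.encrypted:
        return "warn"        # non-compliant but recoverable: notify the user
    if device.roaming and not (user_groups & policy.roaming_exception_groups):
        return "block"       # roaming allowed only via group-based exception
    return "allow"


corp = AccessPolicy(
    allowed_kinds={"smartphone", "tablet", "thin_client", "zero_client"},
    roaming_exception_groups={"executive"},   # auto-approved roaming exception
)

print(evaluate(corp, {"executive"}, Device("tablet", roaming=True)))    # allow
print(evaluate(corp, {"sales"}, Device("tablet", roaming=True)))        # block
print(evaluate(corp, {"sales"}, Device("smartphone", encrypted=False))) # warn
```

The point of the sketch is the shape of the rule set: policy attaches to the user, the device is just an input, and exceptions are expressed as group membership rather than per-device configuration.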
HDX Ready Software-on-Chip with TI and NComputing [CitrixTV YouTube channel, Nov 8, 2011]
HDX Ready Thin Clients [Citrix microsite, May 9, 2012]
The HDX Ready designation is reserved for thin client devices that have been verified to work with all of the XenDesktop and XenApp HDX features. HDX refers to High Definition user eXperience – a term coined by Citrix to describe capabilities in XenDesktop that optimize the user experience when accessing hosted virtual desktops and applications. The HDX Ready category helps IT managers easily identify thin client devices that deliver the best possible high definition user experience with XenDesktop and XenApp.
There is a trade-off between a thin client’s cost and its capabilities. Not all users require all of the HDX features of XenDesktop or XenApp. Devices that are not deemed HDX Ready may still be useful for certain user types and use cases, generally at a lower price point than HDX Ready devices. The Citrix Ready thin client designation exists for those devices that support connectivity to XenDesktop or XenApp but only a subset of HDX functionality. Information regarding HDX feature coverage by a particular thin client device is available on the Citrix Ready website.
Citrix HDX SoC spurs innovation and cuts the cost of thin clients in half [The Citrix Blog, May 9, 2012]
Today Citrix celebrates with our partners the unveiling of exciting new client computing devices that leverage the HDX SoC initiative.
Thousands of Citrix customers are already using thin client devices to access virtual desktops and apps delivered by Citrix infrastructure. Customers who have successfully deployed thin clients are reducing or even eliminating their device management footprint, decreasing their dependency on lifecycle management, and cutting their power consumption by efficiently leveraging computing resources in the datacenter or server room.
There are also many customers who look at the cost of desktop virtualization and can easily justify supporting mobile workers and BYO programs. However, when it comes to replacing desktops in their offices, they may find it harder to justify purchasing a thin client when, after all the dust settles, the price of the endpoint might be close to the replacement cost of a PC.
Delivering cost reduction
Last October, at Synergy Barcelona 2011, Citrix announced the HDX System on Chip initiative in partnership with Texas Instruments and NComputing, to create new SoC reference designs based on ARM chipsets that accelerate HDX user experience technologies in silicon. By using optimized hardware-based acceleration rather than decoding and rendering virtual desktop traffic in software on a general-purpose processor, these SoCs can deliver the user experience of thin client devices costing twice as much or more while reducing power consumption, heat, and footprint. However, don’t mistake hardware acceleration for “all-hardware.” Devices built on the HDX SoC initiative still run a Citrix Receiver in an embedded OS that permits updates to provide devices new functionality over time, further extending the expected lifecycle.
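The dispatch idea behind hardware-accelerated decode can be sketched in a few lines. This is a deliberately simplified illustration of the general technique (route media decoding to fixed-function silicon when the SoC exposes a codec, fall back to software on the CPU otherwise); the function and the relative cost figures are invented, not Citrix's or TI's design.

```python
# Illustrative sketch only: choose a decode path based on which hardware
# codecs the SoC advertises. The relative CPU-cost numbers are made up to
# show why offloading matters for cheap, low-power endpoints.

def decode_path(codec_needed, hw_codecs):
    """Return (path, relative_cpu_cost) for decoding a media stream."""
    if codec_needed in hw_codecs:
        # Fixed-function silicon: near-zero CPU load, lower power and heat
        return ("hardware", 0.1)
    # General-purpose CPU shoulders the full decode cost in software
    return ("software", 1.0)


# An ARM SoC endpoint with an H.264 decode block vs. a plain CPU endpoint
print(decode_path("h264", hw_codecs={"h264", "wmv"}))  # ('hardware', 0.1)
print(decode_path("h264", hw_codecs=set()))            # ('software', 1.0)
```

The economic argument in the paragraph above follows directly: if the expensive part of rendering a virtual desktop is handled by a small dedicated block, the rest of the device can be built around a much cheaper, cooler processor.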
Taking cues from the living room
This direction of optimized delivery of a high-definition experience is no different from what many of us are seeing play out in our living rooms. Instead of storing massive collections of videos in cabinets or on home servers, cloud providers like Netflix, Amazon, Apple, Hulu, Pandora, and others store media for us, allowing us to stream content, in many cases in real time, to our homes. This media can be displayed on TVs with integrated “internet streaming,” on most any smartphone, tablet, or computer, or through the addition of a $50 appliance from companies like Roku that we plug into our TVs. It is this revolution in cloud entertainment services and the drive for low-cost, low-power, long-battery-life devices overtaking the consumer electronics industry that Citrix can now leverage to optimize endpoint devices for desktop virtualization.
To learn more about these exciting, market-changing, transformative new devices being unveiled by HP, Atrust, Centerm, NComputing, and ThinLinX, please check out the HDX SoC 2012 partner page here.
Dell Wyse: acquisition of Wyse Technology by Dell
(a summary of the many original materials compiled in the closing part of this post)
- Wyse – a leader in Desktop Virtualization
- Wyse – ranking number one worldwide in thin client unit share in the fourth quarter of 2011
- Differentiated IP and device management, thin client operating systems, and mobility software that is customized to offer the best user experience with Microsoft, Citrix and VMware virtual desktop infrastructures.
- Much of their software value is captured in the hardware itself. Their ThinOS and the IP around the ThinOS has allowed them to drive greater performance using less memory. So Wyse solutions require less memory and processing power than other comparable thin client solutions, making them more cost competitive and effective for customers.
- Wyse as an independent entity has really been gaining momentum, growing into the number one market share position. In fact, their growth accelerated to 45 percent in their last fiscal year
- Dell’s view on that:
– The momentum around alternative computing is a trend that they see many customers continuing to experiment with and in many cases, beginning to deploy, although the adoption rates are still relatively low for desktop virtualization.
– They don’t see the entire world going to thin clients. They still think there’s a healthy PC demand in the industry and there’s a balance of alternative computing that allows people to take advantage of securing their information, managing the assets in a very differentiated way. Even a common thin client deployment today is on a standard PC that’s been virtualized.
– This is an opportunity, particularly in verticals such as financial services, government, and healthcare, to really take a leadership position. These are really specific use cases. For example, in regulated industries like healthcare and financial services, the value of centralizing your data for better access and control is a specific use case that thin client desktop virtualization lends itself to.
– They needed it because it is also a different workload that moves their cloud computing strategy forward.
– Again, they don’t think a zero client or a thin client is the answer for all customers. In their mind, the bigger message here is that they now have a range of devices: an incredibly strong portfolio of thin client and zero client devices from Wyse, the standard Dell set of PCs, which do virtualization, and now the ability to manage those in a very differentiated way with the key software assets they’re bringing on board, which extend to tablets and mobile phones.
- Wyse portfolio includes a wide selection of industry leading thin and zero client devices designed easily to integrate into a virtualized or web based infrastructure
- It complements and extends the desktop virtualization capabilities that Dell has today.
- Also a big part of this transaction is the synergy that Dell would get from their datacenter solutions business, including servers, storage, networking services, and software. For every thin client hardware dollar that exists in the IT industry, there’s $5 of enterprise servers, storage, networking services that go along with that.
- This could also remove the barrier for some companies that did not have the right level of datacenter portfolio and datacenter ecosystem to exploit the thin client alternative of enterprise computing: i.e. deploying desktop virtualization centric cloud client portfolios and platforms.
- Wyse is a company that has 31 years of experience. They have the intellectual property, they have the software, and they have 150 R&D engineers, 140 of whom are in software. Wyse and one other competitor basically had almost 50 percent of the market. Wyse is a pretty close partner with Microsoft, and they do a lot of work with VMware and with Citrix as well. As these providers deliver desktop virtualization methodology and technology between the datacenter and end-user computing platforms, Wyse adds to that value and partners heavily with them, and obviously that’s going to continue.
- [Wyse:] And also, one other piece to add: some of the software we provide is differentiated in the marketplace and leading in this space from the cloud, on the infrastructure management side with a product called Wyse Stratus. Many of you on the phone are using Wyse PocketCloud today, the market-leading product for content management from the cloud on any mobile device and from your web browser, connecting your apps and content (voice, data, video) from your choice of cloud, private or public.
- The software stack that brings together the edge device, the management software that manages that, that sits into the cloud or sits into the datacenter, and the ability to build that software from essentially ground zero to being able to acquire those capabilities and that experience and the technology with it, puts Dell in a leadership position. The differentiated technology that they are getting with the integration of Brad Anderson’s [Dell president, Enterprise Solutions] and Steve Schuckenbrock’s [Dell president of Services] businesses, allow them a unique position to do this for their customers. All this allows them to move quite quickly in the marketplace, much quicker than they could have done it on their own.
- IDC: worldwide thin client demand will grow 15 percent per year to approximately $3 billion by 2015
- IDC: the overall end-to-end solutions market with thin clients is expected to exceed $15 billion by 2015
Wyse Cloud Client Computing [WyseTechnology YouTube channel, April 1, 2012]
Citrix Announces New Innovations in Desktop Virtualization Lowering Cost and Accelerating the Transformation to Virtual Desktops [Citrix press release, May 9, 2012]
New XenDesktop, VDI-in-a-Box & AppDNA capabilities drive adoption
San Francisco, CA » 5/9/2012 » Today, at Citrix Synergy™, the conference where mobile workstyles and cloud services meet, Citrix announced a set of new innovations that help organizations transform their Windows desktops and apps into a cloud-like service that can be managed centrally and delivered to any device in any location. New releases of Citrix desktop virtualization products and new game-changing Citrix HDX Ready SoC-based endpoint devices from key partners are helping to ease the transition to virtual desktops, drive down the acquisition costs and provide expanded capabilities targeting broad use cases from the call center, to high-end engineering and mobile workers in enterprises, the public sector and SMBs, enabling organizations of all sizes to deliver anywhere, anytime access to desktops, applications and data to users.
With the tremendous explosion of new devices, operating systems and applications, organizations are struggling to keep up with the challenge of managing desktops and applications in this new highly mobile world. At the same time, trends such as consumerization and bring your own device (BYOD) programs are putting added strain on IT resources. Citrix is raising the bar once again delivering new innovations across its desktop virtualization products and working with partners to drive down the costs of virtual desktops.
Easier On-ramp to Desktop Virtualization
- New Remote PC Option in XenDesktop FlexCast – The new Remote PC option is part of the FlexCast® delivery technology in the Citrix XenDesktop® product line. Using the new Remote PC capability, XenDesktop customers will be able to quickly turn existing office PCs into distributed VDI hubs without setting up additional servers and storage in the datacenter. This innovative new solution makes it easy for IT to give end users fast, secure remote access to all the apps and data on their office PC from any device. Once IT is ready to move to a more full-service VDI implementation, these distributed Remote PC images can be easily moved into the datacenter to run in a traditional hosted VDI model for better consolidation, security and management efficiency. Remote PC functionality will be included in XenDesktop 5.6 Feature Pack 1, which will ship in June, 2012.
- New AppDNA Software Release – To ease the transition to Windows 7 and a virtual desktop infrastructure, the new release of Citrix AppDNA software brings a simplified overall installation, setup and user environment to accommodate a broader range of enterprises, the channel and global SIs. Citrix AppDNA also provides even more in-depth application details so enterprises can accurately assess, rationalize and act on applications before a project begins. The AppDNA 6.1 software will be available in Q2, 2012. (see announcement blog for more detail)
Reducing the Acquisition Costs of Virtual Desktops
- First Wave of Game-changing Endpoints Arrives – The first results of the Citrix HDX System-on-Chip initiative that was announced at Citrix Synergy Barcelona are being delivered to the market. The initiative was designed to enable an entirely new generation of devices that deliver high-definition virtual desktops and apps at game-changing price points and form factors. These devices reduce the cost of high-performance HDX Ready thin clients by more than half, further driving down the cost of desktop virtualization. New devices from ATrust, Centerm, HP, NComputing and ThinLinX are being announced today at Citrix Synergy San Francisco and are built for Citrix XenDesktop and Citrix VDI-in-a-Box. (See announcement blog for more detail)
- Personalized VDI for Less than the Cost of PCs – The Project Aruba technology preview delivers a cost-efficient yet complete VDI solution by extending the simple, affordable Citrix VDI-in-a-Box™ with layering technology using personal vDisks to deliver highly personalized virtual desktops that retain the cost-efficiencies of pooled desktops. Project Aruba also provides a validated blueprint for service providers looking to deliver cost-effective VDI-based Desktops-as-a-Service.
Citrix has also made available a license migration path from VDI-in-a-Box to XenDesktop for customers that want to extend beyond VDI to leverage the full flexibility of XenDesktop while preserving their investment. The end-user experience is consistent across both products as both VDI-in-a-Box and XenDesktop use the same HDX stack and Citrix Receiver. (See announcement blog for more detail)
Delivering Expanded Functionality for Broad Use Cases
Citrix is delivering new innovations that create a very seamless experience for end-users, delivering a more complete solution than other alternatives on the market.
- Empowering Point-to-Point Unified Communications for Cisco and Microsoft – With the introduction of HDX Real Time technologies for voice and video collaboration, industry-leading unified communications (UC) solutions including Cisco VXI Unified Communications and Microsoft Lync 2010 can process voice and video locally and create a peer-to-peer connection for the ultimate user experience while taking the load off datacenter processing and bandwidth resources. XenDesktop delivers new levels of efficiency and quality of service for the most demanding use cases. HDX Real Time will be available with XenDesktop 5.6 Feature Pack 1 in June, 2012.
  - Support for HDX Real-Time with select Cisco VXI clients was recently announced in April, 2012, representing the first optimized UC solution for desktop virtualization on the market. This solution represents one of the first deliverables from the recent collaboration agreement between Cisco and Citrix to optimize HDX for Cisco networks.
  - The new Optimization Pack for Microsoft Lync 2010 will be included in XenDesktop 5.6 Feature Pack 1. This pack supports Microsoft Lync 2010 for point-to-point voice and video communications to Windows and Linux devices and will extend across all Citrix Receiver™-enabled devices over the coming months.
  - Beyond traditional unified communications support, XenDesktop also optimizes voice and video collaboration for cloud-based solutions including Citrix GoToMeeting® by compressing voice and video traffic on the client before transmission over the network.
- Cutting Network Bandwidth for Demanding 3D Engineering Environments – Whether collaborating with design engineers across oceans using advanced CAD/CAM or GIS apps, or consulting medical imaging at a patient’s bedside with an iPad, the secure, high-performance delivery of GPU-accelerated 3D applications and desktops with XenDesktop has never been more powerful or efficient. Using new deep compression codec technology that reduces bandwidth requirements by 50 percent, XenDesktop with HDX 3D Pro technologies secures sensitive intellectual property and privacy-sensitive data while improving collaboration and performance and eliminating the need to synchronize and transfer massive data files. Meanwhile, users leverage state-of-the-art graphics processing hardware in the datacenter to access designs and images from any device, anywhere. HDX 3D Pro will be available with XenDesktop 5.6 Feature Pack 1 in June, 2012. (See the announcement blog for more detail)
- New XenClient Enterprise and Acquisition of Virtual Computer – Citrix announced the acquisition of Virtual Computer, provider of enterprise-scale management solutions for client-side virtualization. Citrix will combine the newly-acquired Virtual Computer technology with its market-leading XenClient® hypervisor to create the new Citrix XenClient Enterprise edition. The new XenClient Enterprise, available in Q2, 2012, will combine all the power of the XenClient hypervisor with a rich set of management functionality designed to help enterprise customers manage large fleets of corporate laptops across a distributed enterprise. The combined solution will give corporate laptop users the power of virtual desktops “to go”, while making it far more secure and cost-effective for IT to manage thousands of corporate laptops across today’s increasingly mobile enterprise.
- Simplifying Printing with New HDX Universal Print Server – Now, Citrix desktop virtualization products tame the complexity of printing by completing a universal printing architecture with the Citrix HDX Universal Print Server. Combined with the previously available Universal Print Driver, administrators may now install a single driver in the virtual desktop image or application server to permit local or network printing from any device, including thin clients and tablets, leveraging HDX optimization technology to reduce bandwidth load over wide area networks and manage printing communications outside of the virtual desktop channel for enhanced quality of service. HDX Universal Print server will be available with XenDesktop 5.6 Feature Pack 1 in June, 2012. (See the announcement blog for more details)
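The 50 percent bandwidth reduction cited for the HDX 3D Pro deep compression codec is easy to sanity-check with back-of-the-envelope arithmetic. The baseline session bandwidth below is an assumption chosen only to show the calculation, not a measured Citrix figure.

```python
# Rough arithmetic on the "50 percent less bandwidth" claim for deep
# compression. The 10 Mbps baseline for a GPU-accelerated 3D session is an
# assumed figure for illustration only.

baseline_mbps = 10.0        # assumed bandwidth of an uncompressed 3D session
reduction = 0.50            # the 50% reduction cited in the announcement
link_mbps = 100.0           # assumed branch-office WAN link

compressed_mbps = baseline_mbps * (1 - reduction)
sessions_before = int(link_mbps // baseline_mbps)
sessions_after = int(link_mbps // compressed_mbps)

print(compressed_mbps)      # 5.0
print(sessions_before)      # 10
print(sessions_after)       # 20
```

In other words, halving per-session bandwidth doubles the number of concurrent 3D sessions a fixed link can carry, which is where the practical value of the codec lies for remote engineering teams.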
Quote
“Citrix is helping to drive down the costs of virtual desktops, and advancing technology around user experience and manageability to move desktop virtualization adoption forward at a rapid pace. Through product innovation and strong partner ecosystems we are addressing barriers on all fronts including acquisition costs, migration complexity and delivering complete solutions for all customer segments from large enterprises to SMBs.”
– John Fanelli, Vice President of Product Marketing, Enterprise Desktops and Applications at Citrix
Related Links
- Announcement: New XenDesktop Release Accelerates Migration to Windows 7 and Beyond
- Announcement: Dell and Citrix Deliver a Simple VDI Appliance for the Mass Market
Follow Us Online
- Citrix XenDesktop page
- Citrix AppDNA page
- Citrix VDI-in-a-Box page
- Twitter: @Citrix, @XenDesktop, @VDIinaBox, @AppDNA
- Citrix on Facebook
NOW to understand the whole picture from/through a very practical demonstration of the whole range of possibilities watch these videos:
The Future is Now (17 minutes – part 1 of 2) [citrixvideos YouTube channel, April 11, 2011]
The Future is Now (28 minutes – part 2 of 2) [citrixvideos YouTube channel, April 11, 2011]
Citrix Receiver on the Wyse Xenith, connecting to a XenDesktop virtual desktop [citrixvideos YouTube channel, April 10, 2011]
Wyse Product/Technology Details
Wyse Changes Everything with Announcement of Xenith 2 Zero Client for Citrix VDI-Based Deployments [[Dell] Wyse press release, May 9, 2012]
Leading Zero Client Improves Performance for VDI Installations Using Citrix Desktop Virtualization Solutions
SAN JOSE, CA – 05/09/2012 –
Wyse Technology, the global leader in cloud client computing, today announced the Wyse Xenith 2, based on the ultra-secure Wyse zero framework. This breakthrough zero client was revealed today at Citrix Synergy™ 2012, the premier event on cloud computing, virtualization and networking. Wyse, the leading shipper of fixed and mobile desktop zero clients in the world, will be demonstrating the Xenith 2 at Wyse Booth #206 from May 9-10, 2012.
Following on the success of the Wyse Xenith and Wyse Xenith Pro, the Wyse Xenith 2 is the ideal Citrix zero client solution for both enterprise and SMB organizations. The Wyse Xenith 2 zero client is purpose-built for Citrix XenDesktop®, blending the amazing cost benefits of the ARM System-on-Chip (SoC) architecture with a non-Windows, Citrix Receiver-compatible client developed in cooperation with Citrix. Improving on the success of the Xenith with 30% faster performance and lower power consumption, the result is a super-secure, very affordable, true high-fidelity desktop experience. For users requiring a diverse variety of applications, including HD multimedia, the Wyse Xenith 2 sets a new standard in price and performance in a compact zero client and delivers an unprecedented combination of simplicity, performance and security for office-based workers.
The Wyse Xenith 2 requires no local configuration or management and can offer customers of all sizes a more secure client while helping reduce management and overall client cost. Full AES 128-bit encryption enables encryption of network certificates on the client, a truly ironclad level of security. Leveraging the Wyse zero framework, the Wyse Xenith 2 is able to provide a secure, ‘instant on’ experience for end users—booting up and logging into a Citrix XenDesktop® in less than 10 seconds. With no exposed APIs and no attack surface, the Wyse Xenith 2 zero client is malware- and virus-immune, removing client security concerns.
“Wyse Xenith has been a game-changer for us,” according to Wes Wright, Chief Technology Officer at Seattle Children’s Hospital. “Not only are we saving $6 million in hardware replacement costs, more than $1 million in staff time, and $300,000 per year in energy savings, we also have devices that are faster, more secure and more reliable than anything we had before. With Xenith 2, Wyse is simply adding more appeal to an endpoint device family that makes Citrix XenDesktop a great end-to-end VDI solution.”
Like the Wyse Xenith and Wyse Xenith Pro, the Wyse Xenith 2 changes everything, including the economics of desktop computing. Wyse Xenith 2 eliminates the complications of management and security issues associated with traditional client devices, while ensuring an unparalleled high-definition user experience, further lowering the barriers for mainstream adoption of desktop virtualization.
“As customers look to the flexibility of desktop virtualization, Citrix is enabling these enterprises to transform their traditional Windows computing environments into a cloud-like service, delivering anywhere, anytime access to desktops, applications and data. Through collaborative relationships like the one with Wyse, we are further driving down the costs of virtual desktop deployments and accelerating adoption. The Xenith 2 achieves this goal by providing a secure, affordable solution that is optimized to deliver a high-definition virtual desktop experience through Citrix Receiver,” said Sumit Dhawan, group vice president and general manager, Receiver and Gateways at Citrix Systems.
“By tightly-integrating with Citrix, we’re delivering a zero client that is second to none in performance, security, manageability, and ease of use for this class of VDI endpoint,” according to Param Desai, VP, Product Management at Wyse Technology. “All of this plus it is more affordable than ever before.”
“Vendors like Wyse continue to push the envelope in zero client technology,” according to Bob O’Donnell, Program VP, Clients and Displays at IDC. “The ability to improve device performance while adding additional functionality and reducing cost bodes well for future zero client customers.”
Top Product Benefits
• Secure. Stateless zero client has zero attack surface for viruses & malware; no local disk and no APIs. Xenith 2 also offers single sign-on and is integrated with Imprivata support. Full AES 128 bit encryption enables encryption of network certificates on the device.
• Powerful. The Wyse Xenith 2 includes a Citrix Receiver client and achieves an unparalleled user experience, great graphics performance and high-fidelity multimedia thanks to Wyse’s innovative performance optimizations for the ARM SoC, available only on Xenith 2 and T10. Xenith 2 starts up in 6 seconds.
• Affordable. Sets a new level of price / performance.
• Easy to manage. Integrated out of the box with the XenDesktop management console, and also managed by Wyse Stratus as part of comprehensive device management from the cloud. Xenith 2 also comes with auto-detection of server and configuration and is a completely stateless device, always using the latest zero engine delivered directly from a central configuration file server and the XenDesktop server.
• Compact. Requires very little space, or none at all — includes a VESA mount for back-of-display mounting. Xenith 2 is 30 percent smaller than the original Xenith and utilizes only 7 watts in full operation.
• Zero-compromise user experience. Network-based QoS ensures quality (HDX multi-stream). The device offers true 720p 25+ fps HD for WMV and H.264 with hardware decoding engines. Dual display with rotation and L-shaped display capabilities [which is unique and essential for financial services environments with an additional screen for vertical spreadsheet viewing]. New WAN support with local echo and bandwidth reporting allows remote and at-home users greater flexibility and performance.
Pricing and Availability
The Wyse Xenith 2 will be available soon with an estimated customer price TBD. For more information, please visit:
http://www.wyse.com/products/cloud-clients/zero-clients/Xenith2

Overview
Establishing a new price/performance standard for zero clients for Citrix, the new Wyse Xenith 2 provides an exceptional user experience at a highly affordable price for Citrix XenDesktop and XenApp environments. With zero attack surface, the ultra-secure Xenith 2 presents no target to network-borne viruses and malware. Xenith 2 boots up in just seconds and delivers exceptional performance for Citrix XenDesktop and XenApp users while offering usability and management features found in premium Wyse cloud client devices. Xenith 2 delivers outstanding performance based on its system-on-chip (SoC) design optimized with the Wyse Zero architecture, and a built-in media processor delivers smooth multimedia, bi-directional audio and Flash playback. Flexible mounting options let you position Xenith 2 vertically or horizontally on your desk, on the wall or behind your display. Using about 7 watts of power in full operation, Xenith 2 generates very little heat for a greener, more comfortable working environment.



Specifications
| Operating System: | Wyse Zero™ Engine |
| Processor: | Marvell® ARMADA™ PXA 510 v7 1.0 GHz system-on-chip (SoC) |
| Memory: | 0MB Flash / 1GB RAM DDR3 |
| I/O peripheral support: | • One DVI-I port, DVI to VGA (DB-15) adapter included • Dual display support with optional DVI-I to DVI-D plus VGA-monitor splitter cable (sold separately) • Four USB 2.0 |
| Networking: | • 10/100/1000 Base-T Gigabit Ethernet • Optional internal wireless 802.11 b/g |
| Display: | • VESA monitor support with Display Data Control (DDC) for automatic setting of resolution and refresh rate • Dual monitor supported with ‘L shaped’ display rotation • Single: 1920×1200@60Hz; color depth: 32 bpp • Dual: Up To 1920×1080@60Hz; color depth: 32 bpp |
| Audio: | • Output: 1/8-inch mini jack, full 16-bit stereo, 48KHz sample rate • Input: 1/8-inch mini jack, 8-bit microphone |
| Included: | • Enhanced USB keyboard with PS/2 mouse port and Windows keys • PS/2 mouse |
| Power: | • Worldwide auto-sensing 100-240 VAC, 50/60 Hz. • Energy Star V5.0 • Phase V external and EuP compliant power adapter |
| Power consumption: | Under 7.2 Watts (average) |
| Dimensions: | • Height: 1 inch (25mm) • Width: 6.9 inches (177mm) • Depth: 4.69 inches (119mm) |
| Weight: | 1 lb (450g) |
| Shipping Weight: | 1.003 lbs. (.455kg) |
| Mountings: | • Stand for horizontal use and VESA/wall mounting (included) • Optional vertical stand |
| Temperature Range: | • Operating (horizontal position): 50° to 95° F (10° to 35° C) • Operating (vertical position, power button up): 50° to 104° F (10° to 40° C) • Storage: 14° to 140° F (-10° to 60° C) |
| Humidity: | • 20% to 80% condensing • 10% to 95% non-condensing |
| Security: | Built-in Kensington security slot (cable sold separately) |
| Safety Certifications: | • Ergonomics: German EKI-ITB 2000, ISO 9241-3/-8 • Safety: cULus 60950, TÜV-GS, EN 60950 • RF Interference: FCC Class B, CE, VCCI, C-Tick • Environmental: WEEE, RoHS Compliant |
| Warranty: | 3-year limited hardware warranty |
Jeff McNaught Interview One [CitrixTV YouTube channel, May 24, 2012]
Marvell Joins Citrix System-on-Chip Initiative to Bring Citrix HDX Technology for Thin Clients to Market [Marvell press release, May 9, 2012]
Santa Clara, California (May 9, 2012) – Marvell (Nasdaq: MRVL) today announced participation in the Citrix System-on-Chip (SoC) initiative to enable an entirely new generation of thin clients for high-definition virtual applications and desktops at a low cost. The Marvell® ARMADA® 510 SoC seamlessly integrates Citrix HDX capabilities into a complete silicon solution. The first of many ARMADA chips to be verified as part of the Citrix SoC initiative, the ARMADA 510 is a high-performance, highly integrated, low-power SoC comprised of an ARM v6/v7-compliant superscalar processor core, a hardware graphics processing unit, video decoding acceleration hardware and a broad range of peripherals, answering the need for fast processing and a rich multimedia user experience.
“The future of enterprise computing is in the convergence between mobile devices and digital content – it’s imperative that end users have access to the content they need from any device, whether it’s a thin client, tablet or smartphone. Citrix has been abreast of this monumental shift in the computing landscape for years – and now the Citrix SoC initiative makes it even easier for companies to deliver a new category of mobile-enterprise friendly devices to users quickly and affordably,” said Jack Kang, director of marketing for mobile at Marvell Semiconductor, Inc. “Working closely with Wyse, Marvell is proud to integrate the performance enhancements from Citrix SoC initiative onto Wyse’s performance rich Citrix HDX Ready T50 device based on Marvell’s ARMADA 510. Marvell is also working closely with Citrix to verify its full portfolio of highly scalable enterprise silicon solutions, from cloud servers to mobile and consumer end point devices, and we look forward to further collaborations with Citrix Ready partners to deliver new and exciting products throughout the enterprise.”
“Citrix XenDesktop delivers the capabilities to enable enterprise customers to begin or accelerate their migration to Windows 7 and beyond, while gaining the mobility, flexibility, and management benefits of desktop virtualization.” said Ankur Shah, principal product manager at Citrix Systems. “We welcome Marvell to the Citrix System-on-Chip initiative. Marvell’s broad portfolio of technology will enable a wide variety of devices to leverage the benefits of Citrix desktop virtualization technology.”
“Wyse is excited about Marvell’s partnership with Citrix on the Citrix SoC initiative,” said Kiran Rao, director of product management at Wyse Technology. “The end-to-end approach, incorporating Marvell’s high performance hardware with software optimized for HDX technology, enables Wyse to quickly bring innovative devices to market that provide a superior end user experience. Wyse’s compact, affordable Citrix HDX Ready T50 and T10 thin clients, as well as the new Xenith 2 zero client, powered by Marvell’s ARMADA 510 SoC will further expand access to cloud-based desktop virtualization using Citrix XenDesktop in the enterprise and beyond.”
Wyse and Microsoft discuss cloud PCs and OS licensing [WyseTechnology YouTube channel, May 19, 2011]
Wyse Z Class Thin Client [WyseTechnology YouTube channel, Jan 31, 2011]
Comparison of the current Z class products: Wyse Z90DE7, Wyse Z90D7, Wyse Z90S7, Wyse Z50D, Wyse Z50S, Wyse Z90DW
All use the dual-core AMD G-T56N: the four Windows® Embedded Standard 7 based models run at 1.6 or 1.65 GHz, while the two Wyse-enhanced SUSE Linux based models run at 1.5 and 1.6 GHz respectively. Memory is 2/4/8GB Flash + 2/4GB DDR3 RAM, depending on the model. Memory on three models is expandable, and three of the Windows® Embedded Standard 7 based models also support SSD storage. Power consumption is under 15 watts (average) for all. Dimensions are 200 x 47 x 225 millimeters; weight is 1.1kg.
Wyse Introduces World’s Fastest Thin Client Family [Wyse press release, Aug 29, 2011]
SAN JOSE, Calif. – 08/29/2011 – Today at VMworld® 2011, Wyse Technology, the global leader in cloud client computing, announced that its fastest thin clients ever, the [Windows® Embedded Standard 7 based] Wyse Z90D7 and Z90DW, are now shipping. In addition, Wyse introduced two new Linux-based members of its Z class family – the Wyse Z50S and Wyse Z50D. The Wyse Z50 is a high-performance thin client family based on Wyse Enhanced SUSE Linux Enterprise, the industry’s only enterprise-quality Linux operating system, which combines the security, flexibility, and market-leading usability of SUSE Linux Enterprise from Novell with Wyse’s thin computing optimizations in management and user experience.
In connection with the availability of these breakthrough thin clients, Wyse also announced the results of independent testing, recently conducted by The Tolly Group, of the Wyse Z class versus the competition. Wyse made this announcement in connection with VMworld® 2011, the global conference for virtualization and cloud computing held in Las Vegas, August 29th through September 1st at The Venetian. As part of VMworld 2011, Wyse is demonstrating their award-winning virtualization, management, and cloud software and a wide range of thin, zero, mobile and cloud PC client hardware at Booth #1111.
At the heart of the Wyse Z class thin clients lies an entirely new engine, one where all the major system elements – CPU cores, vector engines, and a unified video decoder for HD decoding tasks – live on the same piece of silicon. This design eliminates one of the fundamental constraints that limit performance.
The Wyse Z class delivers a combination of performance, simplicity, and connectivity never before seen in a thin client. With available dual-core AMD G-series Fusion accelerated processing units, the Wyse Z class is the world’s best-performing thin client, able to support the most processing-intensive applications including 3D solids modeling, HD graphics simulation, and unified communications with ease. They also include the first SuperSpeed USB 3.0 connectivity in a thin client, enabling the newest peripherals and speeds up to 10 times faster than USB 2.0. With Wyse Z class thin clients, users have more display options than ever before including DisplayPort and DVI.
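The “up to 10 times faster than USB 2.0” claim follows directly from the nominal signaling rates: USB 2.0 high speed tops out at 480 Mbit/s, while SuperSpeed USB 3.0 signals at 5 Gbit/s. A quick back-of-the-envelope check (real-world throughput is lower for both):

```python
# Nominal signaling rates in bits per second; effective throughput is lower.
USB2_HIGH_SPEED = 480_000_000    # 480 Mbit/s (USB 2.0 high speed)
USB3_SUPERSPEED = 5_000_000_000  # 5 Gbit/s (USB 3.0 SuperSpeed)

speedup = USB3_SUPERSPEED / USB2_HIGH_SPEED
print(f"SuperSpeed is ~{speedup:.1f}x USB 2.0")  # ~10.4x on paper
```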
The Wyse Z class also includes advanced networking capabilities, with support for gigabit Ethernet, and available integrated A/B/G/N dual band Wi-Fi. They are compliant with the ENERGY STAR Version 5.0 Thin Client specification.
Independent testing by The Tolly Group recently confirmed the Z90D7’s substantial leadership position in thin client performance compared to rival products. In support of rich video-based Web applications, for example, the Z90D7 boasted a clear advantage in video playback quality while using just a fraction of its processing and memory capacity. That equates to a clearly superior user experience on a much more energy-efficient platform. In addition, the Z90D7 scored up to five times higher in industry-standard performance ratings (CPU Mark, 3D Graphics Mark, and PassMark ratings) than the competition. Among secure, cost-effective, yet powerful thin clients, these independent tests confirmed that the Wyse Z class is the clear winner.
“Being able to combine power and performance in such an easily-managed device is something we are extremely proud of,” said Param Desai, Sr. Director, Product Management, with Wyse Technology. “With the availability of Wyse Z class we’ve more than doubled the performance capabilities of competing top-of-the-line thin clients with similar energy requirements.”
Built on exactly the same advanced single- and dual-core processor hardware platform as the Wyse Z90 thin clients, the upcoming Linux-based Wyse Z50 promises the same industry-leading power and capability on an enterprise-class Linux operating system.
“We are very familiar with the performance of Wyse products having deployed several Z90 devices throughout our campus,” according to Ryan Foster, Network Engineer at Montgomery County Community College in Southeast Pennsylvania. “We were particularly impressed with the improvements to our desktop security, and by the capabilities of these devices handling multimedia files such as audio, video and Flash.”
Supporting Quotes
“The Wyse Z Class and VMware View™ combine to take advantage of PCoIP® in ways that will enhance the end-user experience,” said Vittorio Viarengo, vice president, End-User Computing, VMware. “Better security, easier management and significant energy savings all combine in a high-performance thin client that will benefit both IT and end users.”
“Wyse has made innovative use of the AMD G-Series Accelerated Processing Unit, which combines a multi-core CPU, a discrete-class DirectX® 11 capable GPU and HD video decoding in one tiny piece of silicon,” said Buddy Broeker, director of embedded solutions at Advanced Micro Devices. “The Wyse Z class takes full advantage of the processor’s unprecedented level of graphics integration, which delivers a unique combination of performance and efficiency.”
Availability
For more information on Wyse Z90, including the independent report results, please visit: http://www.wyse.com/products/hardware/thinclients/Z90

The Wyse Z50 will be available later this year.
Wyse PocketCloud Family Overview [WyseTechnology YouTube channel, Feb 21, 2012]
Wyse PocketCloud Personal Cloud [WyseTechnology YouTube channel, Sept 21, 2011]
More videos about the PocketCloud:
- Wyse PocketCloud wins 2011 Appy Award for Productivity category [WyseTechnology YouTube channel, March 3, 2011]
- Wyse PocketCloud 2.0 Features for iOS Devices [WyseTechnology YouTube channel, Oct 13, 2010]
- Wyse PocketCloud demo on an iPad [WyseTechnology YouTube channel, Dec 14, 2010]
- Wyse PocketCloud Features for Android Devices [WyseTechnology YouTube channel, Dec 8, 2010]
- Introducing Wyse PocketCloud [WyseTechnology YouTube channel, Sept 15, 2009]
- Wyse PocketCloud [WyseTechnology YouTube channel, Sept 15, 2009]
Dell Wyse
Focus on Dell [May 24, 2012]
Dell Completes Acquisition of Cloud Client Computing Leader Wyse Technology [Dell press release, May 25, 2012]
- With Wyse, Dell assumes a leadership position in Thin Clients[1]
- Dell’s new Desktop Virtualization capabilities, combined with Dell’s leadership position in Server, Storage and Networking solutions, successfully position the company as a true end-to-end IT vendor
Dell today announced it has completed its acquisition of Wyse Technology, the global leader in cloud client computing. The combination of Wyse’s capabilities with Dell’s existing desktop virtualization offerings positions the company as the leader in desktop virtualization, enabling it to offer true end-to-end IT solutions for customers and partners.
Dell has made significant strategic investments over the past three years to expand its enterprise technology and services capabilities. The Dell Wyse portfolio with current Dell desktop virtualization offerings, leading data center products such as servers and storage, and Dell’s services division, provides customers and partners with a single vendor that can match the full range of their cloud computing and desktop virtualization needs.
The Dell Wyse solution portfolio includes industry-leading thin, zero and cloud client computing solutions with advanced management, desktop virtualization and cloud software supporting desktops, laptops and next generation mobile devices. Dell Wyse has more than 180 patents, both issued and pending, covering its solutions, software and differentiated intellectual property. Dell’s existing offerings include Desktop Virtualization Solution Simplified and Desktop Virtualization Solution Enterprise.
Dell recognizes it’s critical for the desktop virtualization solutions strategy to embrace simple device management, enhance security, scale, and boost user productivity, while providing the flexibility to support anytime, anywhere access on any device.
Dell plans to preserve Wyse’s channel offerings and all existing Wyse channel partners will be eligible for our PartnerDirect Program. Dell will combine the best of both companies’ channel deal registration programs, extend this new deal registration program to all partners, and introduce a program in which partners can grow and nurture a customer relationship.
Quotes
“We’re excited to officially welcome Wyse to Dell and help extend its industry-leading efforts to a broader range of customers and partners,” said Jeff Clarke, Dell vice chairman and president, Global Operations and End User Computing Solutions. “We believe the Dell Wyse capabilities, combined with our previous desktop virtualization offerings and the strength of the Dell enterprise portfolio, provide the most comprehensive and competitive DVS solution available today.”

“Wyse and Dell share the vision and passion in helping our customers and partners create a frictionless user experience via the cloud,” said Tarkan Maner, vice president and general manager, Dell Wyse Cloud Client Computing. “Combining our relentless IP innovation and tight operational skills, and most importantly our laser focus on customer and partner advocacy, Dell cloud client computing will develop and deliver the most advanced solutions globally, from the data center to the end user. We are and will be completely focused on the best user experience for any user, for any content, using any app, on any device, anytime, anywhere; without any conflict, compromise and constraint.”
“As a current customer who has deployed Wyse cloud client computing solutions with Dell PowerEdge servers and Dell EqualLogic storage, Western Wayne School District is excited about the combination of Dell and Wyse,” said Brian Seaman, Network Administrator at Western Wayne School District in Pennsylvania. “Like most school districts, Western Wayne operates in a budget constrained environment and our move to desktop virtualization technologies supported with strong enterprise infrastructure has enabled us to do more with less in service of our students and community. In working with Dell and Wyse to scope and deploy our computing environment, Western Wayne now has the right technology to help us achieve our vision of educating our students of today to become the productive citizens of tomorrow.”
“End point computing models continue to evolve and are accelerating tremendous innovation and efficiencies across enterprise desktop and personal computing,” said Bob O’Donnell, vice president, Clients and Displays, IDC. “One area of strong customer growth is in the desktop virtualization space and we expect to see adoption rates continue to grow over the next several years. As use models continue to mature, so do the vendors who offer solutions in this product space. Dell’s acquisition of Wyse results in an industry-leading solutions and services provider with a formidable end-to-end technology stack from the end point to the datacenter to the cloud.”
Dell to Acquire Wyse Technology Conference Call

Dave Johnson, Senior Vice President, Dell Corporate Strategy: We at Dell continue to execute on our strategy to develop and expand our solutions capability built on Dell’s intellectual property. These solutions are open, with a focus on enhancing customer productivity, delivering results faster and eliminating unnecessary complexity. We’re making great progress in delivering solid results on this strategy. Today’s announcement is an important next step in our end user computing strategy. It enhances our portfolio in the critical area of client computing and further supports our efforts to help our customers innovate end-to-end IT solutions from the edge to the core of the cloud. The acquisition of Wyse Technology complements and expands Dell’s existing desktop virtualization capabilities, allowing us to offer industry-leading and differentiated solutions to a fast-growing segment of the end user computing space. In addition, it also provides synergies with our enterprise solutions business. Our ability to now offer an industry-leading cloud client computing solution will provide opportunities for Dell to further accelerate the growth of our server, storage and networking portfolios. IDC estimates that worldwide thin client demand will grow 15 percent per year to approximately $3 billion by 2015, and that the end-to-end datacenter infrastructure stack for these solutions is expected to exceed $15 billion by 2015. And with Dell’s portfolio, we’ll be able to participate in this broader opportunity. Wyse Technology is a leader in the high-growth and strategic area of cloud client computing, ranking number one worldwide in thin client unit share in the fourth quarter of 2011. Wyse delivered approximately $375 million in annual revenue over the trailing 12 months.
Wyse has approximately 500 employees, with 150 in research and development, most of whom are software engineers. In addition, it has approximately 250 sales specialists solely focused on selling Wyse cloud client computing end-to-end solutions. They have more than 3,000 channel partners that sell Wyse technology on a global basis. This transaction is expected to be accretive to Dell’s non-GAAP earnings in the second half of fiscal year 2013. Dell’s reputation as a trusted adviser to our customers, and our distribution and sales capabilities combined with Wyse’s innovative solutions in cloud computing, will help address customers’ needs and is a great strategic fit, both operationally and culturally, for Dell. Finally, Dell has a strong track record of integrating acquisitions of this size. Based on experience with similar acquisitions, we expect this transaction to be accretive to earnings on a non-GAAP basis in the second half of this year. We’re really excited about welcoming Wyse to Dell and even more excited about the opportunities for our customers.

Jeff Clarke, Vice Chairman, Global Operations and End User Computing Solutions: We see a growing opportunity in cloud client computing. This includes thin and zero client hardware, client infrastructure management software, virtualization end user optimization software, datacenter networking, and implementation and managed services. It complements and extends the desktop virtualization capabilities that Dell has today. These solutions offer customers an alternative compute model and help enterprises enhance security, streamline desktop management and boost user productivity. Examples of the benefits that a cloud client computing solution can provide include … We have discussed our strategy in end user computing: the first step was to strengthen our core business by implementing sustainable supply chain improvements, the results of which were evident in FY ’12.
Our next goal was to deliver solutions that include compelling devices plus the tools to secure and manage that hardware, software and data. You’ve seen the results of that in some of our recent product announcements, as well as in the strong growth of our transactional services business in FY ’12. And finally, we indicated our intentions to expand our reach into new and fast-growing areas of end user computing. The acquisition of Wyse Technology and its portfolio of industry-leading capabilities is the next step in our end user computing strategy. Wyse is a global leader in client – excuse me – in cloud client computing. Its portfolio includes a wide selection of industry-leading thin and zero client devices designed to integrate easily into a virtualized or web-based infrastructure, plus differentiated IP in device management, thin client operating systems, and mobility software customized to offer the best user experience with Microsoft, Citrix and VMware virtual desktop infrastructures. Wyse solutions require less memory and processing power than other comparable thin client solutions, making them more cost-competitive and effective for customers. To date, Dell has relied on shared IP solutions to serve its thin client customers. With this transaction, we are moving to a more profitable, industry-leading and complete end-to-end solution with Dell-owned IP and the associated R&D capabilities. Wyse Technology’s portfolio complements and extends Dell’s vision of providing innovative and complete end-to-end solutions to our customers. In addition, the combination of Wyse Technology with Dell’s brand and customer reach presents a dramatic increase in Wyse’s addressable demand. I’d like to leave you with the following takeaways …

Tarkan Maner, CEO of Wyse Technology: The entire team at Wyse is excited about joining the Dell team and becoming an integral part of enabling Dell’s end user computing vision.
This agreement is great news for our customers and channel members worldwide. We’ve been focused on delivering innovative solutions for our customers and channel members for the past 30 years now – to be exact, 31 years now. Dell and Wyse share a focus on delivering innovative IP, world-class service and support, and optimized overall value to our customers and channel members. Customers and channel members rely on Dell to provide comprehensive end-to-end IT solutions. Clearly, Dell’s distribution, reach and brand are well recognized across the industry, and it has industry-leading capabilities across servers, storage, networking, services and end user computing solutions. Wyse has historically been recognized as a leader in cloud client computing, where our skills and capabilities in security, manageability, availability, reliability, lower total cost of ownership – both in terms of CAPEX and OPEX – and scalability have been key differentiators in delivering the best value to our customers and channel members. Through the combination with Dell, we obviously see tremendous opportunities to grow our core desktop virtualization business, as well as to expand into new and fast-growing market segments in mobility and cloud computing. These include infrastructure and content management as-a-service solutions from the cloud for large enterprises, for small and medium businesses, as well as for consumers. We have lately extended our solutions into the unified communications space as well, providing voice, data and video (what we call triple play) content delivery from the cloud for any user, for any content, for any app, on any device, anywhere, anytime. And we like to say: without compromise, without constraint or conflict. Our strong alliance ecosystem will be able to benefit from the extensive solutions portfolio they can now provide to their customers in teaming up with Dell.
The Dell PartnerDirect program currently has 100,000 channel members and a proven track record of effectively onboarding and training channel members of acquired companies. This is exciting for us. Wyse has a history of innovation across all of our product lines and has recently introduced many new solutions for our customers and channel members, with more than 180 patents – to be exact, 182 patents – in cloud client computing. We believe that taking this next step at Dell is a very natural progression for our business and offers our customers and channel members some great advantages that are not available to us today at our scale and size. It is exciting to think about the potential of integrating Wyse’s technology and R&D capabilities with Dell’s reach, existing solutions, capabilities and reputation. We believe our customers and channel members worldwide will benefit in a big way from this combination.

… Q: … just some more detail on Wyse’s hardware/software mix and margin structure – what growth assumptions did you make to justify the price, over what time period, and did you make any assumptions about cross-selling Dell-branded enterprise solutions when coming up with the price?

Today, the majority of the revenue is from the thin client and zero client business, with a growing percentage of that revenue now starting to come from some other areas, including some of the things that Tarkan spoke about. … If we look at and project out a few years, clearly a big part of this transaction is the synergy that we would get from our datacenter solutions business, including servers, storage, networking, services and software. We would also expect, you know, within the services space, maintenance and some ongoing hosting opportunities over time, and there are also opportunities even in software and peripherals (S&P) if you think about things like monitors and other items that you would sell in conjunction with a thin client solution.
… … Wyse as an independent entity has really been gaining momentum to grow into a number one market share position. In fact, their growth accelerated in their last fiscal year to 45 percent, far outstripping the mid-teens industry average growth, both historically and as projected for this segment. And that’s driven by the breadth of their portfolio and the differentiation that they bring to their customers. … The thin client portion of the entire stack is really a small piece. Our expectation and our experience has been that as we engage with our customers on helping them determine how to solve for this workload set of requirements – and it really is a workload that you’re talking about – you’re engaging at a much more comprehensive enterprise level about a solution. And if you move to a thin client solution, clearly the network, compute and storage moves, whether that’s into a private cloud or a public cloud; it’s part of the entire solution. Wyse as an independent entity didn’t have, of course, access to the broad portfolio that we do. … So, we believe the combination of our services and enterprise capabilities with the added capabilities of Wyse in the client space is a great combination and will be extremely synergistic for us. … I think a key element is that much of their software value is captured in the hardware itself. So, for example, they build on top of the protocols in our industry and implement features ahead of others, whether that’s multi-monitor support, the integration of voice, data and video, and/or USB redirect. Their ability to put those features into the platform ahead of the industry has allowed Wyse to extract value for that from its customers. Also, as we mentioned in our remarks, their thin OS and the IP around the thin OS have allowed them to drive greater performance using less memory, and they extract value for that in the industry.
And then the bigger picture Dave hit on: for every thin client hardware dollar that exists in our industry, there’s $5 of enterprise servers, storage, networking and services that go along with it. So, our ability to really move into that $18 billion marketplace with an end-to-end set of solutions from Dell is certainly a key piece of how we view the asset.

Q: Obviously, this is a capability that Dell could probably have developed internally. Does the fact that you decided to do this acquisition now suggest that Dell is seeing an inflection in the number of customers that are looking for these types of solutions? And maybe you could just give a little more detail on what you’re hearing from customers at this point on thin clients?

… What we see is that the momentum around alternative computing is a trend that many customers are continuing to experiment with and, in many cases, beginning to deploy. The adoption rates are still relatively low for desktop virtualization, but there clearly are a lot of customers out kicking the tires, very similar to maybe a decade ago around server virtualization – not that I’m comparing the two, just the adoption rates. And we think this is an opportunity, particularly in verticals such as financial services, government and healthcare, to really take a leadership position. Wyse Technology does have a leadership position in the thin client itself. We have a very strong presence in the enterprise and in each of those verticals, and Dell now being able to build end-to-end vertical solutions for these sets of customers where it makes sense is key. And again, I would emphasize that we don’t see the entire world going to thin clients. We still think there’s healthy PC demand in the industry, and there’s a balance of alternative computing that allows people to take advantage of securing their information and managing their assets in a very differentiated way.
And as Dave said, which I think is key in our thinking here, this is a different workload. We look at this workload from the device out on the edge to what we do in the datacenter, providing a set of services and value offerings to our customers. … These are really specific use cases. For example, in regulated industries like healthcare and financial services, the value of centralizing your data to have better access and control is a specific use case that thin client desktop virtualization lends itself to. It also lends itself to environments and industries where, again, there’s a desire to simplify the endpoint and manage the application much more centrally. That is often the case in education, and increasingly in some of the emerging geographies. So, we see this as an opportunity, again, to provide specific solutions to specific customer problems, and a much more industry-centric approach to our business.

Q: … do you have any specifics around what percentage of your VDI customers for Dell are incorporating a full PC versus a thin client? And then any thoughts as to whether there’s anything on the horizon that would, you know, increase the ratio of thin client penetration versus full PCs in virtualized installations?

We don’t see any real dramatic change. The IDC forecast continues to project a fairly steady 15 percent growth rate into the future, so there’s no apparent broad inflection point. And as we articulated a moment ago, these are mostly fairly specific situations where the value proposition applies. And so, today, the total opportunity – counting the entire stack – is about $3 billion, which is still a relatively de minimis piece of the overall PC industry.

Q: But, just to be clear on that point, you do have customers who are virtualizing their desktops and still purchasing regular Dell PCs rather than thin clients? … A common deployment today is on a standard PC that’s been virtualized.
Yes, we’ve seen demand for that business grow through last year and expect it to grow this year. … And again, I don’t think a zero client or a thin client is the answer for all customers. In our mind the bigger message here is that we now have a range of devices: an incredibly strong portfolio of thin client and zero client devices from Wyse, the standard Dell set of PCs, which do virtualization, and now the ability to manage those in a very differentiated way with the key software assets we’re bringing on board, which extend to tablets and mobile phones. And the fact that in some cases these usage models are moving to the cloud, and the ability to do client cloud computing, is a key element of this acquisition.
Q: … You mentioned earlier some of the verticals that have been early adopters of this type of technology. Can you talk about what you think some of the remaining barriers to broader adoption may be, how Dell is solving them, and what this acquisition does to help you there?
… From a vertical perspective … we see growth in both the public and private sectors, and in both large enterprise and the midmarket. From a bigger perspective, we see from time to time that some companies do not have the right level of datacenter portfolio and datacenter ecosystem. Sometimes customers in certain vertical industries or geographies complain that they don’t have the right networking systems in the backend. … These open up an opportunity, obviously. Those two are the biggest barriers to deploying desktop-virtualization-centric cloud client portfolios and platforms. … I think one of the opportunities we have is to change the value proposition, making the total cost of ownership around manageability and securing the data and the devices much more efficient and attractive for our customers.
I think the differentiated technology we’re getting, with the integration of Brad Anderson’s [Dell president, Enterprise Solutions] and Steve Schuckenbrock’s [Dell president of Services] businesses, puts us in a unique position to do this for our customers.
Q: … Because you mentioned seeing specific vertical opportunities, do you have any details on the split of [Wyse] revenue by vertical or by geography?
The geographic mix is roughly 40 percent U.S., 40 percent EMEA and 20 percent APJ. … From a vertical perspective, I would say 50 percent public sector, 50 percent private sector. By public sector we mean state and local governments, healthcare, education and federal government deployments; for private sector, you get the point. In terms of customer size segmentation, at a very high level our business is about 50 percent large enterprise, 50 percent midmarket/small business.
Q: … Do you expect to accelerate the growth rate from 45 percent, given synergies from Dell? And if so, is the revenue incremental, or do you expect some substitutional revenue as well? That is, could Dell client sales be hurt by Wyse, so that it wouldn’t be completely additive?
… Our projection is that we will, perhaps conservatively, grow with the industry relative to thin client. But of course, as you’re pointing out, Wyse didn’t have the ability to integrate a comprehensive solution with networking, storage and compute, as well as wrap all the services around it. So much of the revenue acceleration is driven by those synergies, and we expect that to be significant in terms of the growth rates we’ll be able to achieve through the entire offering we will provide.
Q: … Could you go back and speak to build versus buy? It seems to me that Dell would have had a fairly easy time replicating the thin clients from Wyse.
… Getting to your point about internal versus external: this is one of those industries where, when you look at it, Wyse and one other competitor had almost 50 percent of the market between them, and then there’s a tremendous drop-off to the rest of the players, none greater than 10 percent. So the combination of Dell with Wyse will put us in a clear number one market position, and there’s certain value, as you know, in being a significant player in that kind of industry situation. … Because one of the other elements of the question is build versus buy: could we have done this organically? Our view is, I think, very straightforward. This [Wyse] is a company that has 31 years of experience. They have the intellectual property, they have the software and, as Dave mentioned earlier, 150 R&D engineers, of which 140 are in software. We think the stickiness in the solution stack that I showed on one of the earlier slides is the software stack that brings together the edge device and the management software that manages it, sitting in the cloud or in the datacenter. Compared with building that software from essentially ground zero, being able to acquire those capabilities, that experience and the technology with it puts us, I think, in a leadership position; and as we integrate this with Steve [Schuckenbrock]’s and Brad [Anderson]’s organizations and build out workloads and solutions, we can move much more quickly in the marketplace than we could have on our own.
Q: … Specifically, I noticed that one of your newer products, the T10, is on an ARM-based platform. What type of ARM engineers are you bringing to Dell?
… I’m just curious about the ARM technology: will this further Dell’s ARM initiatives?
Well, the way I’d like to answer that question is simply that we’re going to build client devices (desktops, notebooks, tablets, smartphones, thin clients, zero clients) on the appropriate hardware architecture. That will be a combination of x86 and ARM. Dell itself has a pretty strong capability around ARM processor architecture. And as we mentioned, there are only a dozen or so hardware engineers inside Wyse Technology who work on the hardware. So getting hardware competence or assets around ARM design from Wyse is not the nature of this acquisition; it’s the 140 software engineers that were key. The hardware architects on the Dell side who are working on ARM implementations across the plethora of devices I mentioned earlier will still be the core ARM architects and the knowledge base for our ARM implementations. The real question may lie in whether we will continue to support thin clients based on the ARM architecture and this thin OS. Absolutely. We believe that’s part of the value proposition Wyse has had in the marketplace. It has allowed them to move quite quickly in bringing new products to market, providing a performance advantage or a lower-cost option, because they’ve done a great job of designing for cost and providing features comparable to what others do in a more costly way. On top of that, they innovate on the platform, as I mentioned earlier, around the management stack; and there’s the promise of the software engineers being able to take things like Stratus and PocketCloud, build them around these platforms, and integrate Dell’s services with the rest of our Dell client assets. We think that is an opportunity for us to differentiate with this acquisition.
Q: … How does this sort of position you with the Citrix and VMwares of the world? I.e.,
you know, there’s not going to be any attempt to (inaudible) features and functionality you get with some of those software partners?
… We have strong relationships with the key players in thin client computing and virtualization. Not only are we going to continue those partnerships, we’re going to grow them and foster even deeper relationships. … As you all know, we [Wyse] are pretty close partners with Microsoft, and we do a lot of work with VMware and with Citrix, as these providers supply desktop virtualization methodology and technology between the datacenter and end-user computing platforms. So we add to that value and partner heavily with them, and obviously that’s going to continue. The opportunity now, as Jeff said earlier, is that we’re bringing the datacenter, the network and the end-user platform to our customers in an integrated way, for more value. So we’re going to have more opportunities to partner with Microsoft, with VMware, with Citrix and others in that space. One other piece to add: some of the software we provide is differentiated in the marketplace and is the leader in this space also from the cloud, on the infrastructure management side, with a product called Wyse Stratus. And many of you on the phone are today using Wyse PocketCloud, the market-leading product for content management from the cloud on any mobile device and also from your web browser, connecting your apps and content (voice, data, video) from the cloud of your choice, private or public. These are all opportunities for us to do more with Microsoft, with VMware and with Citrix as they move forward. And that’s a big differentiator.
Tech investment banking expertise to strengthen the unique value focus of growing the HTC brand and to achieve high growth again
April 18, 2012 9:39 pm / 2 Comments on Tech investment banking expertise to strengthen the unique value focus of growing the HTC brand and to achieve high growth again
Updates #2:
– HTC sees revenues down sharply on-year in July [DIGITIMES, Aug 7, 2012]
HTC saw its revenues dip 16.7% on month and 44.5% on year to a five-month low of NT$25 billion (US$834.45 million) in July. For the first seven months of 2012, revenues amounted to NT$183.9 billion, decreasing 32.8% from a year earlier, according to a company filing with the Taiwan Stock Exchange (TSE).
… With HTC estimating its third-quarter revenues to reach only NT$70-80 billion [US$2.3-2.7 billion], its monthly revenues are unlikely to rebound to NT$30 billion in August and September, the Chinese-language Commercial Times said in an August 7 report.
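The NT$ and US$ figures quoted in these reports imply a single conversion rate, which is easy to verify (a quick sketch; the rate itself is derived from the quoted pair of figures, not stated in the article):

```python
# Implied NT$/US$ rate from the July figures quoted above.
rate = 25e9 / 834.45e6           # NT$25 billion reported as US$834.45 million
print(round(rate, 2))            # ~29.96 NT$ per US$

# The Q3 guidance converts consistently at that same rate:
q3_low_usd = 70e9 / rate / 1e9   # NT$70 billion in US$ billions
q3_high_usd = 80e9 / rate / 1e9  # NT$80 billion in US$ billions
print(round(q3_low_usd, 1), round(q3_high_usd, 1))  # matches "US$2.3-2.7 billion"
```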
– HTC sees fall in 3Q12 sales with lower margin [DIGITIMES, Aug 3, 2012]
HTC reported second-quarter consolidated revenues of NT$91.04 billion (US$3.04 billion), in line with its targeted NT$91 billion, which had been cut from its original target of NT$105 billion [US$3.5 billion]. Gross margin and operating margin for the second quarter came to 27.01% and 9%, respectively.
Second-quarter sales represented a 34.3% sequential increase, but were 26.8% lower than those posted in the second quarter a year ago. Meanwhile, gross margin and operating margin improved from the prior quarter, but decreased compared to the same period of 2011.
HTC generated net profits of NT$7.4 billion, or NT$8.90 a share, in the second quarter of 2012. Profits declined more than 50% from a year earlier, but rose over 60% on quarter.
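The quarterly figures above can be cross-checked with back-of-envelope arithmetic (a sketch; the implied share count and prior-quarter revenues are derived from the quoted numbers, not stated in the report):

```python
# Back-of-envelope checks on the quoted HTC Q2 2012 figures (all NT$).
q2_revenue = 91.04e9   # "consolidated revenues of NT$91.04 billion"
net_profit = 7.4e9     # "net profits of NT$7.4 billion"
eps = 8.90             # "NT$8.90 a share"

shares = net_profit / eps
print(round(shares / 1e6))       # ~831 million shares outstanding implied

# Prior quarters implied by the quoted growth rates:
q1_2012 = q2_revenue / 1.343         # "34.3% increase" quarter on quarter
q2_2011 = q2_revenue / (1 - 0.268)   # "26.8% lower" than a year earlier
print(round(q1_2012 / 1e9, 1))   # ~67.8 (billion)
print(round(q2_2011 / 1e9, 1))   # ~124.4 (billion)
```

The derived Q1 2012 figure (roughly NT$67.8 billion) is what the "34.3% increase" must refer to sequentially, since the same sentence reports a year-on-year decline.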
– HTC adjusts workforce [DIGITIMES, July 25, 2012]
HTC has been adjusting human resources in its production, R&D, and sales teams. Industry sources believe corporate restructuring is necessary as HTC’s sales have been declining.
Sales of the HTC One series have not picked up due to tough competition in Europe and North America. HTC has been adjusting its global workforce by shutting down the R&D team in North Carolina, US, and offices in Brazil. Some members of the R&D team have been laid off, and contracts for 600 workers will not be renewed. The adjustments have impacted close to 1,000 staff.
– Nokia, RIM and HTC to see smartphone shipments continue sliding in 2H12, say sources [DIGITIMES, July 9, 2012]
Nokia, RIM and HTC are expected to see their smartphone shipments, as well as market share, continue declining in the third and fourth quarters of 2012 due to a lag in migration to new platforms and weakening competitiveness of their products, according to industry sources.
Despite efforts initiated by Nokia, RIM and HTC to fend off competition from Apple and Samsung Electronics, RIM and HTC have reported lower than expected shipments for the second quarter of 2012, while Nokia is expected to see its second-quarter smartphone shipments drop below 10 million units, said the sources.
Although HTC managed to post a sequential gain in shipments in the second quarter, its second-quarter smartphone shipments barely reached nine million units, pointed out the sources.
HTC is expected to see its shipments stay flat or drop to eight million units in the third quarter and slip further to seven million in the fourth quarter, according to a Chinese-language Commercial Times report.
End of updates #2
Preliminary reading: HTC: the most promising ICT brand in Taiwan [Oct 18, 2010 – July 5, 2011; then with major updates on Feb 7, 2012]
Source: HTC, Investor Relations
Updates #1: The HTC One series introduced at MWC unveiled (for the first time) a proprietary HTC ImageChip image signal processor (rather than relying on the image signal processors that come with the Tegra 3 and Qualcomm S4 SoCs) in order to be able to take a shot “in just 0.7 seconds”, to have “a new superfast 0.2-seconds autofocus, continue to take nearly unlimited continuous shots”, as well as to “capture a photo and shoot video at the same time” and to “capture a photo frame from a previously recorded video.” (See also a detailed description inside the so-called HTC ImageSense feature set.) Such a hardware-based differentiation approach will be even more pronounced in HTC’s upcoming products, according to the following news:
– HTC plans to develop customized processors [DIGITIMES, April 24, 2012]
In order to have significant product differentiation, HTC plans to cooperate with Qualcomm, Nvidia and ST-Ericsson to develop and produce customized processors with specific functions for its smartphones, according to Taiwan-based handset chip designers.
HTC may develop specific functions for its smartphones and secure supply of customized processors, but it may run the risk of inventories because such processors are unlikely to be adopted by other vendors, the sources commented.
– HTC plans to develop its own processor [China Times, April 23, 2012], with Google translation of the original in Chinese, or the same with Bing translation. The essential content was first reported by Unwired as: HTC is developing its own CPU for lower end smartphones with ST-Ericsson
HTC is following in the footsteps of Apple and Samsung, and is now working on its own dedicated applications processor. According to China Times, the Taiwanese smartphone maker has already signed memorandum of cooperation with ST-Ericsson to co-develop the chip.
Unlike the high-performance Samsung and Apple CPUs that power those companies’ flagships, the new HTC processor will power lower-end smartphones. Devices with the new chip will start shipping in volume sometime in 2013.
It seems that HTC is getting increasingly unhappy with Qualcomm, which has powered most HTC devices until this year. HTC signaled its unhappiness in early February, and may even consider Qualcomm one of the reasons for its sales problems of the last few months. HTC has already added NVIDIA to its application processor supplier list: the quad-core Tegra 3 powers the non-U.S. version of the new One X flagship. But it has yet to diversify on the lower end.
Turning to ST-Ericsson and co-developing its own, cheaper CPU may also be a way for HTC to start moving down market with lower-priced devices. Up until now, HTC has mostly focused on premium high-end smartphones, pretty much ignoring the low end of the market. But as component prices get cheaper, and ever-better-quality Android devices are released at ever lower price points by Samsung, ZTE and Huawei, the Taiwanese vendor has to find a way to respond.
And this move may be one of the responses.
– HTC, Facebook jointly developing smartphone, say sources [DIGITIMES, April 25, 2012]
Given that Google is expected to continue to cooperate with Samsung Electronics for the development of the next-generation Nexus smartphone, HTC reportedly has decided to move forward in its own way and is currently developing a customized smartphone in cooperation with Facebook slated to be launched in the third quarter of 2012 at the earliest, according to industry sources.
HTC had previously joined forces with Google to launch Google’s first own-brand smartphone, the Nexus One. However, Google then shifted to cooperating with Samsung as its primary production partner for the launch of its second and third own-brand smartphones.
Since Samsung has become the top vendor of Android smartphones, Google will continue to have Samsung develop its next-generation Nexus models, leveraging Samsung’s innovation ability with regard to the Android platform, and its ability to control the supply of key components, said sources.
The new Android smartphone being developed by HTC will have a platform exclusive to Facebook to enable and integrate all functions available on the social networking site, the sources indicated. Previously, HTC launched two Facebook-enabled smartphones, the Salsa and Chacha.
Facebook is expected to further expand its investments and sources of income after becoming a public company, and the launch of own-brand smartphones will be part of its development strategy, the sources commented.
End of updates #1
HTC personnel change indicates new value focus: Goldman Sachs [Focus Taiwan, Taiwan’s national news agency, April 17, 2012]
… The 45-year-old Chang, an investment banker and partner at Goldman Sachs before joining HTC, will be responsible for corporate finance and accounting, strategic acquisitions and investment, and investor relations.
“We believe the change in CFO may indicate HTC’s more aggressive attitude toward its finance department in terms of creating value other than just accounting integrity,” Goldman Sachs analyst Robert Yen wrote in a research note.
For example, he said, added value could mean enhancing “the uniqueness and competitiveness of HTC’s smartphone products and services.”
Given HTC’s many acquisitions and strategic investments in content and mobile services in the past and its decent cash position, it could be creating a different value by choosing a CFO with industry and banking background, Yen said. …
HTC Desire V for China Unicom (WCDMA)
Comparison [PDAdb.net]: HTC Desire VC T328d vs. HTC Desire V T328w vs. HTC Desire VT T328t [3]
Note: According to the detailed specifications given above, these phones all have SLCD screens (see: Super LCD, Explained [DisplayBlog, Nov 24, 2011]), as on quite a number of higher-end HTC smartphones in the last two years (since the HTC Desire A8181 / HTC Bravo). Otherwise HTC has mostly used “transflective TFT LCD” and, in very few cases, Super AMOLED.
HTC eyes cheaper smartphone market in China [Focus Taiwan, Taiwan’s national news agency, April 17, 2012]
Taiwan’s HTC Corp. launched several smartphones in China priced as low as 1,999 Chinese yuan (US$317) Tuesday in a bid to tap into the emerging mobile market.
The HTC launch in Beijing includes three smartphones in its customized New Desire series, which will go on sale from mid-April through three major Chinese telecom operators, according to a company statement.
The New Desire V, running on China Unicom’s 3G WCDMA network, will start from 1,999 Chinese yuan before subsidies, while the New Desire VC will support China Telecom’s CDMA 2000 frequency, for the same price tag.
Pricing for the New Desire VT, which will run on the country’s home-grown TD-SCDMA network provided by China Mobile, was not disclosed.
“The China market has always been a critical part of HTC’s global strategy. In addition to the HTC One series, we are introducing the New Desire series targeting Chinese consumers,” said Ray Yam, president of HTC’s China division.
“We believe HTC’s future is closely connected with China and that HTC will continue to bring the best experience and the most innovative smartphones to the country as soon as possible,” he added.
All the models in the New Desire series are equipped with a 4-inch display, a 1 GHz processor and a 5-megapixel camera, according to the company.
Separately, HTC said its new “One” family will also hit store shelves in China this month, with price tags ranging from 2,688 to 5,688 yuan.
The Taoyuan-based manufacturer is hoping that the streamlined models and an increased retail presence will help it boost its market share in China, which stood at only about 2 percent last year, according to analysts at Morgan Stanley.
…

HTC Desire VC for China Telecom (CDMA2000)
China market: HTC launches One, new Desire lineups [DIGITIMES, April 18, 2012]
… HTC currently accounts for a 10% share of smartphones sold with a price tag over CNY2,000 in China, but has not entered the mainstream sub-CNY1,000 segment, indicated the sources. …
HTC Prepares to Launch Lower-end “Kewang” Smartphones for China [IDG News, April 17, 2012]
… The HTC Kewang V, or Desire in English, will launch on April 23 through mobile operator China Unicom. … HTC’s goal with the Kewang series is to provide smartphones at a low price, but also with high-performance and strong features, said Ethan Qian, an HTC spokesman. The Kewang line is being released only in China, he added. …
HTC Desire VT for China Mobile (TD-SCDMA)
Dual-SIM dual-standby for only 1,999 yuan: HTC Desire V officially released [China Tech News, April 17, 2012]
On the afternoon of April 16, China Unicom and HTC jointly held the “China Unicom Wo 3G HTC new Desire V launch” conference, officially releasing the HTC new Desire V. Customized for China Unicom, it has a 9.3mm ultra-thin body, supports dual-SIM dual-standby, and uses a 1GHz Qualcomm MSM7227A processor [with a single Cortex-A5 core delivering 1.57 DMIPS/MHz, versus 2.0 DMIPS/MHz for the Cortex-A8] and a 4-inch [Super LCD] screen with a resolution of 480×800 (WVGA). The phone runs Android 4.0 with the HTC Sense 4.0 UI and has a 5-megapixel autofocus camera, [512MB of RAM and] 4GB of ROM, integrated Beats Audio technology, and a 1650 mAh battery. The unsubsidized price of 1,999 yuan is also a bright spot.
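The bracketed DMIPS/MHz note can be turned into a concrete comparison at the Desire V’s 1 GHz clock (a quick sanity check, not from the article; the per-clock figures are the ones quoted in brackets above):

```python
# Per-clock integer throughput comparison at the MSM7227A's 1 GHz clock.
clock_mhz = 1000
a5_dmips = round(1.57 * clock_mhz)  # Cortex-A5: 1.57 DMIPS/MHz -> 1570 DMIPS
a8_dmips = round(2.0 * clock_mhz)   # Cortex-A8: 2.0 DMIPS/MHz  -> 2000 DMIPS
print(a5_dmips * 100 / a8_dmips)    # 78.5 -- the A5 delivers ~78.5% of an
                                    # A8's per-core throughput at equal clock
```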
Super LCD vs Super AMOLED displays (HD) [TheTechTonicdotCom YouTube channel, Dec 11, 2011]
Note: Nokia has a superior technology for better brightness, contrast and outdoor visibility, based on significantly enhancing both the In-Plane Switching (IPS) TFT and the AMOLED display panels it typically uses. See: The leading ClearBlack display technology from Nokia [Dec 18, 2011 – Feb 2, 2012], especially for comparison with the Super LCD of the HTC Mozart (as well as with the Super AMOLED of the Samsung Galaxy S II).
Commentary: HTC appoints new CFO, but challenges remain [DIGITIMES, April 17, 2012]
HTC has reshuffled its management team again by appointing former Goldman Sachs Group partner Chia-Lin Chang as its chief financial officer, which is part of the company’s strategy for global deployment.
The new appointment, which took effect on April 16, came after HTC announced a 70% on-year decline in net profits for the first quarter of 2012.
Perhaps, the new CFO could help the Taiwan-based smartphone vendor secure more acquisitions to strengthen its global deployment, but it remains to be seen whether HTC is able to regain its growth momentum in 2012 as it faces more challenges in integration of its corporate culture as well as increasing competition.
HTC has created or added a number of high-level positions since the second half of 2010, including the appointments of Ron Loukes as chief strategy officer and Kouji Kodera as chief product officer in July 2010, and Matthew Costello as COO in December 2010. HTC also appointed Jason Mackenzie as its president of global sales and marketing in July 2011.
HTC has also brought in Scott Croyle of One & Company and Shashi Fernando of Saffron Digital responsible for design and content, respectively, through acquisitions of the two companies.
It is also the second time in less than two years HTC has changed its CFO. The newly appointed CFO Chia-Lin Chang replaced Winston Yung, who took the post in January 2011.
If the latest management team is unable to bring back the growth momentum in 2012 that HTC enjoyed during the period from 2010-2011, HTC will no longer be able to compete with Samsung Electronics, Apple and even Huawei Device in terms of economies of scale in production.
While the hiring of talent with management and marketing expertise from abroad, and the acquisition of certain companies overseas are indeed necessary for HTC in its thrust to become a global brand, the impact resulting from the integration of corporate culture on HTC is expected to intensify along with such processes.
Given that nearly all top-rank positions with the exception of the CFO post at HTC have been filled with foreign executives, the promotion of local talent will likely become a major issue of concern in the future.
The Quietly Brilliant Story of HTC [HTC YouTube channel, Nov 23, 2011]
HTC replaces CFO after just one year [15 1/2 months] on the job (update) [The Verge, April 17, 2012]
HTC has issued a statement on the transition:
On Monday, HTC announced the appointment of Chia-Lin Chang as Chief Financial Officer with Winston Yung, his predecessor, transitioning to a corporate development role.
“Media speculation that ties this announcement to HTC’s partnership and investment in Beats By Dre is categorically inaccurate,” said Peter Chou, CEO of HTC Corporation. “HTC and Beats have made impressive progress in innovation and brand awareness and the integration of the Beats brand and technology in the new HTC One series is a clear indication of our commitment to this partnership.”
Amazing camera, authentic sound, iconic design. HTC One has them all. [HTC YouTube channel, Feb 27, 2012]
HTC One series unveiled [from the 2012 HTC press releases or directly on the Canadian site]:
BARCELONA, SPAIN – Mobile World Congress – February 26, 2012 – HTC, a global designer of smartphones, today unveiled its new HTC One series of smartphones that represent its most premium mobile experience with a new level of iconic design and amazing camera and authentic sound experience. …
…
With HTC’s most premium experience, the HTC One series integrates Android 4.0 (ICS) with HTC Sense™ 4, the new version of HTC’s branded user experience that is introducing HTC ImageSense™, a new suite of camera and imaging features that set HTC One apart from other phones. HTC Sense 4 also includes broad enhancements to audio quality and simplifies how people listen to music on their phone.
Amazing Camera
With ImageSense HTC One rivals traditional digital cameras with improvements to every part of the camera, including the lens, the sensor, the software, and even integrating a new custom HTC ImageChip. These enhancements combine to deliver our fastest image capture, best image quality under adverse conditions and easiest interface that enables quick access to capturing stills and videos with side-by-side photo and video capture buttons.
- Superfast Capture – HTC One dramatically reduces the time it takes to capture those key moments. In just 0.7 seconds you’re able to take a shot, and with a new superfast 0.2-seconds autofocus, continue to take nearly unlimited continuous shots simply by holding the shutter button.
- Good photos in adverse conditions – HTC One delivers dramatic enhancements in image capture quality even in adverse conditions such as low light, no light or with bright backlighting. The f/2.0 lens on the HTC One X and HTC One S offers best-in-class performance, capturing 40 percent more light than the f/2.4 lenses available on other high-end phones. HTC One also includes HDR, a market-leading technology, for taking great photos even when there are varying levels of brightness.
- Video Pic (Concurrent Video/Still Capture) – With Video Pic you capture a photo and shoot video at the same time. Now, while you’re shooting HD video, all you have to do is tap the shutter button and it snaps a high-resolution still photo while the video continues to shoot. You are also able to capture a photo frame from a previously recorded video.
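The “40 percent more light” claim in the camera list above follows from the fact that the light admitted by a lens scales with the inverse square of its f-number (a quick check, not from the press release; the exact ratio comes out slightly higher than the release’s rounded figure):

```python
# Light admitted by a lens scales as 1 / f_number**2, so an f/2.0 lens
# versus a typical f/2.4 lens gathers:
f_fast, f_slow = 2.0, 2.4
gain = (f_slow / f_fast) ** 2
print(round((gain - 1) * 100))  # ~44% more light, in the same ballpark as
                                # the "40 percent" quoted in the release
```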
… Authentic Sound
With HTC One, Beats By Dr. Dre Audio™ integration is enabled for the first time across the entire experience for richer, more authentic sound whether you’re listening to your favorite music, watching a YouTube™ video or playing a game. … All this makes HTC One the one place to enjoy all your music, wherever you are, with the power of Beats By Dr. Dre Audio and HTC Car.
HTC One X
… HTC One X is blazing fast with the new NVIDIA® Tegra 3 Mobile Processor for clear graphics, faster applications and longer battery life. It includes a 1.5GHz Super 4-PLUS-1™ quad-core CPU with an integrated fifth Battery Saver Core and a high-performance 12-core NVIDIA® GPU. The HTC One X also has an amazing 4.7-inch, 720p HD screen crafted from contoured Corning™ Gorilla Glass. HTC One X will also be available in select 4G LTE markets with an LTE-enabled Qualcomm Snapdragon S4™ processor with up to 1.5GHz dual-core CPUs.
HTC One S
The HTC One S is for people who want a high-end smartphone in a more compact size. It is powered by a Qualcomm Snapdragon S4 processor with up to 1.5GHz dual-core CPUs. It also includes a 4.3-inch screen crafted from contoured Corning™ Gorilla Glass. …
HTC One V
Utilizing the classic, award-winning design of the HTC Legend, the HTC One V brings top-end design to a smartphone with broad appeal and a premium experience that delivers an amazing camera and authentic sound. It features a simple, iconic aluminum unibody design that exudes craftsmanship and quality.
Global Availability
With unprecedented excitement, the HTC One series will begin shipping in April, with broad global availability through more than 140 mobile operators and distributors. For more information and to pre-register for HTC One visit www.htc.com.
HTC Rezound™, the only phone with the Beats Audio™ built in [HTC YouTube channel, Nov 3, 2011]
HTC And Beats By Dr. Dre Set To Introduce New Era In Mobile Audio [from the 2011 HTC press releases]:
Strategic HTC investment to result in Beats integrated HTC phones this Fall.
Taoyuan, Taiwan & Santa Monica, CA – August 11, 2011– HTC Corporation, a global designer of mobile devices, today announced a strategic partnership and investment with Beats™ Electronics LLC, the company redefining the audio market with its iconic Beats by Dr. Dre™ audio experience. The two fast-growing brands will focus on bringing high performance sound to HTC phones. …
… “Beats has found a unique way to harness popular culture in a manner that is unlike any other brand today,” said Peter Chou, CEO of HTC Corporation. “It’s an exciting brand that has been built around providing something very special, and we believe our strategic partnership will provide customers with unbeatable sound on HTC phones. We obsess over every detail of a consumer’s mobile experience and audio is a critical part of that experience.”
… Established in 2006, Beats Electronics is the brainchild of legendary artist and producer Dr. Dre and Chairman of Interscope Geffen A&M Records Jimmy Iovine, who set out to develop a new type of headphone with the capability to reproduce the full spectrum of sound that musical artists and producers hear in professional recording studios. For more information, please visit http://beatsbydre.com.
The history of recent HTC CFOs:
HTC Appoints Hui-Ming Cheng as CFO [HTC press release, Aug 23, 2006]
… he has served as CFO and Spokesperson for the Fubon Financial Holding Co. in Taipei. From October 2003 to February 2006, Mr. Cheng was VP and CFO of Taiwan Mobile and received the honor of being named as Taiwan’s best CFO by Institutional Investor Magazine in 2003. Prior to his appointment with Taiwan Mobile, Mr. Cheng held various senior-level positions with the Finance Center, Winbond Electronics Corp., China Development Industrial Bank, Chase Manhattan Bank, and the Asia Partner Fund.
Mr. Cheng received a BS in Chemical Engineering from National Taiwan University and an MBA from the Kelley School of Business at Indiana University.
HTC Announces Winston K. S. Yung as Chief Financial Officer [HTC press release, Dec 23, 2010]
… Prior to joining HTC, Yung was the Chief Financial Officer for Shin Kong Financial Holding in Taiwan where he played a key role in the company’s success, and also held key positions at McKinsey & Co in Hong Kong. Yung received a bachelor’s degree in social sciences with an economics major from University of Hong Kong and a MBA from the University of Pennsylvania’s Wharton Business School.
… HM Cheng, HTC’s current chief financial officer, will retire from the company and move into an advisory role to HTC’s board of directors. Cheng joined HTC in September 2006, successfully established a complete financial system, and was a key contributor to HTC’s corporate governance system and overall financial success.
HTC names Chia-Lin Chang Chief Financial Officer [from the Latest HTC press releases]
Taoyuan, Taiwan – April 16, 2012 – HTC, a global leader in mobile innovation and design, today announced the appointment of Chia-Lin Chang as Chief Financial Officer and spokesperson effective April 16, 2012.
Chia-Lin Chang’s predecessor, Winston Yung, joined HTC in January 2011. Winston will focus on corporate development, helping HTC maintain its competitive edge by strengthening the organization and corporate talent.
Chia-Lin, previously an investment banker and partner at Goldman Sachs, will be responsible for corporate finance and accounting, strategic acquisition and investment, and investor relations. Chang earned a Ph.D. in Electrical Engineering from Princeton University and an M.B.A. from the Wharton School at University of Pennsylvania. After receiving his Ph.D. degree, Chang served as an engineer at Motorola in the US.
GOLDMAN SACHS ANNOUNCES NEW MANAGING DIRECTORS [Goldman Sachs press release, Oct 24, 2007] “… it has invited 299 individuals to become Managing Directors as of December 1, 2007, the start of the firm’s fiscal year. … Chia-Lin Chang …”
From: Latest HTC press releases:
HTC releases unaudited results for 1Q 2012
Taoyuan, Taiwan – April 6, 2012 – HTC Corporation (TWSE: 2498), a global leader in mobile innovation and design, today announces unaudited consolidated results for 1Q 2012. For the first quarter of 2012, total revenues reached NT$67,790 million, a decrease of 34.92% year-on-year. Unaudited operating income was NT$5,099 million, net income before tax was NT$5,551 million, net income after tax was NT$4,464 million, and unaudited earnings per share after tax were NT$5.35 based on 834,256 thousand weighted average number of shares.
2012 First Quarter Unaudited Consolidated Financial Results
(Unit: NT$ million, Except Earnings Per Share)

*Calculation of the after-tax EPS for first quarter 2011 was based on 807,867 thousand weighted average number of shares.
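As a quick sanity check on the release figures above (my own sketch, not part of HTC’s announcement): after-tax EPS is simply net income after tax divided by the weighted average share count, minding the units the release uses.

```python
# Sanity-check HTC's reported 1Q 2012 after-tax EPS from the release figures.
net_income_after_tax_ntd_m = 4_464   # NT$ million
weighted_avg_shares_k = 834_256      # thousand shares

# NT$ million / (thousand shares / 1,000) = NT$ per share
eps = net_income_after_tax_ntd_m / (weighted_avg_shares_k / 1_000)
print(round(eps, 2))  # → 5.35, matching the reported NT$5.35
```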
HTC Reports Fourth-quarter And Annual 2011 Results
Taoyuan, Taiwan, February 6, 2012– HTC Corporation (“HTC”, or the “Company”, TWSE: 2498), a global leader in mobile innovation and design, today announced consolidated results of the Company and its subsidiaries for the fourth quarter of 2011 and for the year.
4Q Highlights
• After-tax profit was NT$10.94bn, EPS was NT$13.06
• Total revenues were NT$101.42bn
• Gross profit margin and operating margin were 27.12% and 12.71%, respectively
2011 Highlights
• After-tax profit was NT$61.98bn, up 56.77% year-on-year; EPS was NT$73.32
• Total revenue was NT$465.79bn, up 67.09% year-on-year
• Gross profit margin and operating margin were 28.30% and 14.77%, respectively
• ROE was 70.37% compared to 56.33% in 2010
“In 2011 we saw growth in the global strength of our brand, as well as earnings and revenue growth,” said Peter Chou, CEO of HTC. “While short term performance may not meet the results as expected, we have gained further experience and advancement in the areas of brand management and product innovation. These fundamental strengths and the groundwork we have laid will take us into 2012 with a renewed focus and determination.”
4Q 2011 Results
HTC’s fourth quarter revenue came in-line at NT$101.42bn, resulting in after-tax earnings of NT$10.94bn and EPS of NT$13.06. Gross profit and operating margins came in at 27.12% and 12.71%, respectively. The decline in gross profit margin was mainly a result of product transition.
2011 Results
2011 annual revenue was NT$465.79bn, a 67.09% increase over 2010 annual revenues (NT$278.76bn), resulting in after-tax earnings of NT$61.98bn. Overall gross profits and operating margins were 28.30% and 14.77%, respectively.
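The year-on-year growth figure quoted above checks out against the two revenue numbers in the release (again my own verification, not HTC’s):

```python
# Verify the reported 2011 year-on-year revenue growth from the release figures.
revenue_2011_bn = 465.79   # NT$ billion
revenue_2010_bn = 278.76   # NT$ billion

growth_pct = (revenue_2011_bn / revenue_2010_bn - 1) * 100
print(round(growth_pct, 2))  # → 67.09, matching the reported 67.09% increase
```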
In 2011, in addition to solid growth in revenues and profits, HTC’s brand gained significant momentum in the global landscape, being named one of Interbrand’s 100 Best Global Brands.
2012 Outlook
In 2012, HTC will focus on: growing the Company’s brand value; continuing to create competitive advantages through innovation; enhancing the efficiency of marketing campaigns; and further driving down operating costs.
To expand its brand preference and value, HTC will work at a global level to build emotional connections with consumers, putting more of its marketing resources behind fewer products and driving value in those product brands. By building a globalized marketing campaign, HTC aims to optimize its go-to-market strategy with operators, retail distributors, and end-users, and improve the efficiency of its marketing spend. In emerging markets, such as China, HTC will continue to extend its reach to customers by expanding distribution channels.
Despite temporary weakness resulting from product cycle transition, HTC believes it has the ability to create a new wave of momentum through the upcoming product cycle. It will also continue to focus on mass market consumers by driving product differentiation through design and innovation.
1Q Outlook
The Company’s outlook for the first quarter of 2012 is as follows:
• 1Q revenue expected to be around NT$65-70bn
• Gross margin expected to be around 25%
• Operating margin expected to be around 7.5%
These margins are a temporary phenomenon and will normalize when the product cycle transition is over.
The Where Platform from Nokia: a company move to taking data as a raw material to build products
April 7, 2012 6:31 pm / 8 Comments on The Where Platform from Nokia: a company move to taking data as a raw material to build products
This week’s news regarding the subject in the title is summarized in the following Nokia blog posts and videos embedded in those posts:
– From Where 2.0 To Just Where; With Meh 2.0 Somewhere In The Middle [blog of Gary Gale from Nokia, April 6, 2012] in which you can find his Platform, APIs & Apps: Building the Where Ecosystem presentation at Where 2012 with detailed speaker notes as well
– Nokia Location Platform: The Leading Where Platform [Nokia microsite since Oct 26, 2011]

– Nokia Maps at Where 2012 round-up [Nokia Conversations, April 6, 2012]
– Nokia at Where 2012 [Nokia Conversations, April 3, 2012]
– The location business – Nokia’s Where Platform [Nokia Conversations, April 3, 2012]
– Nokia Location & Commerce – Christof Hellmis, VP, Map Platform, Nokia [nokianederland YouTube channel, Dec 2, 2011]
– Nokia Maps Suite 2.0 graduated! [to a commercial offering] [Nokia Beta Labs blog, April 4, 2012]
– New features on maps.nokia.com and Nokia Maps Suite for Symbian [Nokia Conversations, April 4, 2012]; see also Nokia Maps set the 3D world on fire, with heat maps [Nokia Conversations, July 27, 2011], Nokia Maps for Web update [Nokia Conversations, Oct 25, 2011], Nokia Maps 3D now with navigation [Nokia Conversations, Dec 7, 2011], and the associated Nokia launches photorealistic 3D models of metropolitan areas for Ovi Maps [Nokia press release, April 19, 2011] from last year, all related to that topic
– Nokia Lumia 900 – Drive and Maps [nokia YouTube channel, April 3, 2012]
Experience the amazing everyday and see just why the Nokia Lumia 900 http://nokia.ly/HYUsNk is beautifully different. Want to feel like a local anywhere? Nokia Drive and Nokia Maps give you comprehensive mobile navigation and the insider knowledge to make it happen. With support across 95 countries, you’ll get accurate turn-by-turn directions to the destination of your choice, as well as information on all the cool places to visit when you get there. Drive and Maps is one in a series of 5 quick introduction demos to the wonderful world of Nokia Lumia. Each video highlights different hubs and features, letting you dive deeper into the world of Nokia with Windows Phone.
– New Maps, Drive and Transport in depth [Nokia Conversations, March 1, 2012]
– Expert Advice: Location Context, Relevance for Revenue [by Christopher Peralta from Nokia in GPS World, Jan 31, 2012]
– NAVTEQ® Map Selected By Galigeo To Power Geospatial Business Intelligence [Nokia Location & Commerce press release, March 29, 2012]: “Pioneering specialist integrates the ‘Where’ factor into business analysis”
– Peugeot, Dacia, Audi, Scania, Nikon, and Yandex becoming NAVTEQ® Map – just the new additions in Q1 2012 [from Nokia Location & Commerce press releases]
Of all the above, one reference is particularly relevant to the subject of The Where Platform: Gary Gale’s (Director of Places, Nokia Location & Commerce) session on Tuesday, April 3rd at 1:40pm in Yerba Buena Salon 10-11. There is already a downloadable version of his Platform, APIs & Apps: Building the Where Ecosystem presentation (with speaker notes here), which by itself provides a lot of background information worth studying.
To spare you a lot of time and information searching, I am providing below a complete overview of the whole effort:
– with The Where Platform they are trying to get to smart data: i.e. combining sets of behavioral and contextual data about the real world
– such a direction is coming from the observation of the following trend by Nokia:
which has led to the discovery of the following strategic foundation for their redefined Location & Commerce business, i.e. The (so called) Where Platform:
For developers there is an evolving set of platform APIs:
which were described by Gary Gale this week as follows: [this was actually announced at the event]
We already have a set of modular, configurable, highly performant APIs that are easy to use and to integrate, with an active developer community who appreciate our simple and fair terms of use. For the web, we have JavaScript APIs for Maps and for Places as well as a new Places web service API [which was actually announced at the event], more of which in a few moments. We’re going to be unifying the JavaScript APIs for Maps and for Places into a single API under the Nokia Maps for JavaScript API banner.
There’s also our Map Image web service API and our upcoming Maps API for HTML5, which I’ll talk more about in a few slides’ time.
And for native mobile use, there’s our Maps API for Qt and our Places API for JavaME, and coming later this year our Maps API for Windows Phone.
APIs are of course utterly critical to the Where platform and the Where ecosystem, but we also need to ensure that we cover all the screens that act as touch-points between the digital and real worlds for people throughout their day. As I move from my computer at work, to my laptop, to my in-car nav system, to my tablet, our goal is to have an offering for virtually any of these screens.
…
We’re also announcing a closed beta of our Nokia Maps HTML5 API, which is the first of many huge milestones we hope to achieve to expand our APIs and presence across screens as quickly as possible.
Note: Go to the Nokia Maps API microsite as an easy-to-comprehend directory for these APIs. The JavaScript APIs are under the “Web” banner in that directory (including Positioning, Directions and Traffic as well), while the web service APIs are under the “REST[ful]” banner.
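To make the split between the JavaScript APIs and the RESTful web service APIs concrete, here is an illustrative sketch of how a request to a Places-style web service might be assembled. The host, path and parameter names below are my own assumptions for illustration only; consult the Nokia Maps API microsite for the actual service contract.

```python
# Illustrative sketch only: composing a RESTful "Places" search request.
# Endpoint, path and parameter names are hypothetical, not Nokia's documented API.
from urllib.parse import urlencode

def build_places_search_url(query, lat, lon, app_id, app_code):
    """Compose a places-search URL around a point (hypothetical endpoint)."""
    base = "https://places.example.nokia.com/places/v1/discover/search"
    params = {
        "q": query,              # free-text search term
        "at": f"{lat},{lon}",    # search centre as "lat,lon"
        "app_id": app_id,        # developer credentials (placeholders here)
        "app_code": app_code,
    }
    return f"{base}?{urlencode(params)}"

url = build_places_search_url("coffee", 52.5310, 13.3849, "DEMO_ID", "DEMO_CODE")
print(url)
```

The JavaScript Maps/Places APIs wrap this kind of call behind browser-side objects; the web service flavour is what server-side or non-JavaScript clients would hit directly.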
The very basis for all that is certainly the advanced mapping capability which came to Nokia in 2008 with the acquisition of NAVTEQ. The state of that mapping was quite well presented a year ago in the following video-recorded presentation:
Then at the end of October 2011 (Nokia World 2011, Oct 26-27, 2011) came the announcement of The Where Platform as the enabler for The (so called) Third Phase of Mobility:
where content of different kinds is made available to different kinds of smart devices via The Where Platform of smart data services (Mapping, Directions/Guidance, Places/Search etc.). Thanks to the built-in learning capabilities of that platform we will have a continuously improving digital and also predictive model of the physical world at hand, which is going to empower our everyday life in a tremendous way. Our life as both individuals and as active members of a collaborative society.
Nokia is going with that as far as drawing the following parallel:
Watch the following recorded presentation [on slideshare] about that:
This presentation describes the framework envisaged by Nokia for:
Creating the Third Phase
1 Real-World Computable
01 Index: An index of the things in the real world |
02 Platform: Making the real world computable |
CREATING A COMPUTABLE MODEL OF THE REAL WORLD
2 Sensed Behaviour
01 Sensors: [see above under the “sensor evolution”] |
02 Experiences: Drive, Traffic, Transit, Live View, Nearby, Maps, Tracks, Pulse |
GATHERING DATA IN THE REAL WORLD
3 Learning Platform
| Real-World Computable | Sensed Behaviour | Smart Data, Real World Answers |
KNOWLEDGE FOR THE REAL WORLD
As such we arrive at: Nokia’s Smart Data Analytics – Understanding You and the World [nokia YouTube channel, Nov 9, 2011]
Here I would also recommend a presentation [on slideshare] [July 14, 2011] on Nokia’s Big Data and Data Analytics present and future, from which I will include here only the following two illustrations as they provide a very useful further explanation for all of the above:
NEW! Location & Commerce To Spearhead Nokia’s Revised Services Mission
Location & Commerce combine Nokia’s and NAVTEQ’s leading positions in social location services and location data
– It is a testimonial of our success and a natural next step in our journey
Together we will deliver a complete and differentiated offering to consumers, business customers and advertisers
– Our intent is to become the leading platform in the world
Success comes from combining unique reference data sets with high consumer engagement
Combining Different Types Of Data to Create Smart Data – Our Unique Opportunity
Derived data – data with intelligent meaning – is required to feed online services with what serves the user at what time. As relevancy is the differentiator between useful information and spam, derived data is of extreme value.
Going deeper into Big Data and Data Analytics is possible via the following recorded discussions and presentations:
Amy O’Connor in theCube: Nokia Looks to Hadoop for Transforming Data Solutions and Consumer Apps [SiliconANGLE blog, Nov 9, 2011]
The troubles facing smartphone manufacturer Nokia have been front and center a lot lately, so seeing them at Hadoop World 2011 shines a light on their future intent. Nokia Senior Director of Analytics, Amy O’Connor, came into theCube for an interview with John Furrier and Dave Vellante about how Nokia is using Hadoop and unstructured data to provide data services for their customers. The discussion ranged from the gathering of information from customers, to privacy and anonymization, and most importantly to how the cellphone maker intends to use big data solutions such as Hadoop to build and guide their infrastructure decisions.
O’Connor says that Nokia really has two businesses coming together: the mobile phone business and their location-based business. Many of the location-based setups for Yelp, Yahoo! and others happen to be based on Nokia’s maps. The first phase was to allow phones to go mobile, the second phase was making computers go mobile, and the third phase has been to congeal data and physical presence: straight-up augmented reality.
The phones that Nokia produces collect a great deal of data from sensors in the phone, from customer relationships, from how they’re used, and how they interact with the network and one another. As a result, Nokia might be a company who manufactures phones; but they produce a lot of data as exhaust. That means that the smartphone maker has a huge amount of product that they need to then manage.
The data challenges that Nokia faces with respect to data happen to be myriad, but the biggest one has always been privacy. “We’ve traditionally been a company who have leaned towards the side of anonymized data and privacy,” O’Connor told John Furrier. “And we’re a global company…that means it’s the biggest, biggest concern that we have.”
While the Nokia Senior Director wanted everyone to know that privacy is a huge concern and direction for the phone maker, John wanted to know more about how they used Hadoop to perform solutions for all the data they’re gathering.
Nokia is currently running a Hadoop system. Since each division faced a great deal of data challenges, people started pulling from the open source community and decided to centralize a bunch of Hadoop solutions. They decided to make one big shift and centralize their data analysis division; but they intended to do it with a hub and spokes. The beating heart of analysis at Nokia happens to be a Hadoop system, but it feeds satellite projects and analysts who can take the data and transform it from that point.
Many of the cellphone maker’s products happen to be consumer apps. These apps are enabled via data, they consume, transform, and manufacture data and all of that needs to move through the infrastructure. As a result, Nokia felt that the centralized aspect of using Hadoop at the center as a command and control and data warehouse center would give them the most agile setup for scaling and bringing data to their customers.
“Technology keeps changing, and I’ve been in the industry a long time, and it keeps changing and if we don’t get in front of that, we’ll fall behind and someone else will take over,” O’Connor said. “We have a 120 terabyte warehouse in Teradata…” Instead of pushing further data into that Teradata warehouse that just won’t fit or would overwhelm the data scientists—or worse, runs on unstructured data—Nokia has sought to put it through Hadoop so that it could be transformed and brought back in again.
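The transform-in-Hadoop, load-back-to-warehouse pattern O’Connor describes can be sketched as a Hadoop-streaming style map and reduce pair: raw, semi-structured logs go in, and only a compact structured aggregate comes out for the warehouse. The log format and field names below are invented purely for illustration.

```python
# Minimal sketch of the Hadoop-streaming style transform described above:
# raw usage logs -> map -> shuffle/sort -> reduce -> structured aggregate.
# The "timestamp|device|app" log format here is a made-up example.
from itertools import groupby

def mapper(lines):
    """Emit (app_name, 1) for each raw log line like 'timestamp|device|app'."""
    for line in lines:
        _, _, app = line.strip().split("|")
        yield app, 1

def reducer(pairs):
    """Sum counts per app; Hadoop would deliver the mapper output sorted by key."""
    for app, group in groupby(sorted(pairs), key=lambda kv: kv[0]):
        yield app, sum(count for _, count in group)

raw = [
    "2011-11-09T10:00|n9-001|maps",
    "2011-11-09T10:01|n9-002|drive",
    "2011-11-09T10:02|n9-001|maps",
]
print(dict(reducer(mapper(raw))))  # → {'drive': 1, 'maps': 2}
```

In a real deployment each function would run as a separate streaming job stage across the cluster; only the small per-app aggregate would flow back into the 120 TB Teradata warehouse.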
Hadoop World 2011: Changing Company Culture with Hadoop [Cloudera video with presentation [on slideshare], Nov 9, 2011]
We are living in a time of tremendous convergence, convergence of mobile, cloud and social… This convergence is forcing companies to change. At Nokia, we are changing the way we make decisions, from a manufacturing model to a data driven one. Yet making cultural changes is one of the hardest things to accomplish. In this talk, Amy O’Connor will highlight the journey Nokia is taking to evolve its culture – from building a platform for cultural evolution on top of Hadoop, to the administration of Nokia’s data, to how the company conducts the analysis that is enabling Nokia to compete with data.
See also:
– big data analytics [SearchBusinessAnalytics.com definition article, January 2012]
– Hadoop [SearchCloudComputing.com definition article, June 2008]
– Big data [wikipedia article], Analytics [wikipedia article], Apache Hadoop [wikipedia article]
– Hadoop Players Question Forrester’s Take On Leaders [InformationWeek, Feb 6, 2012]: “Forrester’s first-ever Hadoop market assessment draws mixed reactions, both for its leader rankings and for the players who were left out.”
– The Forrester Wave™: Enterprise Hadoop Solutions, Q1 2012 [Forrester Research, Inc. report, Feb 2, 2012]
Amazon Web Services, IBM, EMC Greenplum, MapR, Cloudera, And Hortonworks Lead This Emerging Market, With Seven Others Serving Key Niches Close Behind
In Forrester’s 15-criteria evaluation of enterprise Hadoop solution providers, we found that in the Leaders category, Amazon Web Services led the pack due to its proven, feature-rich Elastic MapReduce subscription service; IBM and EMC Greenplum offer Hadoop solutions within strong EDW portfolios; MapR and Cloudera impress with best-of-breed enterprise-grade distributions; and Hortonworks offers an impressive Hadoop professional services portfolio. Strong Performer Pentaho provides an impressive Hadoop data integration tool. Of the Contenders, DataStax provides a Hadoop platform for real-time, distributed, transactional deployments; Datameer has a user-friendly Hadoop/MapReduce modeling tool; Platform Computing and Zettaset offer best-of-breed Hadoop cluster management tools; and Outerthought has optimized its Hadoop platform for high-volume search and indexing. HStreaming is a Risky Bet with a solution that is strong in real-time Hadoop.
Nokia: Using Big Data to Bridge the Virtual & Physical Worlds [Cloudera channel on vimeo, April 5, 2012]
Nokia’s goal is to bring the world to the third phase of mobility: leveraging data to make it easier to navigate the physical world. Nokia relies on a technology ecosystem with Cloudera’s Distribution including Hadoop at its core to achieve this goal.
Regarding the new Location & Commerce business see also:
– Nokia renews mission for mobile and location based services; appoints Michael Halbherr Executive Vice President [Nokia press release, June 22, 2011]
– Biography of Michael Halbherr [Nokia Leadership Team]
– Nokia under transition (as reported by the company) [this blog, March 11, 2012] from which I will copy here the following strategic statements:
As of October 1, 2011, the Group formed a Location & Commerce business which combines NAVTEQ and Nokia’s social location services operations from Devices & Services. Location & Commerce business is an operating and reportable segment.
…
Location & Commerce develops a range of location-based products and services for consumers, as well as platform services and local commerce services for the Group’s feature phones and smartphones ([96] in support of our strategic goals) as well as ([96] a portfolio of products for the broader Internet ecosystem, including products for our direct competitors) for other device manufacturers, application developers, Internet service providers, merchants, and advertisers. Location & Commerce also continues to serve NAVTEQ’s existing customers both in terms of provision of content and as a business-to-business provider of map data ([56] providing comprehensive digital map information and related location-based content and services for mobile navigation devices, automotive navigation systems, Internet-based mapping applications and government and business solutions). Location & Commerce has profit and loss responsibility and end-to-end accountability for the full consumer experience.…
Location & Commerce:
[97] Our Location & Commerce business aims to positively differentiate its digital map data and location-based offerings from those of our competitors and create competitive business models for our customers.
In the fourth quarter 2011, we conducted our annual impairment testing to assess if events or changes in circumstances indicated that the carrying amount of our goodwill may not be recoverable. As a result, we recorded a charge to operating profit of EUR 1.1 billion for the impairment of goodwill in our Location & Commerce business. The impairment charge was the result of an evaluation of the projected financial performance of our Location & Commerce business. This took into consideration the market dynamics in digital map data and related location-based content markets, including our estimate of the market moving long-term from fee-based towards advertising-based models especially in some more mature markets. It also reflected recently announced results and related competitive factors in the local search and advertising market resulting in lower estimated growth prospects from our location-based assets integrated with different advertising platforms. After consideration of all relevant factors, we reduced the net sales projections for Location & Commerce which, in turn, reduced projected profitability and cash flows.
Location & Commerce’s resources are primarily focused on the development of:
(i) content, which involves the mapping of the physical world and places such as roads and points of interest, as well as the collection of activity data generated and authorized for use by our users;
(ii) the platform, which adds functionality on top of the content and includes the development tools for us and others to create on top of it; and
(iii) applications built on the content and platform.
Our Devices & Services business is a key customer of Location & Commerce. Devices & Services purchases map and application licenses from Location & Commerce for its Nokia Maps service sold in combination with GPS enabled smartphones.
Competition:
[61] With respect to digital map data and related location-based content, several global and local companies, as well as governmental and quasi-governmental agencies, are making more map data with improving coverage and content, and high quality, available free of charge or at lower prices. For example, our Location & Commerce business competes with Google, which uses an advertising-based model allowing consumers to use its map data and related services in their products free of charge. Google has continued to leverage Google Maps as a differentiator for Android, bringing certain new features and functionality to that platform. Apple has also sought to strengthen its location assets and capabilities through targeted acquisitions and organic growth.
Location & Commerce also competes with companies such as TomTom, which licenses its map data and where competition is focused on the quality of the map data and pricing, and Open Street Map, which is a community-generated open source map available to users free of charge. Aerial, satellite and other location-based imagery is also becoming increasingly available, and competitors are offering location-based products and services with the map data to both business customers and consumers in order to differentiate their offerings.
Strategy for the trend: Location-Based Products and Services Proliferation
[97] A substantial majority of Location & Commerce net sales in 2011 came from the licensing of digital map data and related location-based content and services for use in mobile devices, in-vehicle navigation systems, Internet applications, geographical information system applications and other location-based products and services. Location & Commerce’s success depends upon the rate at which consumers and businesses use location-based products and services. In recent years, there has been a strong increase in the availability of such products and services, particularly in mobile devices and online application stores for such devices. Furthermore, as the use of the Internet through mobile devices has been growing rapidly, the anchor of the Internet is moving from the desktops to mobiles. This shift is making location-based content a key element of most Internet experiences. We expect this trend to continue, but we also expect that the level of quality required for these products and services and the ability to charge license fees for the use of map data incorporated into such products and services may vary significantly. By combining our NAVTEQ business with our Devices & Services social location services operations, we believe our Location & Commerce business will be better positioned to capture emerging business opportunities with a broader offering which is no longer limited to digital map data.
Strategy for the trend: Increasing Importance of Creating an Ecosystem around Location-Based Services Offering
[97] Creating a winning ecosystem around our Location & Commerce’s services offering will be critical for the success of this business. The longer-term success of the Location & Commerce business will be determined by our ability to attract strategic partners and developers to support our ecosystem. Location & Commerce is aiming to support its ecosystem by enabling strategic partners and independent developers to foster innovation on top of their location platform. We believe that making it possible for other vendors to innovate on top of Location & Commerce’s high quality location-based assets will further strengthen the overall experience and make our offering stronger and more attractive.
Strategy for the trend: Emergence of the Intelligent Sensor Network
[98] Mobile Internet devices are increasingly being enabled with a rich set of sensors such as a GPS, a camera and an accelerometer which enable interaction with the real world. This interaction also enables the collection of large volumes of rich data which, when combined with analytics, enable the development of increasingly sophisticated, contextually-aware devices and services. We believe the combination of NAVTEQ with our Devices & Services social location services operations will enable Location & Commerce to participate in this industry development and seize new opportunities to deliver new experiences that bridge the virtual with the real world.
Strategy for the trend: Price Pressure for Navigable Map Data Increasing
[98] Location & Commerce’s net sales are also affected by the highly competitive pricing environment. Google is offering turn-by-turn navigation in many countries to its business customers and consumers on certain mobile handsets at no charge to the consumer. While we expect these offerings will increase the adoption of location-based services in the mobile handset industry, we also expect they may lead to additional price pressure from Location & Commerce’s business customers, including handset manufacturers, navigation application developers, wireless carriers and personal navigation device (“PND”) manufacturers, which are seeking ways to offer lower-cost or free turn-by-turn navigation to consumers. Turn-by-turn navigation solutions that are free to consumers on mobile devices may also put pressure on automotive OEMs and automotive navigation system manufacturers to have lower cost navigation alternatives. This price pressure is expected to result in an increased focus on advertising revenue as a way to supplement or replace license fees for map data.
In response to the pricing pressure, Location & Commerce focuses on offering a digital map database with superior quality, detail and coverage; providing value-added services to its customers such as distribution and technical services; enhancing and extending its product offering by adding additional content to its map database, such as 3D landmarks; and providing business customers with alternative business models that are less onerous to the business customer than those provided by competitors. Location & Commerce’s future results will also depend on Location & Commerce’s ability to adapt its business models to generate increasing amounts of advertising revenues from its map and other location-based content.
We believe that Location & Commerce’s PND customers will continue to face competitive pressure from smartphones and other mobile devices that now offer navigation, but that PNDs continue to offer a viable option for consumers based on the functionality, user interface, quality and overall ease of use.
Strategy for the trend: Quality and Richness of Location-Based Content and Services Will Continue to Increase
[98] Location & Commerce’s profitability is also driven by Location & Commerce’s expenses related to the development and expansion of its database. Location & Commerce’s development costs are comprised primarily of the purchase and licensing of source maps, employee compensation and third-party fees related to the construction, maintenance and delivery of its database.
In order to remain competitive and notwithstanding the price pressure discussed above, Location & Commerce will need to continue to expand the geographic scope of its map data, maintain the quality of its existing map data and add an increasing amount of new location-based content and services, as well as using innovative ways like crowd sourcing to collect data. The trends for such location-based content and services include real-time updates to location information, more dynamic information, such as traffic, weather, events and parking availability, and imagery consistent with the real world. We expect that these requirements will cause Location & Commerce’s map development expenses to continue to grow, although a number of productivity initiatives are underway designed to improve the efficiency of our database collection processing and delivery. In addition, we will need to continue making investments in this fast paced and innovative location-based content and services industry, for instance through research and development, licensing arrangements, acquiring businesses and technologies, recruiting specialized expertise and partnering with third parties.
Restructuring in accordance with all that:
[F-64] In September 2011, Nokia announced a plan to concentrate the development efforts of the Location & Commerce business in Berlin, Germany and Boston and Chicago in the U.S., and other supporting sites, and plans to close its operations in Bonn, Germany and Malvern, U.S. As a result, Location & Commerce recognized a restructuring provision of EUR 25 million.
Huawei Enterprise after its 1st year and the 2012 strategy
March 26, 2012 9:06 pm / 1 Comment on Huawei Enterprise after its 1st year and the 2012 strategy
Huawei Enterprise at CeBIT 2012 – Press Conference – Geoff Johnson, Research VP, Gartner [HuaweiEnterprise YouTube channel, March 13, 2012]
Huawei Enterprise at CeBIT 2012 – Press Conference – David He discusses our first year [HuaweiEnterprise YouTube channel, March 13, 2012]
Started with: Huawei Boosts Investment In European Enterprise Tech [TechWeekEurope, Sept 16, 2011]
Chinese networking and telecoms company Huawei has announced a programme of investment to kick-start its entry into the enterprise market in Western Europe.
Huawei first launched its Western European enterprise division in 2010, and has since built up a workforce of around 400 employees, spanning the UK, Ireland, France, Germany, Spain, Portugal, Italy, Switzerland and ‘Benelux’. The division is headquartered in Amsterdam.
Although the company was unable to name an exact figure, it said the new investment would enable it to double its headcount in the region year-on-year, as well as to build a new sales channel structure in Europe.
“The European region is a key market for Huawei Enterprise,” said Mario Fan, President for Huawei Enterprise Business for Western Europe. “Since we first established our European presence a year ago, we have made tremendous progress. We will build on this strong start by placing an emphasis on developing partnerships with customers, integrators and resellers at all levels.”
…
Huawei is now commencing a regional tour of 18 cities in its Enterprise Business Roadshow, starting in Amsterdam and ending in Utrecht on 7 December. Customers and prospective partners can visit the company’s showtruck for demonstrations of its data centre and networking technologies, as well as its corporate communications solutions.
…
– Huawei Showtruck Launch [HuaweiEnterprise, Sept 30, 2011]
– Huawei Enterprise rolls into Middle East [ITP.net, Jan 19, 2011] see also: follow-up on that [ITP.net, Aug 7, 2011] + another follow up [CommsMEA, March 26, 2012]: “to grow its enterprise business in the Middle East by between 80-90% in 2012 to reach revenues of up to $600 million … achieved revenues of about $320 million in 2011, with a year-on-year growth rate of about 85-90% … particularly strong growth in the Gulf and Iraq, with government projects and the oil and gas sector”
Huawei has announced the official launch of its enterprise business unit in the Middle East.
The company is kicking off its regional enterprise business, which will focus on providing end-to-end ICT solutions to key regional vertical sectors, with a roadshow to take in UAE, Saudi Arabia and Pakistan.
The company is already well established in the region through its telecoms operations, and already has several enterprise customers such as Saudi Aramco, Saudi Ministry of Health, and the new Maktoum Airport in Dubai, but the new unit will focus on provision of solutions to the enterprise segment.
Huawei will be focusing on government and semi-government entities in the region, particularly in the energy and power sector, transportation, oil and gas, and smart cities. The company’s enterprise offerings span infrastructure solutions from data communications (including security and firewalls, switches, routers, VPN, and voice and video communications solutions such as high-end unified communications), to transmission equipment that helps integrate industrial automation in manufacturing plants, to setting up large-scale wireless broadband and WiMAX solutions.
Speaking at a launch event in Dubai, Mr Dongwu, general manager for the Middle East for Huawei Enterprise, commented: “We see convergence in the IT and communications technology, with this kind of convergence, enterprises need both kinds of technologies. As Huawei we have a solid background and experience in communications technologies over the past 20 years, and our IT experience of the past ten years, through business units like Huawei Software, puts us in a unique position that gives us very good opportunities to penetrate the market.”
See also: Huawei unveils Enterprise Business unit [for Middle East] [Reseller Middle East, Jan 18, 2011]
Huawei Climbs ‘Food Chain’ in Cisco Enterprise Challenge [Bloomberg, May 9, 2011]
… “Cisco is clearly the leader in this domain, but we also believe changes are happening,” Leon He, president of solution sales at Huawei’s enterprise business unit, said in an interview in Shenzhen, China. “When those changes occur, the current market and customer needs will change.” …
…
“Enterprise is our core capability,” Cisco Chief Executive Officer John Chambers told investors at a technology forum on April 7. “We’re an enterprise company. That’s where we started.”
The enterprise business and public sector contribute about 46 percent to Cisco’s sales, Chambers said. David McCulloch, a spokesman for Cisco, declined to comment.
“The market Huawei sees is huge,” Huawei’s He said. “If we digitize, we will bring a revolution. When the shift occurs from digital to smart networks, that will be another revolution.”
…
Huawei aims to double annual sales at its enterprise group to $4 billion this year, from $2 billion last year, He said. Within three to five years sales will more than triple again to between $15 and $20 billion, He said. Huawei’s Roese said the company is moving 10,000 employees into the new enterprise division, including 6,000 research and development staff. That’s about 9 percent of the company’s 110,000 staff worldwide.
The initiative is being led by William Xu Wenwei, one of four executive directors on the company’s 13-member board, according to its annual report released last month.
…
See also a follow-up on that [Bloomberg, Feb 29, 2012]:
…
Growth at Huawei Enterprise may be slower than originally anticipated, Xu said, adding that $15 billion in contract revenue by 2015 is a more realistic target. Leon He, another Huawei executive, in May last year gave a sales projection of $15 billion to $20 billion for the division.
That more cautious outlook stems from a change in strategy where Huawei now works more through system integrators such as Spain’s Telefonica SA (TEF) to create solutions for specific industries.
“In the past, in the previous strategy there was more high-level integration so there was more conflict with our partners,” Xu said. “As a result our sales revenue might not be as high as in the past strategy, but we’ll have closer cooperation.”
Still, the enterprise business plans to increase its workforce to more than 20,000 people this year from over 10,000 at the end of 2011, Xu said.
The enterprise unit is making about 40 percent of its sales in China, Xu said, adding that that ratio will probably remain steady through 2015.
…
More of the same kind:
– Huawei targets corporate sector [FT, March 8, 2011]
– Huawei enters the enterprise market [in Norway] – a game changer or just another player? [primesource.no, March 18, 2011]
– Huawei Malaysia Forms Enterprise Division [Hardware Zone Malaysia, July 7, 2011]
(But the partner-driven entry into Malaysia began on Feb 8, 2011: see the Huawei Enterprise Business [Feb 8, 2011] presentation delivered by HD Technology Sdn Bhd., a distributor of storage products and solutions. 51% of HD Technology was acquired by Vasseti Berhad, owned by Vasseti (UK) plc, an investment holding company focused on acquiring the majority of the supply chain of the telecommunications and ICT industry, which is controlled by the entrepreneur Syed Mohd Yusof Bin Tun Syed Nasir, owner of the Concorde Hotel chain in Malaysia.)
– Huawei appoints new VP of Enterprise [in UK and Ireland] [mobile news, Sept, 2011]
– Huawei to Launch HEAP [Huawei Enterprise Advantage Partner] Partner Program in Australia [ARNnet, Sept 16, 2011]
– Huawei Launches Enterprise Business Unit in India [Indo-Asian News Service, Sept 28, 2011]
– Huawei Launches Its U.S. Enterprise Business Through Channels [Huawei Enterprise press release, Oct 5, 2011]
– Huawei Building Up Its Enterprise Muscle [Digital Life, Singapore, Nov 2, 2011]
– Huawei Launches Partner Program in HK and Macau [telecomasia.net, Hong Kong, Nov 4, 2011]
– Webcom Appointed as Huawei Reseller [in South Africa and sub-Saharan Africa] [IT-Online, Nov 11, 2011]
– Huawei Builds Channel Red Army in Europe – a distribution deal with SDG to punt its enterprise kit to resellers in the UK, France and the Netherlands [The Register, Nov 24, 2011]
– Huawei to Reach a Thousand Partners in Europe [Dealer World, Spain, Jan 1, 2012]
– Huawei Hails Thailand as Regional Hub [for enterprise business in Southeast Asia] [Thai News Service, Nov 24, 2011]
2012 market expansion from Germany’s CeBIT to the whole western hemisphere: Huawei Germany Celebrates 10 Year Anniversary [HuaweiEnterprise YouTube channel, March 7, 2012]
Involving resources of Taiwan as well: Huawei’s Enterprise Market Expansion Attempt to Benefit Taiwan’s ICT Supply Chains [CENS, March 26, 2012]
Huawei Technologies Co., a world leading ICT supplier headquartered in mainland China, will place orders with Taiwan’s supply chains in a big way in line with its aggressive goal of boosting sales of its enterprise equipment operation to US$15-20 billion in 2015 from current US$3.9 billion.
The company would record higher outsourcing to Taiwan in 2012 than in 2011, when it purchased NT$110 billion (US$3.6 billion at US$1:NT$30) worth of products from Taiwan. In 2010, Huawei contracted Taiwanese manufacturers to supply NT$99.5 billion (US$3.3 billion) of products.
To quickly gain more market share worldwide, Huawei has decided to increase contracts to Taiwan for ICT products, including servers, switches, routers, mobile phones, network connectivity cards, tablet PCs and touch screens. The company’s global vice president for enterprise business operation, Jia Cholong, stressed that Taiwan is a strong logistics backup for his company’s aggressive global plan.
The company’s contract suppliers in Taiwan include Hon Hai Precision Industry Co., Ltd., Accton Technology Corp., Unizyx Holding Corp., Gemtek Technology Corp., Alpha Networks Inc., and Unimicron Technology Corp.
Jia emphasized that Huawei will team up with local distribution channels worldwide to expand the market pie in the international enterprise ICT sector, instead of acting as a price killer.
He noted that the company’s enterprise ICT operation saw sales rise to US$3.9 billion in 2011 from US$2 billion in 2010 and US$100 million in 2009. The company is aiming for the No. 2 title in the enterprise ICT market and the No. 3 spot in the cloud-computing market in 2015.
The company’s enterprise ICT products include video conferencing equipment, data center, cloud solution, switching equipment, router, firewall, and servers.
In line with its aggressive market plan, the company will double its marketing staff to 20,000 worldwide by the end of this year. In Taiwan, it now has 200 marketing staffers and will open Huawei Certified Datacom Associate (HCDA), Huawei Certified Datacom Expert (HCDE) and Huawei Certified Datacom Professional (HCDP) centers to verify telecom equipment for it.
With major product expansion in Huawei Server line [Huawei Enterprise product catalogue, Feb 27, 2012]
…
Huawei Enterprise at CeBIT 2012 – Press Conference – Johann Strauss on Technology and Intel [HuaweiEnterprise YouTube channel, March 13, 2012]
See also:
– Huawei Unveils Tecal V2 Servers with Intel® Xeon® E5-2600 Inside [Huawei Enterprise press release, March 20, 2012]
Huawei Showcases A Better Way for Enterprises in the ICT Era at CeBIT [Huawei Enterprise press release, March 5, 2012]
Huawei, a leading global information and communications technology (ICT) solutions provider, along with its partners, will be showcasing its comprehensive range of integrated ICT solutions at CeBIT, one of the largest technology trade shows. Addressing specific needs of enterprise customers across various industries, Huawei helps global organizations tap on opportunities presented by the changing trends in today’s ICT era.
Bringing to life its vision of “A Better Way” for enterprises, Huawei, along with 12 of its solution partners, will be showcasing its comprehensive portfolio of products and solutions at CeBIT.
“CeBIT is one of the most important trade shows globally and is a key platform for us to showcase how Huawei’s integrated ICT approach can help enterprises meet the challenges of tomorrow,” said, David He, President of Marketing, Huawei Enterprise. “We will also be making a series of major announcements regarding our channel partner program, as well as unveiling our latest line-up of products and solutions.”
At the exhibition booth, Huawei experts and consultants will be on-site to conduct demonstrations on how its innovations and capabilities in cloud computing, network solutions, and unified communication and collaboration can help enterprise organizations improve business operations and achieve competitiveness in today’s changing ICT landscape. Vertical solutions that address unique customer needs in their respective industries will also be showcased. Demonstrations will also be held in a custom-built Huawei Container Data Centre. Highlights of the products and solutions on display include:
– Huawei @ MWC 2012: Industry Trends in Cloud Computing – Ron Raffensperger, Director of Cloud Computing Marketing, discusses Huawei’s approach to the latest trends in cloud computing, including consulting services, software as a service and ecosystem alliances.
– Huawei Modular Data Center (IDS2000) – Huawei talks about how its modular data center system provides Simple Deployment, Scalable Design, Energy Saving and Smart Management for enterprises and anyone building data centers.
– Cloud computing – Overview of Huawei’s cloud computing capabilities and related IT/IP products, as well as data centre security solutions, desktop cloud solution, media cloud solution, etc.
– Huawei eSight Mobile – a mobile/tablet app for network administrators. Huawei provides a new application on iPad/iPhone (with Android to follow) that lets IT administrators do some or all of their network management on their mobile devices, as presented at CeBIT 2012.
– Enterprise ubiquitous broadband network – Introduction to Huawei’s network solution, as well as solutions for WAN connectivity, campus networks and enterprise branches; Huawei’s network management solution eSight will also be showcased.
– Huawei eSpace Unified Communications – Huawei provides this system to connect collaborating teams in enterprises, schools and other organizations.
– Unified communication & collaboration – Demonstration of Huawei’s high-definition telepresence solution, Huawei’s eSpace Unified Communications Solutions, and eSpace Cloud Contact Center, etc.
– Industry solutions – Comprehensive overview of vertical-specific solutions including eGovernment and public services, Virtual Teller Machines (VTM) for the finance industry, and Intelligent Transportation System (ITS), etc.
In addition to showcasing its products and solutions, Huawei will conduct a series of open speeches and technical symposiums at its booth throughout the week-long CeBIT event. Ranging from 15 minutes to 50 minutes, these sessions allow industry professionals to better understand how Huawei’s customer-centric innovations can be applied to their business. A full schedule of open speeches and technical symposiums is available at http://www.huawei.com/minisite/cebit2012/index.html.
Huawei will also organize a series of events specially for customers and channel partners on March 7. At the ICT Transformation and Innovation Forum, Huawei customers will hear from senior Huawei executives and industry experts on the outlook for the industry. Channel partners will learn more about Huawei’s outlook and direction for the year at the Huawei Channel Conference.
On March 8 and 9, various Huawei executives will speak at CeBIT-organized events, including the Broadband World Forum and CeBIT Lab Talks. At the Broadband World Forum, Huawei executives will present on Omnipresent Wireless Broadband, as well as its vision and strategies for the Enterprise Network in the Era of Cloud and Internet of Things (IoT). Huawei will also address industry professionals at CeBIT Lab Talks on topics including green intelligent cities, the cloud era, and enhancing public-private clouds by optimizing IT infrastructure.
Huawei’s booth is located at Hall 13, Booth C23 and its custom-built Container Data Centre is located at the open-air sites outside Halls 12 and 13. CeBIT is held from March 6 to 10, 2012, in Hanover, Germany. For more information on Huawei’s participation and events, please visit http://www.huawei.com/minisite/cebit2012/index.html.
Huawei’s Booth at CeBIT2012
About Huawei Enterprise Business Group
Huawei Enterprise Business Group (“Huawei Enterprise“) is one of Huawei’s three business groups (BGs) [the other two are Devices and Telecoms Infrastructure]. Leveraged by its strong R&D capabilities and comprehensive technical expertise, Huawei Enterprise provides wide ranging and highly efficient ICT solutions and services. Together with partners, Huawei Enterprise offers solutions for vertical industry and enterprise customers globally including government and public sectors, transportation, power grids, energy, and finance, as well as commercial enterprises in many fields. These innovative and leading solutions cover network infrastructure, UC&C, cloud computing & data center, and industry application solutions.
For more information, please visit http://enterprise.huawei.com
Follow us on Twitter: www.twitter.com/huaweiENT
Facebook: http://www.facebook.com/HuaweiEnterprise
LinkedIn: www.linkedin.com/groups/Huawei-Enterprise-4070523?home=&gid=4070523
CeBIT 2012 – Keynote – John Roese – The ICT Approach to a Smarter Enterprise [HuaweiEnterprise YouTube channel, March 12, 2012]
Huawei Outlines A Better Way to Accelerate the Enterprise Evolution [Huawei Enterprise press release, March 6, 2012]
Huawei, a leading global information and communications technology (ICT) solutions provider, along with its partners, today showcased a wide range of integrated ICT solutions that meet the specific needs of different industries, demonstrating how Huawei’s capabilities are able to help global organizations tap on opportunities presented by the changing trends in today’s ICT era at CeBIT, one of the largest technology trade shows.
At the keynote presentation on the first day of CeBIT, John Roese, Senior Vice President of Huawei’s North America R&D, shared with over 400 enterprise CIOs and IT experts the new reality for enterprises brought about by the “consumerization of IT”, and the paradigm shift required to address it. Compared to the continuous innovations in the device and internet industries which have resulted in the rapid growth and development of the consumer market, the enterprise technology market has lagged behind in innovation.
Enterprises need to respond to new challenges brought about by the “consumerization of IT”, and can no longer ignore new technologies that are being introduced into the enterprise space. With the convergence of communications, enterprise and consumer technologies, enterprise organizations need to fully leverage innovative ICT technologies to better address challenges and manage their business so as to enhance competitiveness.
“Huawei’s comprehensive capabilities and experience in communications technology, enterprise and consumer industries places us in a unique position to help our enterprise customers succeed in today’s era of ICT convergence. Based on customer-centric innovations, Huawei’s solutions help enterprise customers accelerate the shift in the evolution and development of ICT,” said Roese.
John Roese, Senior VP of Huawei’s North America R&D, Delivering a Keynote Speech On Day One of CeBIT
Bringing to life its vision of “A Better Way” for enterprises, Huawei, along with 12 of its solution partners, will be showcasing its comprehensive portfolio of integrated ICT products and solutions at CeBIT. This includes capabilities in cloud computing, data centers, network solutions, and unified communications and collaboration that help enterprise organizations improve business operations and achieve competitiveness in today’s changing ICT landscape. Additionally, sector-specific ICT solutions will also be showcased for industries including government and public sector, electricity, transportation, finance, energy and large enterprises. Huawei’s container data centre will also be situated outside the Huawei exhibition hall to showcase mobile data center capabilities.
NEW TABLETS PREMIERE AT HANNOVER CeBIT [HuaweiEnterprise YouTube channel, March 12, 2012]
Huge sector-specific marketing efforts already started (excerpts only if any):
Only Energy so far: they are definitely piloting the return effects from such an approach
– Huawei to Appear at World Top Gas Conference In 2012 [Huawei Enterprise press release, Jan 18, 2012]
– Huawei Participates In 2012 Summits of Energy Industry [Huawei Enterprise press release, Jan 18, 2012]
– Huawei Assisted PetroChina To Organize 2011 Annual Convention [Huawei Enterprise press release, Jan 17, 2012]
– Accelerating Energy Business Development, Huawei Attracts Visitors From Home and Abroad [Huawei Enterprise press release, Jan 17, 2012]
– Huawei to Attend 2012 Offshore Technology Conference [Huawei Enterprise press release, Jan 16, 2012]
– Huawei Makes a Debut in the 20th World Petroleum Congress [Dec 4, 2011, Doha, Qatar] [Huawei Enterprise press release, Jan 15, 2012]
– Huawei Eyeing Opportunities with Smart Energy Technology [Huawei Enterprise press release, Oct 24, 2011]
– Huawei Accelerates Enterprise Informationization with Digital Energy Solution [Huawei Enterprise press release, March 20, 2012]
Huawei, a leading global information and communications technology (ICT) solutions provider, today announced the launch of its digital energy solution at the 12th China International Petroleum & Petrochemical Technology and Equipment Exhibition (“CIPPE”), the world’s largest petroleum exhibition. At the exhibition, Huawei’s Enterprise Business Group will be showcasing how its ICT solutions provide “A Better Way” to meet the specific needs of energy enterprises to enhance productivity and accelerate informationization.
Hank Stokbroekx, Deputy of Enterprise Services, Huawei Enterprise, announces Global Programs for Channel Partners and ICT Training and Certification [HuaweiEnterprise YouTube channel, March 13, 2012]
Huawei Enterprise Launches Global Programs for Channel Partners and ICT Training and Certification [Huawei Enterprise press release, March 7, 2012]
Huawei, a leading global information and communications technology (ICT) solutions provider, today officially launched its channel partner program, aimed at driving growth for its Enterprise business, as well as the industry’s most comprehensive ICT training and certification program.
“We are dedicated to providing highly efficient ICT solutions and services to our customers, and an important component of our plan is to expand our reach by building a healthy channel partner ecosystem,” said William Xu, Senior President of Huawei and CEO of Huawei Enterprise Business Group. “Together with our channel partners, we intend to lead the industry with a dual devotion to customer-centric innovation and service, while tapping in to Huawei’s vast experience to help enterprise customers navigate the challenges and opportunities in today’s ICT era.”
“We laid the foundation for Huawei Enterprise in 2011 by introducing our value proposition that Huawei Enterprise represents ‘A Better Way’ to do business,” said David He, President of Marketing, Huawei Enterprise. “This credo guides everything we do at Huawei Enterprise—from our commitment to our customers, partners, and the entire industry to our devotion to innovation through extensive investments in R&D.”
Supporting the long-term development of the Enterprise business, Huawei Enterprise’s latest channel strategy is designed to broaden its roster of partners by offering a comprehensive product portfolio, cutting edge R&D, and strong long-term expansion opportunities.
Huawei will be building the channel partner ecosystem by recruiting the following partners:
Tier-1 Resellers: Distributors and value-added partners (VAPs) are partners who purchase products directly from Huawei and receive support directly.
Tier-2 Resellers: Platinum, gold and silver partners, and recognized partners add market capabilities and influence in their specific regional/industry markets.
“The channel is a key component to Huawei Enterprise’s business growth strategy, and we are approaching partners with a Win-Win model, creating business opportunities for our partners while maximizing value for end customers,” said Robert Yang, President of Channel Sales Department, Huawei Enterprise. “As we grow our business, our broad and comprehensive product portfolio, backed by our extensive R&D expertise, provides our partners with huge opportunities for business growth.”
Industry’s most comprehensive ICT training and certification program
In parallel with its development in the Enterprise market, Huawei has launched the industry’s most comprehensive ICT training and certification program. The Huawei Enterprise Training and Certification Program draws on Huawei’s more than 20 years of experience in developing ICT talent in more than 160 countries around the world. It is the only program of its kind that covers all ICT technical fields, and Huawei Enterprise has designed and positioned the program so that it will eventually become the leading ICT technical qualification.
Huawei Enterprise has quickly assumed a leadership position in the ICT industry, and this effort will help establish training and certification standards for the entire industry to facilitate its growth in the future. The company aims to certify 300,000 professionals by 2015. Huawei will also be recruiting partners to become Huawei Authorized Learning Partners (HALP).
In 2011, Huawei Enterprise’s global sales contracts totaled US$3.8 billion, up from US$2 billion in 2010. Established in 2011, Huawei Enterprise has capitalized on Huawei’s overall strength as one of the world’s leading ICT companies in IP and mobile and fixed networks with an international presence in over 140 countries and longstanding investments in R&D, for its continued growth in the market.
Huawei Enterprise at CeBIT 2012 – Press Conference – Andreas Neuherz Discusses New Products [HuaweiEnterprise YouTube channel, March 12, 2012]
Huawei Enterprise Bolsters Comprehensive Portfolio with New Enterprise Network and Server Products [Huawei Enterprise press release, March 7, 2012]
Switches, access routers, WLAN products and servers combine customer-centric innovation with latest ICT technologies
Huawei, a leading global information and communications technology (ICT) solutions provider, today announced the launch of multiple enterprise products at CeBIT 2012, including S9700 series high-end switches (3 models), S5700-LI series mid-range switches (8 models), AR200/150 series enterprise access routers (7 models), WLAN products (7 models of ACs and APs), Open Service Platform (OSP) forum and TecalTM V2 servers (6 models) that are supported by Intel® Xeon® E5 processors. Based on its customer-centric approach to innovation, these new products demonstrate Huawei’s continual efforts to meet the ever-changing ICT needs of global enterprises, providing customers with a better way to do business.
“Customer-centric innovation is at the core of everything we do at Huawei Enterprise,” said David He, President of Marketing, Huawei Enterprise. “When it comes to designing hardware solutions, we approach innovation from the viewpoint of our customers, examining what their specific needs are in terms of features and operability. Our new WLAN products, switches, routers and servers embody this focus on the customer and we are confident that our enhanced product portfolio will set new industry standards.”
Switches, access routers and WLAN products that promote the evolution of “10G Cloud Campus”, “Enterprise Branches” and “Enterprise Mobility”
The S9700/S5700-LI series switches supplement Huawei’s current campus network product family. Among these new products, the S9700 provides a 320 Gbps per-slot switching capacity and the highest 10GE/40GE port density in the industry to cope with emerging high-definition video services and the fast-developing “10G cloud campus”. In addition, the S5700-LI follows an energy-conservation design principle, adopting innovative port sleep, hibernate and wake technology to reduce total power consumption by over 40%.
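The port-sleeping idea behind the S5700-LI’s energy savings can be sketched as a toy model (the power figures, timer and class below are invented for illustration, not Huawei firmware): each port tracks how long it has been idle, drops into a low-power sleep state after a threshold, and wakes as soon as traffic returns.

```python
# Toy model of a port sleep/hibernate/wake energy policy (illustrative
# only, not Huawei firmware). Assumed figures: an awake port draws
# 0.9 W, a sleeping port 0.1 W, and a port sleeps after 30 idle seconds.
ACTIVE_POWER_W = 0.9
SLEEP_POWER_W = 0.1
IDLE_LIMIT_S = 30

class Port:
    def __init__(self):
        self.idle_seconds = 0
        self.asleep = False

    def tick(self, saw_traffic):
        """Advance the port's idle timer by one second."""
        if saw_traffic:
            self.idle_seconds = 0
            self.asleep = False          # traffic wakes the port
        else:
            self.idle_seconds += 1
            if self.idle_seconds >= IDLE_LIMIT_S:
                self.asleep = True       # hibernate the idle port

    def power(self):
        return SLEEP_POWER_W if self.asleep else ACTIVE_POWER_W

# A 48-port switch where half the ports see no traffic for a minute:
ports = [Port() for _ in range(48)]
for second in range(60):
    for i, p in enumerate(ports):
        p.tick(saw_traffic=(i < 24))     # only the first 24 ports are busy
total = sum(p.power() for p in ports)    # 24*0.9 + 24*0.1 = 24.0 W
```

Under these assumed numbers the half-idle switch draws 24 W instead of 43.2 W at full power, a saving of over 40% on port power, which is the order of magnitude the press release claims.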
Here’s Huawei’s new advanced router for enterprise networks – the type of router that retail stores, mid-sized companies, etc. can buy.

WiFi technology is promoting the evolution of enterprise networks to integrated wired/wireless networks, and enterprises have an urgent need for a unified network management platform to manage wired, wireless, and IT devices. Huawei eSight enterprise network management software will help enterprises manage and maintain their network devices on a unified basis and improve network service quality. The 7 models of AR200/150 enterprise access routers offer various industry-leading wireless network solutions for customers to choose from. The AR G3 routers, based on an industry-leading third-generation architecture, are supported by multi-core CPUs and a non-blocking switching network, and integrate voice, data, and security services on one device. This all-in-one design extends cloud-based services to enterprise branch networks, while reducing the customer’s investment by 30%. In addition, the AR G3 routers are based on the Open Service Platform (OSP), allowing partners and end customers to customize them based on their needs and requirements.
Mobile office technology is changing how people work. The explosive increase in WiFi-supported smartphones, tablet computers and laptops, and the growth in WiFi hotspots, all contribute to an ever-growing demand for mobile access. However, current radio technology hinders the development of the mobile office: when an AP has a certain number of users connected, the bandwidth allocated to each user decreases dramatically. The WLAN products Huawei launched today are the only WLAN devices that provide fine-grained QoS control based on user groups. This mechanism provides differentiated services for each WiFi user. Enterprises and WiFi service providers can use flexible QoS policies to guarantee QoS for high-priority users.
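The user-group QoS idea can be approximated with a short sketch of weighted sharing (an illustration only, not Huawei’s actual WLAN scheduler; the group names, weights and capacity are invented): each group receives a slice of the AP’s capacity proportional to its configured weight, so high-priority users keep a usable per-user rate even when a low-priority group has many more members.

```python
# Illustrative weighted bandwidth split across user groups (not
# Huawei's real scheduler): each group's share of the AP's capacity
# is proportional to its weight, then divided among its users.
def allocate(capacity_mbps, groups):
    """groups: {name: (weight, user_count)} -> {name: per_user_mbps}"""
    total_weight = sum(w for w, _ in groups.values())
    per_user = {}
    for name, (weight, users) in groups.items():
        group_share = capacity_mbps * weight / total_weight
        per_user[name] = group_share / users if users else 0.0
    return per_user

# Hypothetical 100 Mbps AP: the "executive" group keeps ample per-user
# bandwidth even though the "guest" group has ten times as many users.
shares = allocate(100, {"executive": (3, 5), "guest": (1, 50)})
# executives: 75 Mbps group share -> 15 Mbps each
# guests:     25 Mbps group share -> 0.5 Mbps each
```

This is the essence of group-based QoS: the per-user collapse described in the paragraph above is confined to the low-priority group instead of degrading everyone equally.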
These new products launched today provide more flexible choices for enterprises to keep pace with the fast development of cloud computing, WiFi, and smart terminal technologies in the “10G cloud campus” and “enterprise mobility” era.
Liu Shaowei, President of Huawei Enterprise network product line, said: “Huawei will continuously increase investment in the enterprise network market and devote our efforts to providing competitive products and solutions though constant innovation.”
Intel® Xeon® E5 Processor Powered TecalTM V2 Servers
Huawei’s new TecalTM V2 servers include the rack mount server RH2288 V2, the high-performance blade server BH622 V2 (for the E6000), the XH620 V2 and XH621 V2 nodes (for the high-density datacenter server X6000), and the DH620 V2 and DH621 V2 nodes (for the green and power-efficient cabinet server X8000), all of which focus on data center and enterprise applications. These servers use the newest Intel® Xeon® E5 CPUs, bringing better performance, larger storage capacity and better virtualization usage, which will help internet and enterprise customers accelerate their applications and greatly increase server efficiency. Huawei helps customers accelerate applications such as virtualization, databases and big data by providing hardware and software acceleration solutions. With a holistic system-level design and well-chosen components for power-consumption control, these servers will help customers achieve commercial success.
Thanks to continuous innovation, Huawei servers have achieved more than 30 world records in SPEC (Standard Performance Evaluation Corporation) tests. Adhering to green, energy-saving principles, Huawei TecalTM servers have received RoHS certification. Through the highly efficient use of raw materials, TecalTM servers comply with environmental-protection requirements across their manufacturing, product usage and end-of-life treatment processes. By leveraging its power-saving expertise, Huawei’s servers save 5% to 10% of energy compared with similar products on the market, which helps significantly reduce a customer’s operating expenses.
“TecalTM servers have witnessed outstanding growth in the past three years. With a wide range of products including rack servers, blade servers, high density and scalability data center servers, Huawei servers are used by top ISPs and other well-known enterprises all over the world. We have also been continuously developing and launching competitive products and solutions to accelerate applications by innovation,” said Chen Shijun, General Manager of Huawei Server Product & Storage Domain.
Huawei Aggressive as Hell in Chasing New Partners [CRN (Connecting The Australian Channel), March 9, 2012]
Huawei this week confirmed more information about its expanding global enterprise channel as it seeks to burrow further into territory dominated by Cisco, HP and other vendors with monster worldwide channel programs.
The broad channel push comes on the heels of a US channel program Huawei launched last year. The global Enterprise Business Group also was formed in 2011 following a restructuring of Huawei Technologies Co. Ltd., the Shenzhen, China-based parent company.
Huawei executives told CRN at the time the company’s goal is to do 100 percent of enterprise sales in the US through channel partners and that “phase one” of the program includes Huawei’s Ethernet switches, routers and video telepresence products made available through US-based solution providers. In recent months, Huawei also has continued to recruit Western-based channel management talent, including former 3Com and HP executive Alex Dobson, now Huawei’s vice president of sales, U.S. Enterprise Group.
Now comes the global program, which Huawei unveiled this week at the CeBIT conference in Germany and which includes several program levels.
Tier-1 Resellers are Huawei distributors and VARs that purchase products directly from Huawei and receive support from Huawei.
Tier-2 Resellers, the group in which Huawei includes its Silver, Gold and Platinum-level partners, add market capabilities and are described by Huawei as “influential” in their specific regions and industry segments.
Huawei also is adding training and certification specific to information and communications technology (ICT), through which it hopes to certify 300,000 professionals by 2015. It also plans to recruit partners to become Huawei Authorised Learning Partners.
William Xu, senior vice president of Huawei and CEO of Huawei Enterprise Business Group, said that a healthy partner ecosystem is a high priority for Huawei’s international expansion.
“Together with our channel partners, we intend to lead the industry with a dual devotion to customer-centric innovation and service, while tapping into Huawei’s vast experience to help enterprise customers navigate the challenges and opportunities in today’s ICT era,” Xu said in a statement.
According to Huawei Enterprise, it did $US3.8 billion ($A3.5 billion) in sales contracts in 2011, up from $2 billion in 2010. Industry sources peg Huawei’s global revenue at about $35 billion.
Several US-based solution providers contacted by CRN said they’d been approached by Huawei and were at least intrigued by the vendor.
“Say this for them: They’re aggressive as hell,” said the CEO of a longtime West Coast-based networking solution provider, who asked that his name not be used because his team is currently discussing the Huawei option. “It’s a good product and it’s priced reasonably, and it seems like they’re in this for the long haul. We’re going to look at the program pretty closely.”
Huawei has sought a stronger global enterprise presence for years but has seen growth stymied due to continued concerns over its alleged ties to the Chinese military – ties that Huawei has continued to deny.
The company has continued to broaden its enterprise product portfolio all the while. At CeBIT this week, it unveiled three high-end S9700 switches, eight S5700-LI midrange switches, seven AR200/150 enterprise access routers, seven WLAN access points, and six Tecal V2 servers supported by Intel Xeon E5 processors.
Huawei Launches Its U.S. Enterprise Business Through Channels [Huawei Enterprise press release, Oct 5, 2011]
Huawei, a leading global information and communications technology (ICT) solutions provider, today announced the formal launch of its Enterprise Business group in the United States. Huawei Enterprise, one of the company’s three main business groups alongside Devices and Telecoms Infrastructure, is well positioned to help fuel future growth – internationally and in the U.S. market.
Huawei’s enterprise business will deliver its broad product portfolio and solutions through channel partners (VARs, distributors, system integrators and carriers acting as channels). Through our partners, Huawei is addressing the broader enterprise market, while Huawei and its channel partners will also focus on vertical segments. Solutions will include campus networks, branch access, IP backbone, data center and video conferencing.
“Huawei is uniquely positioned to combine our growing device expertise and market presence with our traditional telecommunications infrastructure products and services solutions, and with emerging leadership in open-application-based cloud computing and mass storage solutions”, said Karen Yu, President of Huawei’s Enterprise business in the U.S. “Success in our Enterprise business will focus on nurturing an open, interoperable and partner-based ecosystem to ensure long-term and maximum value for our channel partners and their enterprise customers.”
Three key products are the focus of the initial launch of the U.S. enterprise business: next-generation teleconferencing, environmentally-friendly switches and enterprise-class routers.
“As we transition towards the next phase of “consumerization of IT” to deliver enterprise-grade networking and communication solutions, it’s imperative that vendors develop expertise in mobility, cloud, video and unified communications, and other intelligent network infrastructures in addition to traditional data and voice networking capabilities”, said Rohit Mehra, Director, Enterprise Communications Infrastructure, IDC.
Huawei channel partners will reap the benefits of being united with a leading global technology provider who is truly committed to their long-term growth, and who has a strong local platform to support the business.
Please visit us at Booth #517 for more information about Huawei’s Enterprise Business Group, speak with our channel team and see demonstrations of the products that are part of the U.S. launch.
James Whittaker’s Quality Software Crusade from Academia to Microsoft, then Google and now back to Microsoft
March 14, 2012 3:37 pm / 4 Comments on James Whittaker’s Quality Software Crusade from Academia to Microsoft, then Google and now back to Microsoft
Updates: Why I joined Microsoft [published: March 21, 2012; written: March 13, 2012]
– James Whittaker @docjamesw 7:06 AM – 20 Mar 12 via web · Details
A web futurist is someone who hates the web as it is now and envisions a better future for it.
– Why I hate search [MSDN Blogs > JW on Tech , March 15, 2012]
…
We start from scratch each time. We search for things we’ve already found.
The problem with Internet search is that being stupid about it is profitable. The more ugly blue links you serve up, the more time users have to click on ads. Serve up bad results and the user must search again, and this doubles the number of sponsored links you get paid for. Why be part of the solution when being part of the problem pays so damn well? It’s 2012 and we are still typing search queries into a text box. Now you know why: a ‘find engine’ swims in the shallow end of the profit pool. Is it any surprise that technology such as Siri came from a company that doesn’t specialize in search? (Where do you place an ad in a Siri use case?)
There’s no more reason to expect search breakthroughs from Google than there is to expect electric car batteries to be made by Exxon.
We can do better. We’ve been searching for over a decade. We know every place possible where the online equivalent of car keys are found. We know where our online pet is, always. We know so many things about the world that no longer need to be served up as search “results.” (Results indeed! If users ever wake up and divorce their search engine, the “results” page is likely to be exhibit A in the separation hearing.)
Search, my friends, is broken. Finding things has become secondary to monetizing the search process. Fixing this situation is not in the best interest of the incumbents. Which, actually, is all well and good because the fix will need a more web-wide effort anyway. The companies that own the data sources, the companies that ingest, store and conflate that data, the myriad small development shops that do interesting things with the data, the cleverness of the people who curate the data and the power of crowdsourced know-how need to come together and make search … better? No, not better, irrelevant.
Search is dead. The web doesn’t need it and neither do we.
– Now you see it, in 20 years you won’t [MSDN Blogs > JW on Tech , April 12, 2012]
Google’s Marissa Mayer gave a Flintstonian glimpse at what search might look like in 20 years including “predicting what restaurant you might like in a new city” and “connecting you with strangers based on common interests.” Things that take entire seconds today will take … entire seconds in 2032. Thankfully, for Mayer at least, violating Moore’s Law carries no actual criminal or civil penalties.
In a nutshell, what Mayer (and I assume Google) is proposing is that in twenty years Google and the web will still be standing between knowledge and its consumption. Google has 40 billion reasons to be patient regarding the future.
…
You want a prediction of the future? The trend of disappearing search will continue. The web will melt into the background and humans will progressively be removed from their labor intensive and frustrating present by automation. In five years the web is likely to be completely invisible. You will simply express your intent and the knowledge you seek will be yours. Users will be seamlessly routed to apps capable of fulfilling their intent. Apps won’t need to be installed by a user; they will be able to find opportunities to be useful all by themselves, matching their capabilities with a user’s intent. You need driving directions? Travel reservations? Takeout? Tickets to a show? Groceries? Tell your phone, it will spare you the ugly links. It will spare you the landing page. It will spare you the ads. It will simply give you what you asked for. This is already happening today, expect it to accelerate.
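Whittaker’s vision of apps “matching their capabilities with a user’s intent” can be sketched as a simple capability registry that routes an intent straight to an app, with no results page in between (all app names and intent labels here are hypothetical illustrations):

```python
# Hypothetical registry: each "app" declares the intents it can fulfill.
APP_REGISTRY = {
    "maps_app":   {"driving_directions"},
    "travel_app": {"travel_reservations", "tickets"},
    "food_app":   {"takeout", "groceries"},
}

def route_intent(intent):
    """Return the apps whose declared capabilities match the user's intent.

    In the scenario described above, the phone would dispatch directly to the
    matching app instead of rendering a page of links.
    """
    return sorted(app for app, caps in APP_REGISTRY.items() if intent in caps)
```

Calling `route_intent("takeout")` would hand the request to the food app directly – no links, no landing page, no ads – which is the whole point of the argument.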
…
End of Updates
James Whittaker @docjamesw 8:03 PM – 29 Feb 12 via web · Details
I got my team today. Hmm…what shall I do with 300 developers? You won’t have to wait long to find out.
About JW on Tech [MSDN Blogs > JW on Tech > About James Whittaker, March 13, 2012]
James Whittaker is a technology executive with a career that spans academia, start-ups and top tech companies. He is known for being a creative and passionate leader and for his technical contributions in testing, security and developer tools. He’s published dozens of peer-reviewed papers and five books, and has won best speaker awards at a number of international conferences. During his time at Google he led teams working on Chrome, Google Maps and Google+. He is currently at Microsoft reinventing the web [in a Partner Development Manager role as per LinkedIn, and as a web futurist at Microsoft according to his twitter account].
Want to read more? James wrote How to Break Software, How to Break Software Security (with Hugh Thompson), and How to Break Web Software (with Mike Andrews). While at Microsoft, James transformed many of his testing ideas into tools and techniques for developers and testers, and wrote the book Exploratory Software Testing [Sept 4, 2009]. His current book was written when he was a test engineering director at Google and is called How Google Tests Software (with Jason Arbon and Jeff Carollo) [Whittaker’s twitter: getting close to end of printing, otherwise April 8, 2012].
How Google Tests Software [Google Testing Blog, May 26, 2011]:
Part 1 – Part 2 – Part 3 – Interlude – Part 4 – Part 5 – Part 6 – Q&A – Part 7
Large-scale Exploratory Testing: Let’s Take a Tour [SQEVideo, published on Oct 2, 2011]
[James Whittaker’s Google+ post, Feb 13, 2012]
Signing off of Google+
This will be my last post on Google+. Anyone interested in my post-Google career can follow me on Twitter (@docjamesw).
[James Whittaker’s Google+ post, Feb 3, 2012]
There comes a time when all good things must end and my time at Google is one of them. This is not one of those “Google let me down” rants, nor is it a “I love this company, keep up the good work” farewell … just a realization that even as my perf scores and profile within the company have risen, my ability to lead has diminished. It’s time to stop being part of a team changing the world and time to go lead one. Unfortunately, the place to do that is elsewhere. Today is my last day.
Keep in touch with me on Twitter @docjamesw. Or not.
James Whittaker interview [TCLGroupLimited, Dec 3, 2011]
James Whittaker’s testing blog posts while with Google [Google Testing Blog, June 8, 2009 – Nov 15, 2011]
James Whittaker joins Google [Google Testing Blog, June 2, 2009]
By Patrick Copeland
I’m excited to announce that James Whittaker has joined us as our newest Test Director at Google.
James comes to us most recently from Microsoft. He has spent his career focusing on testing, building high quality products, and designing tools and process at the industrial scale. In the not so distant past, he was a professor of computer science at Florida Tech where he taught an entire software testing curriculum and issued computer science degrees with a minor in testing (something we need more schools to do). Following that, he started a consulting practice that spanned 33 countries. Apparently, fashion is not high on his list, as he has collected soccer jerseys from many of these countries and wears those during major tournaments. At Microsoft he wrote a popular blog, and in the near future you can expect him to start contributing here.
He has trained thousands of testers worldwide. He’s also written a set of books in the How to Break Software series. They have won awards and achieved best seller status. His most recent book, on exploratory testing, is coming out this summer. It is not a stretch to say that he is one of the most recognizable names in the industry and has had a deep impact on the field of testing. If you have a chance, strike up a conversation with James about the future of testing. His vision for what we’ll be doing and how our profession will change is interesting, compelling and not just a little bit scary.
Join me in welcoming James to Google!
James Whittaker’s testing blog posts while with Microsoft 1st time [posted between July 8, 2008 and May 21, 2009]
tour of the month: the exit-stage-right tour [MSDN Blogs > JW on Test, May 21, 2009]
All tours must eventually come to an end, and thus it is with my tour with Microsoft. I have resigned my position and am leaving the company. It was a great ride.
But the tours will continue. My book Exploratory Software Testing: Tips, Tricks, Tours and Techniques to Guide Manual Testers is in press and will appear through Addison-Wesley sometime this summer. I am truly thankful for the many wonderful testers at Microsoft who contributed wisdom, thoughts and even case studies to the effort. Special thanks go to Nicole Haugen, Geoff Staneff, David Gorena Elizondo, Shawn Brown and Bola Agbonile. Microsoft is full of great testers and even here, these guys manage to stand out.
I imagine that I will not be long in setting up a new blog as I have very much enjoyed this experience and being the only tester in Developer Division’s top ten bloggers was quite an honor. For that I thank you.
In case you are interested in my landing place, I can imagine that one or two of the more popular testing blogs around town will be talking about it.
Wish me luck …
before we begin [MSDN Blogs > JW on Test, July 8, 2008]
For those of you familiar with my writing I plan to update some of my more dated work (history of testing, testing’s ten commandments, and so forth) and preview some of the information that I will be publishing in paper and book form in the future. Specifically, I now (finally) have enough notes to revise my tutorial on manual exploratory testing: How to Break Software and will be embarking on that effort soon. This blog is where I’ll solicit feedback and report on my progress.
For now, here’s an update on what’s happening, testing-wise, for me at Microsoft:
- I am the Architect for Visual Studio Team System – Test Edition. That’s right, Microsoft is upping the ante in the test tools business and I find myself at the center of it. What can you expect? We’ll be shipping more than just modern replacements for tired old testing tools. We’ll be shipping tools to help testers to test: automated assistance for the manual tester; bug reporting that brings developers and testers together instead of driving them apart; and tools that make testers a far more central player in the software development process. I can’t wait!
- I am the Chair of the Quality and Testing Experts Community at Microsoft. This is an internal community of the most senior testing and quality thought leaders in the company. We kicked off the community with record-breaking attendance (the most of any of Microsoft’s technical network communities) at our inaugural event this past spring where some of our longest-tenured testers shared a retrospective of the history of testing at Microsoft followed by my own predictions for the future of the discipline. It was a lively discussion and underscored the passion for testing that exists at this company. In this quarter’s meeting we’re doing communal deep dives into the testing-related work that is coming out of Microsoft Research. MSR, the division responsible for Virtual Earth and the Worldwide Telescope also builds test tools! I can’t wait to ship some of this stuff!
- I am representing my division (DevDiv) on a joint project with Windows called a Quality Quest. Our quest is concerned with quality, specifically, what we need to do to ensure that our next generation of platforms and services are so reliable that users take quality for granted. Sounds like I took the blue pill, doesn’t it? Well, you won’t find us dancing around acting like our software is perfect. Anyone who has ever heard me speak (either before or after I joined Microsoft) has seen me break our apps with abandon. In this Quest, we’ll leave no stone unturned to get to the bottom of why our systems fail and what processes or technology can serve to correct the situation.
New hire into our group – James Whittaker [MSDN Blogs > Michael Howard’s Web Log, May 5, 2006]
I’m pleased to announce, actually I’m *thrilled* to announce, that James Whittaker has joined our group [SDL – Security Development Lifecycle]. James is a well-known author and speaker on software testing and security. He most recently worked as a professor of computer science at Florida Tech where he ran a huge software security research team. James created the “How to Break…” book series with Addison Wesley. He wrote How to Break Software [May 19, 2002], How to Break Software Security [May 19, 2003] and How to Break Web Software [Feb 12, 2006].
He’s also one of the folks behind the Holodeck testing tool.
He’s a cool guy, sharp as a tack, with a very dry sense of humor, so we should get along just fine! He’ll be a peer of mine, reporting to Steve Lipner [Trustworthy Computing Initiative chief], and is initially focused on our internal security and privacy training.
As I’m sure most of you will agree, hiring good security people takes time, and hiring talent like James is rare indeed.
Welcome, James!
[Michael Howard is now a Principal Cybersecurity Architect at Microsoft, effectively its chief security officer, working with customers and partners. Before that he was a long-time member of the Security Development Lifecycle team, in fact a co-founder of it in 2001; the SDL is also closely related to Microsoft’s now 12-year-old Trustworthy Computing Initiative.]
GTAC 2008 Keynote Address: The Future of Testing by James Whittaker of Microsoft [GoogleTechTalks, published on Apr 7, 2009]
An Interview with James Whittaker [Dr.Dobb’s Journal, Sept 26, 2006]
Michael Hunter interviews James Whittaker, noted testing guru and author, to shed some light on his testing philosophy.
James Whittaker is, I dare say, one of the celebrities of the testing world. He was long a professor of computer science at the Florida Institute of Technology, where he became well-known for his efforts to find ways to make testing a teachable skill. He and his research group there created innovative testing technologies and tools, including the popular runtime fault injection tool Holodeck, and became highly skilled at breaking software security. James founded Security Innovation to productize his work, but recently he has left both that company and teaching to join Microsoft as a Security Architect, where he is working to integrate testing into the Security Development Lifecycle (SDL).
James wrote How To Break Software – one of my favorite books on testing, co-wrote How To Break Software Security (also very good) with Hugh Thompson, and co-wrote How To Break Web Software (haven’t read it yet) with Mike Andrews. James’ talks at Microsoft are always standing room only; this interview will give you a taste of why.
DDJ: What was your first introduction to testing? What did that leave you thinking about the act and/or concept of testing?
JW: I was in graduate school in a software engineering group studying high assurance software engineering methodologies (cleanroom to be specific) and the bloody dev group met at 7:30 on Saturday mornings! I missed the first three meetings (dude, in grad school the nerd act doesn’t happen that early on a weekend) so the professor put me in charge of the independent test team (which I discovered was just me). So that left me with the idea that testers get more sleep than devs but that we need it because we are woefully outnumbered.
And that perception remains, sans the sleep part.
DDJ: What has most surprised you as you have learned about testing/in your experiences with’ testing?
JW: The sheer number of people *passionate* about testing, particularly at Microsoft. It gives me a great deal of confidence in the future knowing that such skill and talent is being applied to the hardest problem the discipline has to offer, which is quality.
DDJ: What is the most interesting bug you have seen?
JW: The most interesting bug is always the latest bug. Just today everyone in our group was surprised at an Inbox with thousands of recall status messages. Someone sent a mail from an alias of 1275 members, then recalled it. The recall then sent success/failure notices to EVERYONE on the alias. That’s 1275 x 1275 (about 1.6 million) emails! How’s that for exploiting a design flaw!
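The arithmetic behind that anecdote is easy to check: each of the alias’s members generated a recall status, and each status went to every member.

```python
members = 1275                  # size of the mail alias in the anecdote
notices = members * members     # every member's recall status goes to every member
print(notices)                  # 1625625, the "about 1.6 million" emails
```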
DDJ: How would you describe your testing philosophy?
JW: Eyes open, brain on, test! Or the longer explanation covered in How to Break Software. Thanks for the chance to plug one of my books!
DDJ: What do you see as the biggest challenge for testers/the test discipline for the next five years?
JW: There are a number of trends that testers are going to have to grapple with. The first is that software is getting better. The result of this is that bugs are going to become harder and harder to find and the weaker testers will be relegated to Darwinian insignificance. Keeping sharp, building skills and maintaining a cutting edge testing knowledge has never been more important.
The second is that software process is finally taking over. For years processes haven’t much affected the way software is built (which doesn’t say much for legacy processes). But here at Microsoft the SDL is revolutionizing the way software is constructed. Testers have to figure out their role in this process. We have to be there, working, at project initiation and play a key role in every single phase of the lifecycle. Testing is not a task for the latter stages of the ship cycle. Testers who realize this and customize their work accordingly will rise in prominence within their product group and be able to influence the growth of the SDL rather than be steamrolled by it.
[See my Table Of Contents post for more details about this interview series.]
“if Microsoft is so good at testing, why does your software suck?” [MSDN Blogs > JW on Test, Aug 11, 2008]
What a question! I only wish I could convey the way that question is normally asked. The tone of voice is either partially apologetic (because many people remember that I was a major ask-er of that same question long before I became an ask-ee) or it’s condescending to the point that I find myself smiling as I fantasize about the ask-er’s computer blue-screening right before that crucial save. (Ok, so I took an extra hit of the kool-aid today. It was lime and I like lime.)
After 27 months on the inside I have a few insights. The first few are, I readily concede, downright defensive. But as I’ve come to experience firsthand, true nonetheless. The last one though is really at the heart of the matter: that, talent notwithstanding, testers at Microsoft do have some work to do.
I’m not going down the obvious path: that testing isn’t responsible for quality and to direct the question to a developer/designer/architect instead. (I hate the phrase ‘you can’t test quality in,’ it’s a deflection of blame and as a tester, I take quality directly as my responsibility.)
But I am getting ahead of myself. I’ll take up that baton at the end of this post. Let’s begin with the defensive points:
- Microsoft builds applications that are among the world’s most complex. No one is going to argue that Windows, SQL Server, Exchange and so forth aren’t complex and the fact that they are in such widespread use means that our biggest competitors are often our own prior versions. We end up doing what we call “brown field” development (as opposed to ‘green field’ or version 1 development) in that we are building on top of existing functionality. That means that testers have to deal with existing features, formats, protocols along with all the new functionality and integration scenarios that make it very difficult to build a big picture test plan that is actually do-able. Testing real end-to-end scenarios must share the stage with integration and compatibility tests. Legacy sucks and functionality is only part of it…as testers, we all know what is really making that field brown! Be careful where you step. Dealing with yesterday’s bugs keeps part of our attention away from today’s bugs.
(Aside: Have you heard that old CS creationist joke: “why did it take god only seven days to create the universe?” The answer: “No installed base.” There’s nothing to screw up, no existing users to piss off or prior functionality and crappy design decisions to tiptoe around. God got lucky, us…not so much.)
- Our user-to-tester ratio sucks, leaving us hopelessly outnumbered. How many testers does it take to run the same number of test cases that the user base of, say, Microsoft Word can run in the first hour after it is released? The answer: far more than we have or could hire even if we could find enough qualified applicants. There are enough users to virtually ensure that every feature gets used in every way imaginable within the first hour (day, week, fortnight, month, pick any timescale you want and it’s still scary) after release. This is a lot of stress to put our testers under. It’s one thing to know you are testing software that is important. It’s quite another to know that your failure to do so well will be mercilessly exposed soon after release. Testing our software is hard, only the brave need apply.
- On a related point, our installed base makes us a target. Our bugs affect so many people that they are newsworthy. There are a lot of people watching for us to fail. If David Beckham wears plaid with stripes to fetch his morning paper, it’s scandalous; if I wore my underpants on the outside of my jeans for a week few people would even notice (in their defense though, my fashion sense is obtuse enough that they could be readily forgiven for overlooking it). Becks is a successful man, but when it comes to the ‘bad with the good’ I’m betting he’s liking the good a whole lot more. You’re in good company David.
But none of that matters. We’ll take our installed base and our market position any day. No trades offered. But still, we are always ready to improve. I think testers should step up and do a better job of testing quality in. That’s my fourth point.
- Our testers don’t play a strong enough role in the design of our apps. We have this “problem” at Microsoft that we have a whole lot of wicked smart people. We have these creatures called Technical Fellows and Distinguished Engineers who have really big brains and use them to dream really big dreams. Then they take these big dreams of theirs and convince General Managers and VPs (in addition to being smart they are also articulate and passionate) that they should build this thing they dreamt about. Then another group of wicked smart people called Program Managers start designing the hell out of these dreams and Developers start developing the hell out of them and a few dozen geniuses later this thing has a life of its own and then someone asks ‘how are we going to test this thing’ and of course it’s A LITTLE LATE TO BE ASKING THAT QUESTION NOW ISN’T IT?
Smart people who dream big inspire me. Smart people who don’t understand testing and dream big scare the hell out of me. We need to do a better job of getting the word out. There’s another group of wicked smart people at Microsoft and we’re getting involved a wee bit late in the process. We’ve got things to say and contributions to make, not to mention posteriors to save. There’s a part of our job we aren’t doing as well as we should: pushing testing forward into the design and development process and educating the rest of the company on what quality means and how it is attained.
We can test quality in; we just have to start testing a lot sooner. That means that everyone from TF/DE through the entire pipeline needs to have test as part of their job. We have to show them how to do that. We have to educate these smart people about what quality means and take what we know about testing and apply it not only to just binaries/assemblies, but to designs, user stories, specs and every other artifact we generate. How can it be the case that what we know about quality doesn’t apply to these early stage artifacts? It does apply. We need to lead the way in applying it.
I think that ask-ers of the good-tester/crappy-software question would be surprised to learn exactly how we are doing this right now. Fortunately, you’ll get a chance because Tara Roth, one of the Directors of Test for Office is speaking at STAR West in November. Office has led the way in pushing testing forward and she’s enjoyed a spot as a leader of that effort. I think you’ll enjoy hearing what she has to say.
Test Talk with James Whittaker [Oct 3, 2011]
James Whittaker has been in software testing for as long as he can remember. During his studies he wrote his graduation paper on Model Based Testing. He made his name at Microsoft and recently he ‘joined the enemy’ by going over to Google. His “How to Break Software” books are bestsellers, and the presentations in which he hacks websites live in front of an audience are fantastic. His latest book, about Exploratory Software Testing, was released last year. I had the opportunity to ask him the questions below.
1. Can you introduce yourself and explain how you became a tester already during college at the University of Tennessee-Knoxville, judging by the title of your dissertation?
My name is James Whittaker and I am a Director of Engineering at Google. I own Test for a bunch of Google products including the Chrome browser, the Chrome operating system, Google Toolbar as well as some Search and Geo products and a bunch of back-end data center infrastructure applications. I also own Development of engineering tools, including both developer and testing tools.

I got into test when I was a grad student. Mostly it was by default, as my software engineering research team met on Saturday mornings and I had better things to do early Saturday than spend it with a bunch of coding nerds. My professor gave me two choices: get fired or be a tester. Neither he nor I knew what a favor he did for me at the time. I really hit the ground running and did a lot of innovative work in model-based testing. In fact, I got my PhD two years before any of those developers. Testing was a great career move for me even back then.
2. How hard was it to change from a Microsoft employee towards a Google employee, and is testing very different at these two companies?
Not hard at all. Microsoft was great preparation for Google. Culturally they are polar opposites, with Microsoft being more top down whereas Google is more engineering driven. The high ratio of testers to developers at Microsoft is a case in point. There, the staffing of a project is decided by execs and managers. At Google it is decided by engineers, and no engineer at Google believes a 1:1 ratio is necessary or even healthy. Fewer testers on a project means more involvement by developers in QA. I had a meeting yesterday with the development director for Chrome OS and the entire subject was what they could do to make our job easier. The director was genuinely concerned about whether his developers were engaging deeply enough on testing issues. A culture like that makes the 1:1 ratio irrelevant … everyone on the project is a tester.

3. I read somewhere that you are busy at Google with forging a future in which software just works. Is that possible, a world without software bugs?
Not in my lifetime. However, we are getting closer. Even a few years ago I had to pull the battery on my smartphone every week or so. I’ve not even turned my current one off for 3 months and it works fine. Quality assurance for software is much like health care for humans. Humans will always get sick, but with good prevention, good hygiene and regular maintenance our bodies do ok. We need to make testing like this: continuous and ongoing. One of the things that annoys me is the whole “push quality upstream” movement. Some people seem to believe that we can rig it so we just write perfect code. That’s like taking all your vitamins when you are a baby and then expecting a long healthy life. Obviously upfront debugging is good, but quality is an ongoing endeavor. It starts at the beginning and is a constant activity throughout the life of a product.

4. Patrick Copeland said that your vision of the future of testing is interesting, compelling and not just a little bit scary. Can you briefly tell us your vision, so we don’t get scared?
I am happy that it scares people and honored that it scares smart people like Patrick, who thinks deeply on these matters. Too many people are dogmatic about testing. Some say “avoid rigor and do only exploratory testing” and they say it with a fervor that reminds me of religious fundamentalists who see only black and white. Others say the same of automation, with the same amount of self-righteousness. One thing I do know is that when you think your world view is the only view, there is a problem. People like this have stopped thinking about alternatives. They’ve stopped being open minded. They’ve definitely stopped being right.

I am also not going to stand in the middle and start every answer with ‘it depends.’ It turns out that there are some absolutes. There are some testing problems that can be driven to extinction with automation. There are some problems where exploratory testing is exactly the opposite of a good idea. I think it is smart to be problem-oriented and not solution-oriented. The latter is the proverbial hammer solution where every problem looks like a nail because you sell hammers for a living. I’ve laid out my full vision for software testing in my latest book, but let me just say here that the part people find scary is that my vision requires far fewer testers than the world currently employs.
5. Your last book is about Exploratory Testing. Can you explain how taking the supermodel tour will improve our testing skills?
All the tours focus a tester’s attention. The idea is to test on purpose. Exploratory testing does not have to lack rigor and it does not have to be endless wandering in the hope that you find a bug. It should also be about finding important bugs. I find myself endlessly annoyed by speakers who show bugs that no one would care about. Any exploratory method can find easy bugs; what about the hard ones?

Many of the tours focus on a general class of bug. The Supermodel Tour, as a specific case, focuses on presentation layer bugs. It asks you to first identify important properties of the UI, then choose paths that force those properties to change and then be displayed on the UI. We called it the Supermodel Tour to get the idea across that we are looking only skin-deep for bugs (only at the UI level). The tour gives both general guidance in terms of focusing on displayable properties and specific guidance about what part of the application should be visited during an exploratory session (the functions that allow you to change and then display those values). So you see that it requires some pre-work and planning but then allows for exploration once that planning is done. For example, in Maps we run the Supermodel Tour on our classification of landmarks. We make a list of all the landmarks (national parks, places of interest and so forth) in advance and explore the UI to find each location. We (actually Brendan Dhein) found a bug where Arlington National Cemetery was classified as a restaurant! It’s a subtle bug if you are just exploring, but if you are running the Supermodel Tour it jumps out at you. The idea is that a good tester can become a great tester with the right focus and by testing on purpose.
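The landmark example can be sketched in code: plan the list of displayable properties up front, then check only what the UI actually shows. This is a minimal illustration of the idea, not Google’s test harness; the names `Landmark` and `render_info_card` are hypothetical, and the seeded misclassification stands in for the real Maps bug described above.

```python
from dataclasses import dataclass

@dataclass
class Landmark:
    name: str
    expected_category: str  # the displayable property under test

def render_info_card(landmark: Landmark) -> dict:
    """Stand-in for driving the UI to a landmark and reading back
    what the presentation layer actually displays."""
    classification = {
        "Yellowstone": "national park",
        "Arlington National Cemetery": "restaurant",  # seeded bug
    }
    return {"title": landmark.name,
            "category": classification[landmark.name]}

def supermodel_tour(landmarks):
    """Pre-work done (the planned landmark list); now explore,
    checking only the skin-deep, displayed property."""
    failures = []
    for lm in landmarks:
        shown = render_info_card(lm)["category"]
        if shown != lm.expected_category:
            failures.append((lm.name, lm.expected_category, shown))
    return failures

tour = [Landmark("Yellowstone", "national park"),
        Landmark("Arlington National Cemetery", "cemetery")]
print(supermodel_tour(tour))
# flags Arlington National Cemetery as misclassified
```

The point of the structure is that the wandering is bounded: the tester still explores the UI freely, but every stop on the tour has a concrete displayable property to verify.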
6. What will your next book be about?
I’m writing a book called How Google Tests Chrome, which details our testing process start to finish on Chrome OS. It’s a totally open kimono assessment of everything that we are doing. Right, wrong, false starts, great ideas, cool innovation, new tools and every test artifact we generate, from plans to test cases to open source automation. I am psyched about this as I don’t believe anyone has ever fully documented and published a complete project before, particularly one of the complexity of an operating system. I plan to include the browser too, but as I have only written the first chapter on the test plan I do not want to overcommit!
7. How is testing managed at Google? From one place or per country or per application or … ?
It’s divided by product lines, or what we call “Focus Areas.” In my case I own the Client Focus Area. However, since I am in a remote site I also have authority over all the work that goes on in Kirkland and Seattle, Washington. I’m a busy boy but I like it that way. A single product would bore me.

It’s funny that on the Dev side each product has a Director in charge of it, whereas I am the Director over many products. I have Test Managers over each product who have to interact with a Director on the dev side. So if you match us up one to one, you might have a Test Manager matching wits on a daily basis with a Development Director three levels above them in rank. Talk about character building; this is the place for that. Google test managers are a breed apart. Cream of the crop.
8. Are you still collecting soccer jerseys? And if so, is there one you really want to add to your collection?
Yes, and I cannot wait to wear them during the World Cup. By tradition I never wear anything else during that tournament (sorry for the visual). When people invite me to speak I often get them as gifts and I have dozens. I am hoping I get a Swiss one this trip (hint, hint) and I lost my Australian one (please don’t ask) and am looking for a replacement. But I have a lot of club jerseys too and will relish the chance to wear different colors. Send me a jersey and I’ll send you some signed books!

9. I hear you will be giving a keynote at the Swiss Testing Day. Can you give us a sneak preview of what it will be about?
“Testing On Purpose” is the title. I am talking in far more depth about how we are testing Chrome at Google. I hope to see you there.
All that testing is getting in the way of quality by James Whittaker, Part One [TCLGroupLimited, Dec 8, 2011]
All that testing is getting in the way of quality by James Whittaker, Part Two [TCLGroupLimited, Dec 8, 2011]
A Brave New World of Testing? An Interview with Google’s James Whittaker by Forrest Shull [IEEE Software, March/April 2012 pp. 4-7]
… In their introduction, the guest editors have compiled a list of questions related to what our future, cloud-intensive world is going to look like—many of which I’ve heard myself from colleagues in government and commercial positions. The one that I hear most often is this: How should organizations leverage the power of this approach to improve testing and quality assurance of software? To get an answer, I turned to James Whittaker, an engineering director at Google, which has been at the forefront of leveraging the cloud. James is a noted expert and author on software testing, whose team has been managing Google’s cloud computing testing. Some excerpts of our conversation:
What is it like right now, looking across cloud computing testing at Google? It sounds like a pretty major undertaking.
…
In one of your previous interviews, I came across a statement of yours that has become one of my favorite thought-provoking quotes. You said, “Anyone who says that testing is getting harder is doing it wrong.” Could you expand on this a bit?
…
In the cloud, all the machines automatically work together; there’s monitoring software available, and one test case will run anywhere. There’s not even a test lab. There’s just a section of the datacenter that works for you from a testing point of view. You put a test case there and it runs. And all of the different scheduling software that any datacenter uses to schedule tasks can be used to schedule tests. So, a lot of the stuff that we used to have to write and customize for our test labs, we just don’t need anymore.
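The claim above is that once a test is just another schedulable task, the datacenter’s generic scheduler can run it and bespoke test-lab code disappears. A toy sketch of that idea, assuming nothing about Google’s actual infrastructure (the scheduler and test names here are purely illustrative):

```python
from concurrent.futures import ThreadPoolExecutor

def schedule(tasks, workers=4):
    """A generic task scheduler: it knows nothing about testing,
    it just runs callables, the way datacenter schedulers run jobs."""
    with ThreadPoolExecutor(max_workers=workers) as pool:
        return list(pool.map(lambda task: task(), tasks))

# Test cases are ordinary callables returning pass/fail,
# indistinguishable from any other scheduled task.
def test_addition():
    return 1 + 1 == 2

def test_upper():
    return "ok".upper() == "OK"

results = schedule([test_addition, test_upper])
print(results)  # [True, True]
```

The design point is that nothing in `schedule` is test-specific; the same machinery that runs production tasks runs the test cases, which is exactly why the custom test-lab software becomes unnecessary.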
…
The other thing the cloud has done is brought us closer to our users. Think of Google Maps: it’s really impossible to hire a group of testers to exhaustively test it. It’s literally a piece of software of planetary proportions. If there’s a bug in my address on Google Maps, I’m likely to be the only one who will find it. But the cloud also enables us to reach out to users who are early adopters to get better and richer bug feedback than we were ever able to do back in the client-server days, when once software got to the field it was very difficult to update and instrument. Now, it’s easy to update a datacenter, it’s easy to instrument a datacenter. If a customer finds a bug, it’s easy for them to tell us about it, and it’s easy for us to fix it and push that fix to all our users, by just refreshing a browser.
So the cloud really does change things. It’s a different model of development; it’s a different model of testing; it’s a different model of usage.
Regarding testers and the skill sets that they’ve traditionally been applying on the job, does the same skill set still apply? Or are people being asked to develop new skills to take advantage of all these cloud features?
…
So, if I can paraphrase what you’ve been saying, the cloud is changing the whole underlying economics of software development and software testing. It’s easier and quicker for a company to try something, push it out to users, hear from the users what the problems are, and fix them, than it is to follow the traditional path of getting the requirements right up front, then getting the architecture right and nailed down, then getting the coding done well ….
Absolutely. By the time you do all that stuff, you’re too late. Your competitor’s beaten you to the market. On the cloud, you can really release and iterate—that’s much more the development model of modern times.
But you have to be careful: Google’s not pushing software out to its users saying, “Hey, is this any good? We’re not sure!” There are a lot of intermediate steps. We have an internal process we call dogfooding, as in, if you’re trying to sell dog food, you should eat your own product first to make sure it’s okay. All our software is used internally first by Googlers before we push it out to the world. If you look at something like Google+, which we released last year, we used that internally among Googlers for many months before we released it. In that process of dogfooding Google+, we found far more bugs and far richer bugs than the test team associated with Google+ did.
The points you’re making, about having representative users from the beginning who are able to use the product and help mature it, represents a much bigger paradigm shift than I had originally realized.
To me, that is just one of the most crucial things that companies absolutely have to get good at. In the past, if you found a bug in, say, your browser, you didn’t know how to report it. You’d have to find some bug-reporting page on the vendor’s site, and it would ask you what operating system you were using and what version of the browser you were using, and what other plug-ins you had installed…. But the machine knows all that stuff! So the idea is that once you crash, or once a user finds a bug, you just grab that machine state and send it back to the vendor so that they can understand the state the user was in exactly.
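The “grab that machine state” idea can be sketched briefly: instead of asking the user what OS and version they run, collect it programmatically and attach it to the bug report. The field names below are illustrative, not any vendor’s actual crash-report schema:

```python
import json
import platform
import sys

def capture_machine_state(error: Exception) -> str:
    """Build a bug-report payload from state the machine already
    knows, so the user never has to fill in a form."""
    report = {
        "error": repr(error),
        "os": platform.system(),
        "os_version": platform.release(),
        "python": sys.version.split()[0],
        # a real client would add app version, plug-ins, locale, ...
    }
    return json.dumps(report)

try:
    1 / 0  # stand-in for the bug the user just hit
except ZeroDivisionError as exc:
    payload = capture_machine_state(exc)
    print(payload)  # ready to send back to the vendor
```

In a cloud-delivered product this payload goes straight back to the datacenter, which is what makes the feedback loop described above so much richer than a manually filled bug form.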
This seems like a very concrete model to use for functional testing. But does the same paradigm work if I’m worried about things like reliability, performance, or throughput?
Or better yet, security, privacy, and so on. I agree with you completely. I think the idea of paying top dollar for engineers to do functional testing really is an artifact of the 1990s and 2000s, and shouldn’t be something that companies invest in heavily in the future. But things like security, privacy, and performance are very technical in nature. You don’t do security testing without understanding a lot about protocols, machine states, or how the Web works; a lot of a priori knowledge is required. You can’t replace that. So when I give advice to functional testers who say that I’m predicting the end of their job, specialization is one of the things I recommend. Specialization is crucially important.
…
How does the simplistic testing model that we all learned in school—where you go first through unit testing, then integration testing, then system testing—adapt to the new paradigm?
We do integration testing, but we call it something different. People always say that Google just likes to change the names of things, but we did this one on purpose. We don’t have to integrate from environment to environment, but we do have to integrate across developers. So developer A writes one module, developer B writes another module; to us, integration testing hits both developer A’s and developer B’s modules together. There are also kinds of testing that you simply do not have to run on the cloud: any sort of configuration test, and any sort of load testing, just isn’t necessary in this new modern environment. Load is taken care of for you; if it slows down, new cells in the datacenter are spun off automatically.
When you hire new testers for your teams at Google, is there something in particular that you’re looking for? You mentioned specialization as being important, but is there anything else that makes a good cloud tester versus just a good tester?
…
For folks who are trying to move legacy systems onto the cloud, does their development and testing process look a lot different from what they’d use when trying to do something more greenfield?
…
Where are things going in the future? Will abstractions allow developers and testers to worry about even fewer issues over time, or will there be new things that we do need to worry about as more and more people go on the cloud?
There are definitely some new things that we’ll need to worry about. First and foremost, connecting to customers is going to be really important. As much as we have the server side of it down (instead of having a massively complex server, we just have this cloud that takes care of itself), there’s still a lot of variation on the device/user side. If you look at the number of Android devices that are out there, and the number of operating systems and apps that people have configured onto them, that is still a hard testing problem.
The cloud actually makes that easier, too. Crowdsourcing companies are now connecting certain specific people with specific devices to people who are writing apps on those devices. So the idea of leveraging the crowd through the cloud is definitely something that hasn’t been done before, and is a new phenomenon that we’re watching really carefully here.
One thing is for sure, we’re never going to settle on a single platform. Humankind doesn’t seem to be capable of doing that, and I don’t think it would be a good thing to eliminate competition among platforms. The Linux/Windows competition has always been healthy, and the same thing is happening in the mobile space now. So we’re always going to have to develop for multiple platforms, and those platform owners are going to want to innovate as quickly as they can and they’re not always going to be checking with you or each other on those innovations, so the developers are just going to have to be on their toes.
Learn More
My conversation with James touched on many more issues than I could note here. If you’re interested in hearing more of the conversation we had, which ranged over additional issues such as cloud testing tools and handling privacy and robustness, then check out our half-hour audio interview at http://doi.ieeecomputersociety.org/10.1109/MS.2012.23.
More than anything else, my conversation with James made me aware again of the significant changes to the way we do business that accompany the cloud, and the new skills that are becoming important. Perhaps the best summary was James’ comments that “People really need to take the cloud seriously and rethink testing from the ground up. There are a lot of sacred cows in testing that just go away with the transition to the cloud. Keeping an open mind and taking advantage of the efficiencies of the cloud are going to be really important.” I certainly hope the remainder of this special issue on cloud computing will help give you useful food for thought in doing so.
Why I left Google [MSDN Blogs > JW on Tech, March 13, 2012]
Ok, I relent. Everyone wants to know why I left and answering individually isn’t scaling, so here it is, laid out in its long form. Read a little (I get to the punch line in the 3rd paragraph) or read it all. But a warning in advance: there is no drama here, no tell-all, no former colleagues bashed and nothing more than you couldn’t already surmise from what’s happening in the press these days surrounding Google and its attitudes toward user privacy and software developers. This is simply a more personal telling.
It wasn’t an easy decision to leave Google. During my time there I became fairly passionate about the company. I keynoted four Google Developer Day events, two Google Test Automation Conferences and was a prolific contributor to the Google testing blog. Recruiters often asked me to help sell high priority candidates on the company. No one had to ask me twice to promote Google and no one was more surprised than me when I could no longer do so. In fact, my last three months working for Google was a whirlwind of desperation, trying in vain to get my passion back.
The Google I was passionate about was a technology company that empowered its employees to innovate. The Google I left was an advertising company with a single corporate-mandated focus.
Technically I suppose Google has always been an advertising company, but for the better part of the last three years, it didn’t feel like one. Google was an ad company only in the sense that a good TV show is an ad company: having great content attracts advertisers.
Under Eric Schmidt ads were always in the background. Google was run like an innovation factory, empowering employees to be entrepreneurial through founder’s awards, peer bonuses and 20% time. Our advertising revenue gave us the headroom to think, innovate and create. Forums like App Engine, Google Labs and open source served as staging grounds for our inventions. The fact that all this was paid for by a cash machine stuffed full of advertising loot was lost on most of us. Maybe the engineers who actually worked on ads felt it, but the rest of us were convinced that Google was a technology company first and foremost; a company that hired smart people and placed a big bet on their ability to innovate.
From this innovation machine came strategically important products like Gmail and Chrome, products that were the result of entrepreneurship at the lowest levels of the company. Of course, such runaway innovative spirit creates some duds, and Google has had their share of those, but Google has always known how to fail fast and learn from it.
In such an environment you don’t have to be part of some executive’s inner circle to succeed. You don’t have to get lucky and land on a sexy project to have a great career. Anyone with ideas or the skills to contribute could get involved. I had any number of opportunities to leave Google during this period, but it was hard to imagine a better place to work.
But that was then, as the saying goes, and this is now.
It turns out that there was one place where the Google innovation machine faltered and that one place mattered a lot: competing with Facebook. Informal efforts produced a couple of antisocial dogs in Wave and Buzz. Orkut never caught on outside Brazil. Like the proverbial hare confident enough in its lead to risk a brief nap, Google awoke from its social dreaming to find its front runner status in ads threatened.
Google could still put ads in front of more people than Facebook, but Facebook knows so much more about those people. Advertisers and publishers cherish this kind of personal information, so much so that they are willing to put the Facebook brand before their own. Exhibit A: http://www.facebook.com/nike, a company with the power and clout of Nike putting their own brand after Facebook’s? No company has ever done that for Google and Google took it personally.
Larry Page himself assumed command to right this wrong. Social became state-owned, a corporate mandate called Google+. It was an ominous name, invoking the feeling that Google alone wasn’t enough. Search had to be social. Android had to be social. YouTube, once joyous in their independence, had to be … well, you get the point. Even worse was that innovation had to be social. Ideas that failed to put Google+ at the center of the universe were a distraction.
Suddenly, 20% meant half-assed. Google Labs was shut down. App Engine fees were raised. APIs that had been free for years were deprecated or provided for a fee. As the trappings of entrepreneurship were dismantled, derisive talk of the “old Google” and its feeble attempts at competing with Facebook surfaced to justify a “new Google” that promised “more wood behind fewer arrows.”
The days of old Google hiring smart people and empowering them to invent the future was gone. The new Google knew beyond doubt what the future should look like. Employees had gotten it wrong and corporate intervention would set it right again.
Officially, Google declared that “sharing is broken on the web” and nothing but the full force of our collective minds around Google+ could fix it. You have to admire a company willing to sacrifice sacred cows and rally its talent behind a threat to its business. Had Google been right, the effort would have been heroic and clearly many of us wanted to be part of that outcome. I bought into it. I worked on Google+ as a development director and shipped a bunch of code. But the world never changed; sharing never changed. It’s arguable that we made Facebook better, but all I had to show for it was higher review scores.
As it turned out, sharing was not broken. Sharing was working fine and dandy, Google just wasn’t part of it. People were sharing all around us and seemed quite happy. A user exodus from Facebook never materialized. I couldn’t even get my own teenage daughter to look at Google+ twice. “Social isn’t a product,” she told me after I gave her a demo, “social is people, and the people are on Facebook.” Google was the rich kid who, after having discovered he wasn’t invited to the party, built his own party in retaliation. The fact that no one came to Google’s party became the elephant in the room.
Google+ and me, we were simply never meant to be. Truth is I’ve never been much on advertising. I don’t click on ads. When Gmail displays ads based on things I type into my email message it creeps me out. I don’t want my search results to contain the rants of Google+ posters (or Facebook’s or Twitter’s for that matter). When I search for “London pub walks” I want better than the sponsored suggestion to “Buy a London pub walk at Wal-Mart.”
The old Google made a fortune on ads because they had good content. It was like TV used to be: make the best show and you get the most ad revenue from commercials. The new Google seems more focused on the commercials themselves.
Perhaps Google is right. Perhaps the future lies in learning as much about people’s personal lives as possible. Perhaps Google is a better judge of when I should call my mom and that my life would be better if I shopped that Nordstrom sale. Perhaps if they nag me enough about all that open time on my calendar I’ll work out more often. Perhaps if they offer an ad for a divorce lawyer because I am writing an email about my 14 year old son breaking up with his girlfriend I’ll appreciate that ad enough to end my own marriage. Or perhaps I’ll figure all this stuff out on my own.
The old Google was a great place to work. The new one?
The future of Windows Embedded: from standalone devices to intelligent systems
March 9, 2012 5:46 pm / 6 Comments on The future of Windows Embedded: from standalone devices to intelligent systems
Updates: Kevin Dallas on intelligent systems [WindowsEmbedded YouTube Channel, March 28, 2012]
http://www.microsoft.com/presspass/presskits/embedded
– Feature: Intelligent Systems in Motion as Businesses Share Their Success Stories [Microsoft, March 28, 2012]
– Feature: Hillcrest Realizes Efficiency and Improved Data Access With Intelligent Systems [Microsoft, March 28, 2012]
– Talking Intelligent Systems with Ford [Microsoft, March 29, 2012]
Next at Microsoft Blog: Microsoft’s Steve Clayton talks with Jim Buczkowski from Ford Motor Company about the car as a consumer electronics device.
– Ford Rolls Out New 2013 Models, Featuring Improved MyFord Touch [Microsoft, March 29, 2012]
Test drive event provides first glimpse at new models and faster and easier-to-use infotainment system powered by Microsoft.
End of updates
Virtual footwear wall exhibited at NRF Expo 2012.
This had been deployed at Adidas Oxford Street in London. In two weeks, the London store sold as many shoes as three other stores did in six weeks.
Windows Embedded Standard 8 Community Technology Preview [Microsoft microsite, March 1, 2012]
Windows Embedded Standard 8 – Key Benefits [from the datasheet, Feb 29, 2012]
- Create modern, fresh user experiences with Metro style applications and touch interactions from Windows 8.
- Compete more efficiently by taking advantage of the scale provided by the Windows 8 hardware ecosystem.
- Meet hardware bill-of-materials targets. Select the software components you need to create custom OS images for your devices.
- Use the enhanced security and lockdown features to protect your device, data, and network.
- Deliver extended mobility with new power management and wireless technologies in Windows 8 that keep your devices connected, no matter what they’re doing.
- Deliver stunning Web experiences with HTML5 support in Internet Explorer 10.
- The connectivity options you want, including Wi-Fi, Bluetooth, mobile and USB 3.
- Near-Field Communication support enables an entirely new, easy way to exchange data with users. The intuitive interface allows users to easily interact with tap-to-pay vending machines or download manuals from a service kiosk.
Building Intelligent Systems With Windows [Microsoft press release, March 6, 2012]
Windows Embedded Standard 8 community technology preview is the next step for enterprises and OEMs to harness the power of Windows for specialized devices running line-of-business applications
Over the past year, Microsoft has been discussing the vast possibilities offered by the emergence of a new category within the traditional embedded market —intelligent systems.
A critical component of Microsoft’s enterprise strategy, intelligent systems enable an unprecedented flow of data with the power to transform industries such as retail, manufacturing and medicine, by connecting devices where data is generated through employees and customers to back-end systems and services where it is translated into strategic insight to inform business decisions.
Last November, Microsoft outlined a product road map for its intelligent systems vision. As part of that plan, today Microsoft is making the Windows Embedded Standard 8 community technology preview (CTP) available on the x86 architecture.
With Windows Embedded, developers can use the same trusted tools used in building applications for Windows 8 to build specialized devices within line-of-business applications — extending the power of Windows 8 and the cloud to intelligent systems. Using the Windows Embedded platform, retailers can build smart digital signage and intelligent kiosks. Manufacturers can connect shop floor devices to back-end IT. In medicine, equipment for ultrasounds, x-rays and MRIs can deliver results directly to doctors at the bedside.
The Windows Embedded Standard 8 CTP is an important step in giving developers and enterprises an opportunity to evaluate Windows 8 technologies for connecting specialized devices to powerful back-end software. With the recent release of the Windows 8 Consumer Preview and Windows Server 8 beta, along with this week’s launch of SQL Server 2012 and Windows Embedded Standard 8 CTP, Microsoft is offering a good look at a new, common set of technologies that spans specialized devices, servers, PCs and applications.
About Windows Embedded Standard 8
Windows Embedded Standard 8 is a componentized version of Windows 8 that includes features designed to meet the needs of specialized devices within intelligent systems, such as these:
- Advanced device lockdown through flexible keyboard filters and a new unified write filter that combines previous write filters into a single, streamlined solution for better management and enhanced uptime.
- Support for customized experiences and branding from boot to shut down, so apps can take advantage of Windows with a customized look and feel from the first press of the power button.
- Componentization of the OS to scale the system up or down depending on the device, including a set of modules that have been preconfigured and tested to enable device-specific scenarios.
- IT pro management tools, including the Universal Configuration Tool (UCT) that provides for local and remote management, integrated into the security infrastructure.
- OS development tools, including the new Module Designer that allows developers to easily integrate third-party software into the OS, including the ability to copy files, execute commands, install drivers and modify the registry.
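As a rough illustration of the componentization idea in the list above, the sketch below models composing a device image from a shared base plus a device-specific set of modules. All module and template names here are hypothetical illustrations, not the actual Windows Embedded Standard 8 catalog:

```python
# Toy model of OS componentization: a device image is a shared base set
# of modules plus a template for the target device type. Module names
# are hypothetical, chosen only to mirror the scenarios in the article.

BASE_MODULES = {"kernel", "networking", "unified-write-filter"}

DEVICE_TEMPLATES = {
    "kiosk": {"touch-ui", "browser", "keyboard-filter"},
    "digital-signage": {"media-playback", "remote-management"},
}

def build_image(device_type: str) -> set:
    """Return the module set for a device image: base plus template modules."""
    return BASE_MODULES | DEVICE_TEMPLATES[device_type]

print(sorted(build_image("kiosk")))
```

Scaling the system “up or down” then amounts to choosing a smaller or larger template for each class of device.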
With the release of this CTP, developers for specialized devices and intelligent systems now have access to all the key functionality of Windows 8. Windows Embedded Standard 8 targets the specialized device world with the same native security improvements to the stack and core operating system components as Windows 8. And since it aligns with the new Metro app model, Windows Embedded Standard 8 will allow developers to scale apps across types of specialized devices, customizing the interface for each type of device while maintaining their own proprietary branding and experience.
How to Get the CTP
Interested? Head over to the Windows Embedded Standard 8 download center and get started today. It is free, and the only limit is your imagination.
Intelligent Systems in the Inwindow Outdoor Experience Station [channelintel YouTube channel, Feb 9, 2012]
More information:
- InfoBlast Newsletter: February 2012 [Feb 24, 2012]
- Intelligent Systems in Retail on Display at RetailTech Japan
Windows Embedded, Windows Embedded Standard 7, Retail
March 08, 2012 - Microsoft Creates New Opportunities for Retail and Hospitality Organizations [Microsoft feature story, March 5, 2012]
- Life Care Centers of America Chooses Microsoft for Business Intelligence
Microsoft Office 365, Business Decision Maker
March 07, 2012 - Free Webinar Explores Intelligent Systems for Healthcare Device Development
Windows Embedded, Healthcare
March 06, 2012 - Microsoft Releases SQL Server 2012 to Help Customers Manage “Any Data, Any Size, Anywhere”
Ted Kummert, Business Intelligence, Business Decision Maker
March 06, 2012 - Intelligent Systems on Display at Microsoft TechDays [in Paris] [Microsoft feature story, Feb 9, 2012]
- Microsoft’s vision for intelligent systems in retail [WindowsEmbedded YouTube channel, Jan 16, 2012]
- Intelligent Systems Help Retailers Operate More Efficiently [Microsoft feature story, Jan 16, 2012]
- Intelligent Systems Featured at NRF EXPO [Microsoft feature story, Jan 16, 2012]
- Family Dollar: 7,000 Stores, 5,000 Items, One Intelligent System [Microsoft feature story, Jan 16, 2012]
- UK’s Co-Op Unearths Data Gold Mine [Microsoft feature story, Jan 16, 2012]
- Intelligent Systems for Healthcare [WindowsEmbedded YouTube channel, March 2, 2012]
- Microsoft Unveils Product Road Map Delivering on Intelligent Systems Vision [Microsoft feature story, Nov 14, 2011]
- Agile Development and Intelligent Systems [WindowsEmbedded YouTube channel, Nov 11, 2011]
- Microsoft Drives Agile Approach to Intelligent Systems [Microsoft feature story for the embedded roadmap, Nov 14, 2011]
Windows Embedded builds foundation for immersive experiences with Natural User Interface, helps customers harness data in new ways.
When Microsoft’s Windows Embedded group was formed in 1996, embedded technology was typically disconnected from company networks. Today, embedded devices not only talk to the network, in some cases they are the network; and the number of devices in existence is increasing rapidly. IDC Research predicts that by 2015 embedded CPUs will outnumber PC CPUs by six to one, and virtually all of them will be connected to the Internet, enabling a new kind of “intelligent system.”*
Under the direction of general manager Kevin Dallas, the Windows Embedded Business Group has mapped out a strategy for the move to intelligent systems, which includes how the company will work with hardware and software manufacturers to create specialized devices and solutions that help customers capture business intelligence.
A critical part of that plan is a shift in how Microsoft approaches product development. “In the past we generally built our software to enable rich experiences on the device and the ability to run the application on the device, and that’s where the task really stopped,” says Dallas. “In an intelligent system, these applications are distributed and aware of the system in which they run, and the cloud to which they’re connected.”
Dallas calls for partners to waste no time in thinking about how to drive value from the data that these intelligent systems can capture. He calls this the “big shift” that the company, and the industry, is making.
“You see more and more customers really trying to move toward these connected, more immersive experiences. We’ve seen it in retail, we’ve seen it in banking, and aspects of it have also appeared in the automotive and logistics industries,” says Dallas.
A More Agile Approach
Creating an intelligent system begins with tight integration between devices and the back-end infrastructure. To achieve that, the development team is optimizing the next version of the Windows Embedded platform to process structured and unstructured data generated by an array of devices. They’re also working on technologies that allow for more customization and differentiation, with features for touch, gesture and speech control, and a new user interface.
And to fuse these elements into a more cohesive experience, Microsoft is using an agile software development methodology, which is best summed up by the following values, as established by the Agile Alliance:
- Individuals and interactions over processes and tools
- Working software over comprehensive documentation
- Customer collaboration over contract negotiation
- Responding to change over following a plan
The primary goal of agile development is to improve collaboration and communication between programmers and business experts, and to streamline the development process through more frequent delivery of code.
Heading up this effort is Ben Smith, director of Program Management for Windows Embedded. Smith recently transitioned from product development for Xbox 360 and Kinect, during which time he introduced agile methodologies.
“The entire Xbox 360 development team, including the hardware division, embraced the idea of speeding up our development process, and of building on the foundation as we progressed. As a result, we were able to design what is essentially an intelligent system,” says Smith.
After the launch of Kinect for Xbox, Dallas approached Smith to share his vision for intelligent systems and big data. To Smith, the opportunities were immediately apparent.
More Focus, Less Monolith
According to Smith, the power and complexity of tomorrow’s distributed computing, such as intelligent systems, will require a shift from less frequent, full-scale software upgrades, often the industry standard, to ones that are more frequent and incremental.
“The industry has reached a point where successful companies are those that can iterate the smartest and drive value in terms of the customer experience,” says Smith.
With that in mind, Microsoft has made the following specific changes:
- Combining the development teams for each of the Windows Embedded solutions — Windows Embedded Standard, Windows Embedded Enterprise, Windows Embedded Compact 7 — into one larger team focused on creating many products with a common platform
- Adopting agile methodologies that help developers avoid last-minute feature cuts and respond to customer feedback with midstream course adjustments
- Creating more focused and frequent code release cycles
Smith says “partners should expect releases that are more focused — less like a Swiss Army Knife with 47 different features and a giant instruction manual, and more about specific scenarios that drive clear business and technical value.”
With this move to agile development comes a heavy emphasis on feedback to help drive and inform the development process. As such, partners should expect to receive software releases earlier than they have in the past.
One of the main applications of embedded technology is to provide a stable system that will power machines for long periods of time with little or no need for IT support. That mindset remains as part of the agile development method.
Says Smith: “We’re counting on our partners to use and provide feedback on releases that will come much earlier than typical software beta releases. Through this collaboration, we’ll be able to release a more polished final product in less time than ever before.”
* IDC, Worldwide Intelligent Systems 2011-2015 Forecast: The Next Big Opportunity, doc #230242, September 2011
Microsoft’s Road Map for Intelligent Systems [Nov 14, 2011]
For the past several months, Microsoft’s Windows Embedded Business has discussed the emergence of intelligent systems that can extend enterprise software and cloud services out to everyday devices. From the device to the cloud and back again, intelligent systems represent the intersection of technology and society by realizing the conceptual Internet of Things.
The opportunity is a big one, and under the direction of general manager Kevin Dallas, the Windows Embedded team has mapped out a strategy to help customers and partners make the most of it. Using new agile development methods, Windows Embedded delivers platforms and tools for hardware and software manufacturers to create intelligent systems that harness business intelligence in the enterprise.
“Intelligent systems offer endless possibilities for organizations to collect and make use of information,” Dallas says, “from understanding customer buying habits to tracking product shipments around the globe.”
Integrating tightly with Windows 8, Windows Embedded will help drive the innovation in intelligent systems. As part of the discussion, Microsoft recently laid out its plans for the Windows Embedded Enterprise, Windows Embedded Standard and Windows Embedded Compact platforms. Some highlights:
- Windows Embedded Enterprise v.Next will be available a quarter after Windows 8 is generally available for PCs
- Windows Embedded Standard v.Next will undergo a community technology preview for developers during the first quarter of 2012, with general availability three quarters after Windows 8. It will support the ARM architecture and all of the management and security functionality provided by Windows 8.
- Windows Embedded Compact v.Next will follow in the second half of 2012, introducing support for Visual Studio 2010.
MWC 2012: the 4G/LTE lightRadio network
March 8, 2012 3:57 pm / 7 Comments on MWC 2012: the 4G/LTE lightRadio network
Day 1: Alcatel-Lucent at MWC 2012 – lightRadio [AlcatelLucentCorp YouTube channel, Feb 28, 2012]
Telefónica unveils smart 4G experience at Mobile World Congress [Telefónica press release, Feb 26, 2012] or After a Year of Close Collaboration with Alcatel-Lucent and with the Support of Samsung, Telefonica Unveils Smart 4G Experience at Mobile World Congress [Alcatel Lucent press release, Feb 26, 2012]
– Telefónica’s smarter 4G delivers increased capacity in high data-traffic density areas and greater bandwidth than current LTE networks, as well as superior indoor coverage.
– Small-cell technology speeds up network deployment, reduces costs and makes more efficient use of spectrum.
– First LTE deployment of its kind in the 2.6 GHz frequency band and the first time that real users will be able to experience the benefits of the 4G technology of the future.
Telefónica today announced the first live experience of the world’s ‘smartest’ 4G network, in the most ambitious technological innovation ever deployed at the Mobile World Congress. The network – the world’s first of its kind in the 2.6 GHz frequency band – provides download speeds of 100 Mbps and upload speeds of 40-60 Mbps, vastly improves indoor coverage and can increase capacity by up to 400 per cent in high-density data-traffic areas.
Based on Alcatel-Lucent‘s lightRadio technology, Telefónica‘s network is a revolutionary first step towards a real ‘HetNet’ network, which greatly improves mobile coverage by bringing small-cell base stations closer to the customer. In this deployment conventional radio base stations co-exist with 4G metro cells (small base stations incorporating antennas and radio) working on the same frequency and with no interference. As a strong supporter of Telefónica‘s LTE service initiative, Samsung is also presenting the first LTE smart mobile devices for band 7 (2.6 GHz), the GALAXY S II LTE smartphone and GALAXY Tab 8.9 LTE tablet, which can be used on this new high-capacity network.
The most significant feature of Telefónica‘s LTE network is its increased capacity – with each cell comfortably supporting 30 people browsing simultaneously at an average speed of 30 Mbps. Tests conducted on this network show that a 400% increase in capacity can be made available to users compared to a conventional network design, and that significantly higher capacity gains could be delivered with a denser metro-network design. The network supports speeds of up to 10 times those offered by the 3G network, with download speeds of 100 Mbps, upload speeds of 40-60 Mbps and latency times of around 20-25 milliseconds.
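To put the quoted rates in perspective, here is a quick back-of-the-envelope calculation. The 100 Mbps downlink and the roughly 10x-slower 3G comparison come from the press release; the 700 MB file size is a hypothetical example chosen for illustration:

```python
# Transfer-time arithmetic at the quoted link rates (illustrative only).

def transfer_seconds(size_mb: float, rate_mbps: float) -> float:
    """Seconds to move size_mb megabytes at rate_mbps megabits per second."""
    return size_mb * 8 / rate_mbps

FILE_MB = 700          # hypothetical HD video file
LTE_DOWN_MBPS = 100    # LTE downlink from the press release
THREE_G_MBPS = 10      # ~10x slower, per the "up to 10 times" comparison

print(f"LTE: {transfer_seconds(FILE_MB, LTE_DOWN_MBPS):.0f} s")  # LTE: 56 s
print(f"3G:  {transfer_seconds(FILE_MB, THREE_G_MBPS):.0f} s")   # 3G:  560 s
```

A transfer that takes nearly ten minutes on the 3G baseline completes in under a minute at the quoted LTE rate.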
‘Today the future of mobile networks begins’, said Enrique Blanco, Telefónica‘s Chief Technology Officer. ‘The deployment of LTE that Telefónica has brought to the MWC, together with Alcatel-Lucent, gives us a glimpse of a tomorrow where everyone and everything is seamlessly connected, and in superfast time. But the challenge ahead is to ensure that all the technologies currently being deployed – 2G, 3G, LTE, Wi-Fi and Fiber – can co-exist to deliver next generation communications. Telefónica’s strategy is to develop intelligent networks that allow these different technologies to co-exist efficiently and cover customers’ growing connectivity needs in markets that are at different stages of development’.
Another key benefit of the advanced feature network is that it utilises the same frequency for several network layers, allowing for far more efficient use of spectrum. This solution would also reduce network deployment costs by as much as 40 per cent, as the installation of small-cells significantly reduces the amount of construction, installation and configuration work needed. Additionally, small-cells use less powerful amplification equipment, resulting in energy savings of up to 35 per cent and a guaranteed reduction in environmental impact.
Wim Sweldens, President of Alcatel-Lucent‘s Wireless Division said: ‘This close collaboration with Telefónica through our co-creation programme is a clear articulation of the future of mobile broadband – rather than merely evolving their current architecture, which was designed for voice and messaging, Telefónica is making fast progress toward building a mobile broadband network designed with the future in mind. The wireless network of the future needs to be lighter, greener and closer to customers and deliver much higher capacity – that is what lightRadio is all about’.
‘Samsung has been actively collaborating with Telefónica on LTE trial services with a varied device line-up, from LTE dongles to MiFi and LTE smartphones, across European markets. I am excited to join Telefónica’s breakthrough LTE demonstration with our representative LTE models, GALAXY S II LTE and GALAXY Tab 8.9 LTE’, said DJ Lee, executive vice president and head of the Sales and Marketing team for Samsung Mobile. ‘We are fully committed to supporting Telefónica’s LTE roll-out and hope to expand our LTE partnerships not only in European markets but also in Latin American markets. Our ambition is to become the number one LTE partner for Telefónica’.
Visitors to MWC will be able to experience first-hand the wide range of options generated by this innovative network at the Telefónica stand (Hall 8), where the smallest metro cells on the market and the latest LTE devices from Samsung are being exhibited. A video wall will demonstrate the capabilities of the network by projecting footage filmed live at the congress and delivered via LTE. The Samsung LTE devices will be used for high-definition videoconferences. Both applications will demonstrate the improvements in upload and download speeds. There will be real-time gaming demonstrations as well as augmented reality and hypermedia applications that highlight the low latency and versatility of the network.
Background information:
– Alcatel-Lucent collaborates with Telefonica to bring the first superfast 4G mobile broadband services to consumers in Spain [Alcatel-Lucent press release, Sept 14, 2011]
– lightRadio™: Evolve your wireless broadband network for the new generation of applications and users [Alcatel-Lucent microsite, Feb 27, 2012]
– Wi-Fi goes mobile with Alcatel-Lucent [Alcatel-Lucent press release for lightRadio WiFi, Feb 14, 2012]
– lightRadio WiFi [Alcatel-Lucent microsite, Feb 14, 2012]
Day 2: Alcatel-Lucent at MWC 2012 – Bell Labs [AlcatelLucentCorp YouTube channel, Feb 29, 2012]
lightRadio™ Network: A New Wireless Experience [ by Rasika Abeysinghe, PhD in Enriching Communications, Alcatel-Lucent business e-zine, Feb 27, 2012]
Highlights
- Operators can deliver up to 10-times more capacity by deploying a 1:10 ratio of small- to macro cells
- The lightRadio Network can improve capacity by up to 70%
- A modular and virtualized architecture lets operators adapt to changing usage patterns
The rising demand for mobile broadband services is straining legacy wireless networks. Operators face increasing pressure to deliver the rich quality of experience (QoE) their customers and partners expect. To meet these expectations and remain competitive, they need cost-effective and sustainable network architectures that can deliver increased connectivity and capacity on demand.
QoE: The new currency in the mobile value chain
As the focus of mobile wireless communications shifts from voice to data, users attach greater importance to QoE. Today’s users expect fast wireless networks, comprehensive coverage and uninterrupted connectivity. There’s no room for delays, dropped connections or peak-time congestion in their vision of mobile broadband.
Users clearly value QoE, but application and content providers (ACPs) depend on it. Whether ACPs offer TV streams, interactive apps or video conferencing services, QoE plays a central role in their success. They have a vested interest in ensuring that users enjoy the best possible experience. For this, ACPs rely on mobile operators and their networks.
To move up the mobile value chain and attract partnerships with ACPs, operators have to deliver on QoE. Operators can control QoE, for example, by managing bit rates or by making it easy for users to switch between 2G/3G/LTE networks and Wi-Fi hotspots. But they need to control it more efficiently to prove their value as partners and providers and position themselves as the ideal channel for delivering value-added applications and content.
The QoE and capacity challenge
Legacy macro networks were built to support voice services, a task they perform extremely well. But the demand for mobile broadband data services adds new and more complex challenges to wireless networks. Operators who retrofit voice networks for data face a host of new challenges.
For example, operators don’t always have spectrum for mobile broadband services. This makes it tough to meet demand for data. Increasing indoor wireless use also presents problems. Outdoor macro towers can’t always deliver sufficient data rates, coverage and capacity to users in homes and offices.
Today, operators are constantly trying to squeeze more capacity out of legacy networks. One common strategy is cell splitting — adding cells, towers and sites. This can be complex and expensive, and zoning rules can even make it impossible in some areas. Operators that don’t evolve their networks — or don’t evolve fast enough — may be left behind by customers and competitors who embrace next-generation equipment.
Building wireless networks for an unpredictable future
Mobile operators want wireless networks that can help them tackle the challenges of today and tomorrow. These challenges include:
- Adding capacity where users want and need it
- Ensuring that customer QoE is met
- Building a cost-effective foundation for addressing future demand
- Delivering eco-sustainable solutions
Smartphone penetration and mobile data traffic are increasing rapidly. According to Vision Mobile, in the 3rd quarter of 2011 smartphones surpassed 29% of global handset shipments.[1] People still use their phones mostly for voice — on a time basis. However, they consume more data with apps including video streaming, music, web browsing and social networking from their homes, offices and in the community. They connect to hotspots in high-traffic areas like stadiums, public squares and hotels. Operators have to provide more capacity in more locations to ensure that QoE follows users wherever they go.
While no one can say for certain what capacity needs will be in 5 years, we do have reasonably good models for the next 6 months to a year. However, if a new type of device like the Apple iPhone® or iPad®[2] arrives on the market it could cause a major disruption. What we know right now is that new wireless devices — smartphones, tablets, gaming consoles, in-car devices — will fuel demand by supporting smarter applications and richer content. Wireless networks will need to be flexible enough to handle whatever demand the future brings. And they’ll need to do it while keeping costs low.
It’s not all about delivering more capacity and richer experiences. Operators need to consider the environment, too. The next generation of wireless network architectures must have a smaller carbon footprint. This means consuming less power. It also means deploying elements that use less space and blend in with what’s around them. No one wants to see more towers and more bulky equipment.
The lightRadio™ Network advantage
Alcatel-Lucent has introduced the lightRadio Network to empower operators to deliver on their present and future challenges. It seamlessly increases capacity and extends it to more places, helping operators satisfy users and generate new revenue. It reduces power consumption and footprint, enabling operators to promote sustainability and bottom-line growth. And it provides an effective foundation for supporting future demand, helping operators manage capacity and cost.
For users, it all comes down to QoE. With the lightRadio Network, users get higher throughputs to support the rich experiences they crave. In contrast to traditional wireless networks, this support is continuous: Whether indoors, outdoors or on the move, users switch seamlessly to the best possible network. There’s no need to pause a video or interrupt an application to select a hotspot or enter a password.
A closer look
The lightRadio Network is inherently heterogeneous, bringing together a broad range of technologies and different types of access nodes. At the same time, the architecture is homogenous: Its components share the same platforms, control and management. These components can include:
- Small cells, which extend coverage indoors and in hotspots. Small cells perform efficiently in residences and businesses. They work best when deployed close to users, for example, on lamp posts or walls in train stations or shopping centers. In a given network, operators can deliver up to 10-times more throughput by deploying a 1:10 ratio of small cells to macro cells.[3]
- lightRadio wideband active antenna arrays (WB-AAA), popularly known as cubes, that use advanced interference management algorithms to create overlapping zones of high signal strength. Known as vertical sectorization, this increases capacity and coverage for a given area. These comparatively low-power elements make more efficient use of spectrum. When deployed in a macro environment, they can improve capacity by up to 70%.[4] This improved capacity can help operators attract users and generate more revenue.
- Wi-Fi hotspots, which give high-bandwidth data users additional access options. This has the dual benefit of keeping the end user satisfied and allowing the operator to take some traffic off costly cellular spectrum. The lightRadio architecture uses a common core network to support Wi-Fi and cellular access. Users can seamlessly switch between the two without having to enter a new password.
All of these components support sharing and virtualization, which help operators deliver more flexible capacity and control. For example, operators can connect lightRadio cubes to external baseband units (BBUs) to serve hotspots that require massive capacity, such as sports arenas. Or, operators can scale and share control capacity to cost-effectively improve performance at specific places and times. This can help overcome traffic spikes that arise as new devices connect to the data network.
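The “up to” gains quoted for these components can be sketched numerically, normalized to one conventional macro cell. The framing below is my own illustrative arithmetic based on the figures in the article, not a calculation Alcatel-Lucent publishes:

```python
# Normalized best-case capacity sketch for the lightRadio figures above.

BASELINE = 1.0  # one conventional macro cell, normalized

def waaa_capacity(baseline: float, gain: float = 0.70) -> float:
    """Macro site upgraded with WB-AAA cubes: up to 70% more capacity."""
    return baseline * (1 + gain)

def small_cell_throughput(baseline: float, factor: float = 10.0) -> float:
    """1:10 small-cell-to-macro deployment: up to 10x more throughput."""
    return baseline * factor

print(waaa_capacity(BASELINE))          # best case with vertical sectorization
print(small_cell_throughput(BASELINE))  # best case with dense small cells
```

Because both figures are “up to” claims, real deployments would land somewhere below these numbers depending on spectrum, siting and traffic mix.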
Making the move
Operators face no significant barriers to making the move to the lightRadio Network. While each operator has a unique starting point based on its own business needs and operating environment, they all have a number of things in common. They need modular, flexible wireless networks that can address data demand and keep costs in check.
This new network architecture helps operators kick-start transformation with the wireless infrastructure, spectrum and multivendor networks they have now. An effective transformation includes:
- Targeting capacity problems in hotspots and indoors
- Migrating to LTE for efficient spectrum usage
- Adding a WB-AAA architecture for more capacity per site
- Virtualizing capacity and control for more flexibility
Operators can control costs by scaling capacity in manageable increments. These strategies and savings can extend to many parts of the network, including wireless backhaul links, small sites and legacy equipment.
By alleviating concerns about capacity, scalability and cost, the lightRadio network architecture offers operators the chance to rethink the challenges of the present and future. It can help them swap a defensive stance — coping with demand — for a positive approach focused on turning mobile broadband demand into new revenue.
To contact the author or request additional information, please send an email to enrich.editor@alcatel-lucent.com.
Footnotes
[1] Mobile Platforms: The Clash of Ecosystems, VisionMobile, Nov. 2011; http://www.visionmobile.com/blog/2011/11/new-report-mobile-platforms-the-clash-of-ecosystems/. ↩
[2] iPhone® and iPad® are trademarks of Apple Inc. ↩
[3] Based on Alcatel-Lucent study, 2011. ↩
[4] ibid. ↩
Day 3: Alcatel-Lucent at MWC 2012 – Cloud solutions [AlcatelLucentCorp YouTube channel, Feb 29, 2012]
Alcatel-Lucent and China Mobile accelerate development of lightRadio™ to support exploding customer demand for mobile broadband in China [Alcatel Lucent press release, Feb 28, 2012]
Teams sign lightRadio™ architecture co-creation agreement to accelerate the development and delivery of lightning fast mobile broadband services in a sustainable way
Alcatel-Lucent (Euronext Paris and NYSE: ALU) and China Mobile have signed a co-creation agreement under which teams from the two companies are conducting joint development and test activities on a series of lightRadio projects at Alcatel-Lucent’s Stuttgart lab. The work will help to accelerate the smooth commercial introduction of this groundbreaking new product family to meet China Mobile’s business initiatives and support growing customer demand for high-bandwidth mobile broadband services.
China Mobile is the world’s largest mobile provider. Benefiting from the dramatic growth in mobile Internet use, China Mobile had around 650 million subscribers by the end of 2011, and that number is growing at a rate of 11.2% year over year. Those customers are increasingly adopting smartphones, tablets and other mobile devices, which is driving massive demand for high-bandwidth mobile data services such as video and Internet gaming.
This co-creation agreement follows the non-binding MoU signed between Alcatel-Lucent and China Mobile in mid 2011. It also builds on the collaboration between the companies on the delivery of superfast mobile broadband using TD-LTE technology and the announcement of the first Trans-Pacific lightRadio video call. This agreement defines the projects that will be undertaken by the two companies and kicks off the co-development activities. This collaboration will speed the introduction of the lightRadio product prototype to the second half of the year.
Alcatel-Lucent’s lightRadio™ reduces traditional mobile base stations to the size of a Rubik’s cube, while lowering power consumption and allowing the transfer of vast amounts of data at lightning-fast speeds.
Ben Verwaayen, CEO of Alcatel-Lucent, said: “The ability for us to pair with the world’s largest mobile provider to gain deeper insight into its customer behaviours and the way services are evolving will ensure we develop lightRadio in the right direction. The knowledge we gain from implementing lightRadio to support the delivery of mobile broadband services to 650 million people will help us to meet the growing demand for services across the globe.”
The co-creation agreement was signed by Bill Huang, president of China Mobile Research Institute and Wim Sweldens, president of Alcatel-Lucent Wireless Division, on January 13 at Alcatel-Lucent’s Bell Labs headquarters in Murray Hill, New Jersey. Li Zhengmao, Vice President of China Mobile, and Jeong Kim, President of Alcatel-Lucent Bell Labs attended the signing ceremony.
Under the terms of the agreement, China Mobile engineers have already begun working with Alcatel-Lucent’s R&D team in the company’s lab in Stuttgart, Germany. The two companies will work together on a series of lightRadio joint development projects including the cube-based radio, baseband unit (BBU) pooling and redefining the radio architecture. By bringing together talent from both companies, these projects will support China Mobile’s evolving business initiatives, including the introduction of high-speed TD-LTE mobile broadband technology, encourage idea generation and facilitate the smooth commercial implementation of lightRadio.
Day 4: Alcatel-Lucent at MWC 2012 – Rise above the data storm [AlcatelLucentCorp YouTube channel, March 2, 2012]
Video of Day 4 at Mobile World Congress 2012. The big discussion at MWC is about the data storm, the demand for mobile broadband services driven by customers and anytime, anywhere access to these services. … How do operators rise above the data storm? With capacity (serving more users in the peak hours), lower costs per bit (to reach the mass market), and more frequent monetization (65% of the time or more the networks are not at peak). How do they do that? They do that with lightRadio, which gives them an IP platform, an end-to-end mobile broadband platform. … Best infrastructure technology 2012 goes to lightRadio and the Cube. … “This thing has transformed our company. It has transformed the way that we develop products. It has transformed the way people looked at us. And this will for a long period to come be the symbol of the new Alcatel-Lucent.” — Ben Verwaayen, CEO, Alcatel-Lucent.
MWC: The fastest show on Earth [by Ben Verwaayen, CEO, Alcatel-Lucent, March 1, 2012]
And the GSMA award for best infrastructure technology goes to… lightRadio!
MWC, the fastest show on earth for anybody who is somebody in the telecoms world, is always an amazing event.
What is clear this year is that the world is flatter than ever before. Innovation is coming from all corners of the globe and there is no region that is excluded from the information and video revolution.
As a result, players are on the move. Policy makers, regulators, operators, technology providers, service providers, application developers — all are chasing the customer who is driving the change.
She happens to be 14 years old and she was dearly missed at the exhibition. Her focus is services: video, chatting, facebooking on multiple screens.
She wants to be identified as a person, recognized for her preferences and protected against undue approaches. She doesn’t care about cloud or tablet or smartphone, she wants cool services on a cool device and be able to afford it.
So many people agree with her that market dynamics are changing worldwide.
Alcatel-Lucent had a great win in Barcelona. We won the infrastructure of the year award for lightRadio, our cube-sized base station, which performed wonders in real traffic for Telefonica. Customers loved the stand, and they agreed on the need to make the network relevant in the journey ahead. We have never seen so much interest in our latest portfolio of Services, Applications and Products.
So the days were packed and staff worked 24/7, but it was a great experience.
Ben
Details of the lightRadio technology (copied from the earlier post: Good TD-LTE potential for target commercialisation by China Mobile in 2012 [July 13, 2011 – Feb 8, 2012])
lightRadio: Alcatel-Lucent at “Best Practice Live” virtual conference [July 5, 2011]
lightRadio™ is a disruptive wireless architecture that gives operators the opportunity to develop next-generation converged 2G/3G/LTE radio networks. Valérie Layan, VP Wireless Solutions EMEA at Alcatel-Lucent, outlined how this unique solution offers a dramatically new way of building networks that will enable macro and small cell integration, offer OPEX savings of more than 50% compared to classic BTS designs, and set the course for wireless & wireline convergence.
LIGHTRADIO CONNECTS THE WORLD [June 15, 2011]
The world’s first long-distance, high-quality mobile video call using lightRadio™ – a breakthrough system pioneered by Alcatel-Lucent (Euronext Paris and NYSE: ALU) to transform the economics and efficiency of mobile telephony – has successfully taken place from the historic desk of Alexander Graham Bell.
Industry executives, technology leaders and analysts witnessed the inaugural lightRadio video call made from the headquarters of Bell Labs, the innovation engine of Alcatel-Lucent and now home to Graham Bell’s desk, from which he made the first-ever long-distance phone call.
Chris Lewis, Group Vice President at industry analyst firm IDC, hosted the call from Bell Labs in Murray Hill, New Jersey, connecting with Ben Verwaayen, Chief Executive of Alcatel-Lucent, in Paris, and delegates at a business conference in Miami.
lightRadio is the name of a family of technologies which are set to transform mobile communications, improving the quality of network services for consumers while dramatically reducing the size, carbon footprint and energy consumption of mobile base stations.
After participating in the call, Ben Verwaayen, said: “We have taken lightRadio from the drawing-board to a fully working system, creating an entirely new system to connect customers around the world.”
The launch of lightRadio will help address exploding demand for mobile broadband services and increasing global consumption of wireless content. This has been fuelled by the adoption of smartphones and the popularity of video applications, social networking and mobile gaming services – all requiring wireless service providers to provide greater speed and capacity everywhere.
Network operators such as France Telecom/Orange, Telefonica and China Mobile are now engaged with Alcatel-Lucent in co-creating the market implementation of lightRadio. The system is expected to deliver significant operational savings for carriers and infrastructure owners by marking an end to the existing system of complex base stations and large cell towers.
This week’s inaugural call demonstrates lightRadio’s ability to handle high levels of data, meeting demand from customers increasingly using mobile video on Internet networks. Among the breakthroughs promised by the system, it will reduce mobile network energy consumption by 50% compared with current equipment; enable roll-out of mobile broadband services to new markets using sustainable power sources; and deliver major savings for operators.
Alcatel-Lucent predicts that lightRadio will help cut the cost of mobile infrastructure sites, energy consumption, operations and maintenance. Bell Labs estimates that the total cost of ownership of mobile networks, the sum spent by mobile operators on access systems, reached 150 billion Euros in 2010.
More information about Alcatel-Lucent’s lightRadio portfolio can be found online at http://www.alcatel-lucent.com/lightradio.
China Mobile and Alcatel-Lucent partner to develop next-generation RAN [Feb 15, 2011]
Alcatel-Lucent today announced it has signed a Memorandum of Understanding (MOU) with China Mobile, the world’s largest mobile operator and a leader in TD-SCDMA and TD-LTE, for the development of a next-generation radio access network (RAN). The MOU was signed by Alcatel-Lucent Shanghai Bell, Alcatel-Lucent’s flagship company in China.
Alcatel-Lucent and China Mobile will jointly launch technical and economic studies and investigate the technologies essential to building a centralized, collaborative, Cloud-based RAN (C-RAN) in order to set new standards for cost-effectiveness, network intelligence and energy efficiency (“green”). The C-RAN will provide a common platform for multi-mode wireless standards such as GSM, 3G, and LTE, making it possible to significantly improve network quality and coverage, reduce transmission resource consumption, and lower OPEX by up to 50% and CAPEX by 15%.
Rajeev Singh-Molares, president of Alcatel-Lucent’s activities in Asia-Pacific, said: “The partnership with China Mobile directly addresses the challenges of high energy costs, the explosion of mobile video and sustainable development. By helping them replace traditional network designs with flexible cloud-like architectures, we are preparing for the future and helping to show the way in terms of technology and economic models.”
The strategic partnership for C-RAN will leverage Alcatel-Lucent’s recently-announced lightRadio, a breakthrough in mobile and broadband infrastructure to streamline and radically simplify mobile networks. Pioneered by Bell Labs, Alcatel-Lucent’s research and development arm, the new lightRadio system will dramatically reduce operating costs, technical complexity and power consumption. This is accomplished by taking today’s base stations and massive cell site towers, typically the most expensive, power hungry, and difficult to maintain elements in the network, and radically reducing and simplifying them.
lightRadio represents a new architecture where the base station, typically located at the base of each cell site tower, is broken into its component elements and distributed through the antenna or the network for cloud-like processing. Additionally, the various cell site tower antennas are combined and shrunk into a single, small, powerful, Bell Labs-pioneered multi-frequency, multi-standard (2G, 3G, LTE) device that can be mounted on poles, sides of buildings or anywhere else there is power and a broadband connection.
The partnership with China Mobile also reflects Alcatel-Lucent’s strong commitment to sustainable development and to green technology, as testified, in particular, by its leading role in the GreenTouch™ Consortium, a global research initiative dedicated to dramatically improving the energy efficiency of information and communications technology (ICT) networks by a factor of 1,000. GreenTouch™ recently presented a Large-Scale Antenna System proof of concept offering the potential for tremendous energy savings thanks to its novel wireless transmission techniques.
Alcatel-Lucent maps the future of mobile technology [Feb 7, 2011]
Alcatel-Lucent (Euronext Paris and NYSE: ALU) today announced lightRadio™, a breakthrough in mobile and broadband infrastructure that streamlines and radically simplifies mobile networks. The solution was unveiled at a major press launch event in London supported by partners Freescale and HP.
Pioneered by Bell Labs, Alcatel-Lucent’s unique research and development arm, the new lightRadio system will dramatically reduce technical complexity and contain power consumption and other operating costs in the face of sharp traffic growth. This is accomplished by taking today’s base stations and massive cell site towers, typically the most expensive, power hungry, and difficult to maintain elements in the network, and radically shrinking and simplifying them.
lightRadio represents a new architecture where the base station, typically located at the base of each cell site tower, is broken into its component elements and then distributed into both the antenna and throughout a cloud-like network. Additionally, today’s clutter of antennas serving 2G, 3G, and LTE systems is combined and shrunk into a single powerful, Bell Labs-pioneered multi-frequency, multi-standard Wideband Active Array Antenna that can be mounted on poles, sides of buildings or anywhere else there is power and a broadband connection.
Alcatel-Lucent’s new lightRadio product family, initial elements of which will be ready to begin customer trials in the second half of 2011, provides the following benefits:
- Improves the environment: lightRadio reduces energy consumption of mobile networks by up to 50% over current radio access network equipment. (As a point of reference, Bell Labs research estimates that base stations globally emit roughly 18,000,000 metric tons of CO2 per year.) Also, lightRadio provides an alternative to today’s jungle of large, overcrowded cell site towers by enabling small antennas anywhere.
- Addresses the digital divide: By reducing the cell site to just the antenna and leveraging future advances in microwave backhaul and compression techniques, this technology will eventually enable the easy creation of broadband coverage virtually anywhere there is power (electricity, sun, wind) by using microwave to connect back to the network.
- Offers major savings for operators: Thanks to lightRadio’s impact on site, energy, operations and maintenance costs, and when combined with small cells and LTE, this new solution can reduce the total cost of ownership (TCO) of mobile networks by up to 50%. (As a point of reference, Bell Labs estimates that the TCO spent by mobile operators on mobile access in 2010 was 150 billion Euros.)
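As a rough illustration of the scale those figures imply: the 150 billion Euro baseline and the up-to-50% reduction come from the text above, while the calculation itself is just a sketch, not Alcatel-Lucent data.

```python
# Back-of-the-envelope sketch of the TCO savings implied above.
# The 150 billion EUR baseline and "up to 50%" figure come from the text;
# applying one to the other is purely illustrative.

TCO_2010_BEUR = 150.0   # Bell Labs estimate: mobile access TCO in 2010
MAX_REDUCTION = 0.50    # "up to 50%" with lightRadio + small cells + LTE

def projected_tco(baseline_beur: float, reduction: float) -> float:
    """Return the projected TCO after applying a fractional reduction."""
    if not 0.0 <= reduction <= 1.0:
        raise ValueError("reduction must be a fraction between 0 and 1")
    return baseline_beur * (1.0 - reduction)

savings = TCO_2010_BEUR - projected_tco(TCO_2010_BEUR, MAX_REDUCTION)
print(f"Potential industry-wide savings: {savings:.0f} billion EUR/year")
```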
Ben Verwaayen, CEO of Alcatel-Lucent, said: “lightRadio is a smart solution to a tough set of problems: high energy costs, the explosion of video on mobile, and connecting the unconnected.”
Alain Maloberti, Senior Vice President, Network Architecture and Design, France Telecom/Orange, said: “Alcatel-Lucent’s new vision and strategy for mobile broadband is quite exciting: the new wireless network architecture and innovative radio proposal will potentially help us to achieve significant operating cost savings and be better prepared for future challenges. We look forward to working closely with Alcatel-Lucent to explore and test this new approach.”
Tom Sawanobori, VP Technology Planning, Verizon Wireless, said: “Verizon looks forward to learning more about the benefits of lightRadio technology and how they could be applied as we continue to expand and evolve our LTE network.”
Alcatel-Lucent is also in advanced planning with China Mobile as well as a number of other carriers around the globe around co-creation and field trials of the lightRadio solution.
Alcatel-Lucent studies have concluded that the total addressable opportunity for the multi-technology radio market[1], which lightRadio addresses, will be over 12 billion Euros in 2014, representing more than 55% of the total wireless RAN market. The cumulative total addressable market will be over 100 billion Euros from 2011-2018.
Alcatel-Lucent’s lightRadio portfolio integrates a number of breakthrough innovations and technologies from Alcatel-Lucent’s Bell Labs research arm and ecosystem of companies:
| Market Impact | Technology Innovation |
| --- | --- |
| A new generation of active antennas allows vertical beam-forming that improves capacity in urban and suburban sites by about 30%, supports all technologies (2G, 3G, and LTE) and covers multiple frequency bands with a single unit. | lightRadio cube – A unique Bell Labs antenna technology, the lightRadio Cube includes an innovative diplexer type, radio, amplifier, and passive cooling in a small cube that fits in the palm of the hand. |
| By moving former basestation components to a System on a Chip (SOC), lightRadio places processing where it fits best in the network – whether at the antenna or in the cloud. | System-on-a-chip (SoC), jointly developed with Freescale Semiconductor, integrates intelligent software from Alcatel-Lucent onto fully remotely programmable state-of-the-art hardware. |
| The economics of radio networks are substantially improved by reducing the number and cost of fiber pairs required to support the traffic between the antenna and the centralized processing in the cloud. | Unique compression algorithms provide nearly a factor of three compression of IQ sample signals. |
| Matching of load to demand through ‘elastic’ controller capacity, delivered on sets of distributed and shared hardware platforms, will improve cost, availability, and performance of wireless networks. | Virtualized processing platforms: Alcatel-Lucent will use innovative virtualization software and will collaborate with partners like HP to enable a cloud-like wireless architecture for controllers and gateways. |

The lightRadio Product Family: The new Alcatel-Lucent lightRadio product family is composed of the following components: Wideband Active Array Antenna, Multiband Remote Radio Head, Baseband Unit, Controller, and the 5620 SAM common management solution. The Wideband Active Array Antenna will be trialed later this year and have broad product availability in 2012. Additional product family members will become available over 2012, 2013 and 2014.
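The compression figure mentioned above (nearly a factor of three on IQ samples) translates directly into fewer fiber pairs between the antenna site and the centralized processing. A minimal sizing sketch, assuming made-up, roughly CPRI-like link and per-antenna rates (none of these numbers come from Alcatel-Lucent):

```python
# Illustrative fronthaul sizing with and without ~3x IQ compression.
# Both rate constants are assumed example figures, not vendor data.
import math

FIBER_LINK_GBPS = 2.5      # assumed capacity of one fiber pair
IQ_PER_ANTENNA_GBPS = 1.2  # assumed raw IQ sample stream per antenna path

def fiber_pairs_needed(antenna_paths: int, compression: float = 1.0) -> int:
    """Fiber pairs needed to carry the aggregate IQ traffic to the cloud."""
    total_gbps = antenna_paths * IQ_PER_ANTENNA_GBPS / compression
    return math.ceil(total_gbps / FIBER_LINK_GBPS)

paths = 8  # e.g. a multi-sector, multi-band site
print("uncompressed:", fiber_pairs_needed(paths))               # 4 pairs
print("with ~3x compression:", fiber_pairs_needed(paths, 3.0))  # 2 pairs
```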
For detailed information on these elements, as well as a webcast replay of today’s press conference, please visit http://www.alcatel-lucent.com/lightradio (replay available at 2:30 pm GMT). The lightRadio approach and technology path will be shown and explained further at Mobile World Congress in Barcelona on 14-17 February.
[1] The multi-technology radio market consists of radio access base stations that simultaneously support 2G, 3G, and LTE, and multiple frequencies, in the same platform.
“Alcatel-Lucent’s lightRadio approach is a revolutionary step in evolving traditional telecommunication networks to more heterogeneous networks with higher capacity and lower cost,” said Lisa Su, Senior Vice President and General Manager of Freescale’s Networking and Multimedia Group. “Freescale is collaborating with Alcatel-Lucent to provide the chip-based architectures through our new system-on-chip technology that supports the highly-flexible, multi-standard, programmable capability required to make lightRadio a reality.”
“Communication service providers will be better able to meet the shifting and growing demands placed on their networks as a result of the new lightRadio product family from Alcatel-Lucent,” said Sandeep Johri, vice president, Strategy and Solutions, Enterprise Business, HP. “As part of the lightRadio evolution, HP intends to work with Alcatel-Lucent in a co-creation fashion around the use of cloud and virtualization technologies in the mobile access space.”
“The day has finally come when service providers need to take a serious look at the road ahead in terms of technology and their economic models,” said Phil Marshall of Tolaga Research. “To survive and thrive, service providers must evolve network designs, embrace small cell sites and all-IP architectures and replace traditional network designs with flexible cloud-like architectures that can truly meet the data demands of the future.”
The Disappearing Mobile Masts and Towers [Feb 7, 2011]
The looming global gridlock in mobile communications may be averted following today’s launch of pioneering technology that will remove the bottlenecks constraining mobile networks and help deliver universal broadband coverage.
Alcatel-Lucent (Euronext Paris and NYSE: ALU), the leading network technology group, has joined forces with industry partners to develop lightRadio™, a new system that signals the end of the mobile industry’s reliance on masts and base stations around the world.
Ben Verwaayen, Chief Executive Officer of Alcatel-Lucent, said: “Today’s and tomorrow’s demands for coverage and capacity require a breakthrough in mobile communications.”
He added: “lightRadio will signal the end of the basestation and the cell tower as we know it today.”
Governments and regulatory bodies are expected to welcome the technical development, which will help meet targets for universal broadband access by laying the foundation to address the so-called “digital divide.”
Other major benefits from lightRadio™ include:
- Shrinking the carbon footprint of mobile networks by over 50%
- Reducing the Total-Cost-of-Ownership of mobile operators by up to 50%
- Improving end user services by significantly increasing bandwidth per user thanks to the deployment of small antennas everywhere
Wim Sweldens, President of Alcatel-Lucent’s Wireless Division said: “lightRadio will help mobile operators evolve their networks to address the mobile broadband deluge.”
lightRadio represents a new approach where the base station, typically located at the base of each cell site tower, is broken into its components elements and then distributed into both the antenna and throughout a cloud-like network.
lightRadio also shrinks today’s clutter of antennas serving 2G, 3G, and LTE systems into a single powerful, Bell Labs-pioneered antenna that can be mounted on poles, sides of buildings or anywhere else there is power and a broadband connection.
The innovation coincides with growing demand for third- and fourth-generation mobile networks and devices, involving the mass adoption of wireless television services and other forms of broadband content. The total addressable market for the radio technology necessary to serve such networks and devices is expected to exceed €100bn[1] over the next seven years.
Alcatel-Lucent announced the lightRadio™ technical specifications and launch timetable at an industry event in London today. Visit http://www.alcatel-lucent.com/lightradio for the product press release and a link to the event replay (available at 2:30 GMT).
[1] This is the total addressable market for multi-technology radio solutions that consist of radio access base stations that simultaneously support 2G, 3G, and LTE, and multiple frequencies in the same platform
Freescale introduces industry’s first multimode wireless base station processor family that scales from small to large cells [Feb 14, 2011]
Freescale Semiconductor – the communications processing leader and provider of industry-leading DSP technology – is transforming the future of wireless infrastructure equipment with the introduction of a highly integrated base station-on-chip portfolio built on advanced heterogeneous multicore technology. Freescale’s new QorIQ Qonverge series is the first scalable family of products sharing the same architecture to address multi-standard requirements spanning from small to large cells.
The explosion of smart connected devices with increasing data and video content has created a mobile data tsunami, requiring OEMs and carriers to dramatically boost network performance while controlling capital expenditure costs, increasing power efficiency and supporting the emergence of 4G technologies.
The QorIQ Qonverge portfolio of base station-on-chip products is based on a common architecture and integrates communications processing, digital signal processing and wireless acceleration technologies into a single system-on-chip in various configurations optimized for next-generation femtocell, picocell, metrocell and macrocell base stations. Advanced process technology and exceptional integration allow the convergence of multiple functions traditionally performed on separate FPGAs, ASICs, DSPs and processors to be incorporated on a single device. This integration lowers part counts and delivers significant power, cost and footprint reductions for base stations. The common architecture spanning from femto cells to macro cells optimizes R&D investments and software reuse.
“The current explosion in mobile data traffic worldwide provides unique challenges and opportunities for wireless infrastructure equipment providers as they race to increase capacity and capability,” said Lisa Su, senior vice president and general manager of Freescale’s Networking and Multimedia Group. “Freescale’s highly integrated QorIQ Qonverge portfolio enables base station manufacturers to provide a dramatic, step-function improvement in performance, power and cost in a single, flexible architecture.”
QorIQ Qonverge technology can deliver a 4x cost reduction and 3x power reduction for LTE + WCDMA macro base stations, and 4x cost and power reductions for LTE + WCDMA pico base stations when compared to wireless infrastructure equipment powered by discrete silicon products.
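It is worth making the quoted multipliers explicit: a "4x reduction" means the integrated part sits at one quarter of the discrete-silicon baseline, not 4% of it. A small sketch with normalized baselines (the normalization to 1.0 is assumed for illustration, not Freescale data):

```python
# Relative cost/power of an integrated base-station-on-chip vs. a
# discrete-silicon design, using the multipliers quoted above with the
# discrete baseline normalized to 1.0 (an assumption for illustration).

def integrated_fraction(reduction_factor: float) -> float:
    """An 'Nx reduction' leaves the integrated part at 1/N of the baseline."""
    if reduction_factor <= 0:
        raise ValueError("reduction factor must be positive")
    return 1.0 / reduction_factor

macro_cost = integrated_fraction(4.0)   # LTE + WCDMA macro: 4x cost reduction
macro_power = integrated_fraction(3.0)  # LTE + WCDMA macro: 3x power reduction
pico_cost = pico_power = integrated_fraction(4.0)  # pico: 4x on both

print(f"macro SoC: {macro_cost:.0%} of discrete cost, {macro_power:.1%} of power")
print(f"pico SoC:  {pico_cost:.0%} of discrete cost and power")
```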
“By integrating multiple industry-leading technologies into one scalable product line, Freescale’s QorIQ Qonverge portfolio delivers significant innovation that advances the state of wireless networking at this pivotal time for the industry,” said Will Strauss, president and principal analyst of Forward Concepts. “The QorIQ Qonverge portfolio presents a unique solution and strengthens Freescale’s position as a processing technology leader in the wireless infrastructure space.”
Freescale leveraged its broad R&D scale, deep application knowledge of the wireless space and extensive IP portfolio to develop the new product family. QorIQ Qonverge processors combine multiple Power Architecture® cores and high-performance StarCore DSPs with a MAPLE multimode baseband accelerator, packet processing acceleration engines, interconnect fabric and next-node process technology. The portfolio’s products support multiple standards, including GSM, LTE – FDD & TDD, LTE-Advanced, HSPA+, TD-SCDMA and WiMAX. In addition, the family’s flexible architecture allows support for evolving standards with software upgrades.
“Freescale’s innovative QorIQ Qonverge platform provides the integration, performance, energy efficiency and unmatched scalability that our new lightRadio™ product portfolio requires,” said Wim Sweldens, president of Alcatel-Lucent’s Wireless Division. “Game-changing products like lightRadio disaggregate the base station between the network and the wideband active antenna, produce dramatic cost savings and need components that provide giant leaps forward such as Freescale’s new QorIQ Qonverge technology.”
“Freescale’s QorIQ Qonverge product line gives us the flexibility to cost-effectively address the widest possible small cell market by providing a common architecture and multimode capabilities, along with the programmability for us to incorporate our own advancements,” said Michael Clark, Airvana’s general manager for femtocell business. “We look forward to working with Freescale to help accelerate the deployment of small cells in next-generation wireless networks.”
According to analyst firm Infonetics, radio access network base station spending is projected to be $197 billion worldwide over the next four years.
Complete solutions
Customers can develop best-of-breed solutions with ease by combining their own differentiated IP with off-the-shelf components from Freescale and ecosystem partners. Freescale has assembled a rich ecosystem of technology leaders focused on wireless applications. Products and services from these partners can be combined with third party tools, as well as Freescale’s CodeWarrior technologies and VortiQa application software. This ecosystem can provide ODMs and OEMs Layer 1 – 4 software, transport and security stacks, RF technologies, test and measurement capabilities and ODM solutions.
A development platform based on the P2020-MSC8156 AMC bundled with partner software and RF solutions is available immediately for rapid software development. In addition, Freescale offers a wide portfolio of GaAs MMICs and LDMOS RF solutions for consumer and enterprise pico and femto cells.
QorIQ Qonverge products
The QorIQ Qonverge portfolio includes four distinct products optimized for small cell (femto and pico) and large cell (metro and macro) applications. It also supports remote radio head and emerging cloud-based radio access network (C-RAN) configurations.
The first products in Freescale’s QorIQ Qonverge multicore portfolio are built in 45-nm process technology and planned for availability in the second half of 2011. The products are the PSC9130/PSC9131 femto SoCs and PSC9132 picocell/enterprise femto SoC devices. Freescale plans to introduce portfolio members targeting larger cell (metro and macro) base stations built in 28-nm process technology later this year.
PSC9130/31 Femto SoC
8-16 users (WCDMA, LTE, CDMA2K) and simultaneous multimode
2×2 MIMO
1x e500 and 1x SC3850
MAPLE-B2F acceleration
PSC9132 Pico/Enterprise Femto SoC
32-64 users (WCDMA, LTE) and simultaneous multimode
2×4 MIMO
2x e500 and 2x SC3850
MAPLE-B2P acceleration
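The spec lines above can be collected into a single comparable structure. All figures come from the announcement text; `max_users` uses the upper end of each quoted range, and the class and field names are my own, not Freescale's.

```python
# The two announced 45-nm QorIQ Qonverge devices, side by side.
# Figures are taken from the spec lists above (upper end of user ranges).
from dataclasses import dataclass

@dataclass(frozen=True)
class QonvergeSoC:
    name: str
    target: str
    standards: str
    max_users: int
    mimo: str
    e500_cores: int    # Power Architecture control cores
    sc3850_cores: int  # StarCore DSP cores
    accelerator: str   # MAPLE baseband accelerator variant

PORTFOLIO = [
    QonvergeSoC("PSC9130/31", "femto", "WCDMA/LTE/CDMA2K",
                16, "2x2", 1, 1, "MAPLE-B2F"),
    QonvergeSoC("PSC9132", "pico/enterprise femto", "WCDMA/LTE",
                64, "2x4", 2, 2, "MAPLE-B2P"),
]

for soc in PORTFOLIO:
    print(f"{soc.name} ({soc.target}): up to {soc.max_users} users, "
          f"{soc.mimo} MIMO, {soc.e500_cores}x e500 + {soc.sc3850_cores}x SC3850")
```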
About Freescale Semiconductor
Freescale Semiconductor is a global leader in the design and manufacture of embedded semiconductors for the automotive, consumer, industrial and networking markets. The privately held company is based in Austin, Texas, and has design, research and development, manufacturing and sales operations around the world. www.freescale.com.
Supporting Partner Quotes Follow
Enea “Enea currently provides a breadth of leading software solutions to support Freescale’s extensive portfolio of networking IP,” said Marcus Hjortsberg, vice president of Marketing for Enea. “We look forward to playing a role in unleashing the innovative capabilities of Freescale’s new QorIQ Qonverge hybrid multicore portfolio.”
Green Hills “With a long history of optimized support for Freescale’s multicore and multiprocessor platforms, we are excited to see Freescale’s next-generation wireless base station solution,” said Dan Mender, vice president of Business Development, Green Hills Software. “QorIQ customers use our multicore development tools and scalable real-time operating systems, MULTI and INTEGRITY, to conquer today’s multicore challenges and we look forward to supporting them as they adopt the QorIQ Qonverge portfolio.”
Mentor Graphics “The integration of StarCore DSP technology with Power Architecture cores in the new Freescale QorIQ Qonverge portfolio is a major advancement for the wireless industry. We see great potential for this class of heterogeneous multi-core designs,” said Glenn Perry, general manager of the Mentor Graphics Embedded Software Division. “The Mentor Embedded Linux platform for Freescale devices combined with CodeSourcery software development tools will enable our mutual customers to develop advanced, innovative and scalable systems with increased performance and power efficiency.”
Aricent “We are thrilled to be partnering with Freescale to accelerate development of new best-in-class solutions in the wireless infrastructure market,” said C.P. Murali, executive vice president and general manager at Aricent. “Our comprehensive suite of software frameworks and product engineering services enable customers to rapidly introduce innovative solutions based on Qonverge technology.”
Continuous Computing “We are proud to be a member of Freescale’s technology partner program and for Freescale to be a member of the Continuous Computing Network,” said Todd Mersch, director of Product Line Management at Continuous Computing. “Together we offer customers a complete range of femto to macro base station solutions consisting of Trillium wireless software and the latest advances in the QorIQ Qonverge portfolio of processors.”
Critical Blue “Freescale’s QorIQ Qonverge platform is architecturally very innovative. Meeting next-generation network speed requirements will require software developers to make knowledgeable choices in application partitioning and task allocation to the different types of cores on these platforms,” said David Stewart, chief executive officer of CriticalBlue. “The development program we have ongoing with Freescale will ensure that our Prism tool has all the capabilities needed to support a smart methodology for software developers, enabling them to get the maximum benefit from targeting the QorIQ Qonverge platform.”
L&T Infotech “L&T Infotech is excited to collaborate and build world-class wireless solutions based on Freescale’s QorIQ Qonverge portfolio,” said Sudip Banerjee, chief executive officer for L&T Infotech. “Our end-to-end telecom proficiency spans the entire wireless domain, with proven expertise on LTE/WiMAX, multicore technologies, network security and optical transport networks, ultimately enabling accelerated time-to-market for our client’s products.”
Signalion “We are pleased to support Freescale’s QorIQ Qonverge portfolio with our world-class wireless test technologies to ensure high-performance equipment, service and end-user experiences,” said Tim Hentschel, managing director for Signalion GmbH. “Freescale is charting new territory with the QorIQ Qonverge hybrid portfolio that promises to transform the future of wireless infrastructure equipment.”
Tata-Elxsi “The introduction of the QorIQ Qonverge portfolio means OEMs now have a single-architecture, compatible family of products to address all their base station design needs,” said Shyam Ananthnarayan, head of the Communications Business Unit at Tata Elxsi. “As a key member of Freescale’s rich ecosystem, Tata Elxsi will offer market-leading LTE eNodeB software stacks optimized to ease customers’ development of best-of-breed solutions based on Qonverge technology.”
Wireless support and network functions converge in QorIQ Qonverge processors [By Tom Thompson, June 16, 2011]
Wireless communication seems ubiquitous these days–until you wander into a dead zone and lose the network connection to your laptop, tablet, or mobile phone. Telco carriers are working hard to eliminate such areas by installing more macrocell towers. Sometimes installing one of those big bruisers in an area isn’t possible, so the carriers fill in the coverage gaps by scaling down. Scaling down in this case means building smaller wireless installations, such as microcell (also known as metrocell), picocell, and femtocell base stations.
You don’t have to be a rocket scientist to realize that deploying such a diverse array of gear can be a nightmare in terms of hardware design, embedded software development, and support. Every base station has various wireless formats to manage, and the smaller base stations must also implement certain wired backhaul technologies such as Ethernet and E1/T1 so that they can connect to the carrier’s infrastructure. One way to alleviate this headache of multiple base station designs is to reduce the number of different types of hardware used. For this scheme to work, however, the signal processing capabilities of a DSP and the networking functions of an application processor must converge into one unified part.
Freescale happens to be well-positioned to provide such a converged solution. First, the company makes its StarCore DSPs, which are 32-bit multicore processors engineered for high data processing throughput and support for a variety of wireless protocols. Second, the company makes high-performance network processors, notably those that comprise its QorIQ Processing Platform. These are 32-bit processors based on a low-power, high-performance Power Architecture core that manages several high-speed communications interfaces. Variants of both the StarCore and Power Architecture families feature fewer cores or lack hardware accelerators, which enables them to hit specific price points or power consumption targets.
Freescale’s convergence strategy is simple in concept, yet it presented an engineering challenge. First, you take the core subsystems of these two processors and place them on a single chip. Next, you surround the cores with a bevy of enhanced communications interfaces. Finally, you knit all of these elements together with a high-speed switching fabric. The result is the QorIQ Qonverge processor, a system that is essentially a base station on a chip. Let’s delve deeper into the microarchitecture of the QorIQ Qonverge and see how it offers a comprehensive solution.
A Tale of Two Processors
The block diagram in Figure 1 depicts the major logic blocks that make up the QorIQ Qonverge PSC9131E, a part suitable for femtocell and picocell base station designs.
Figure 1. Block diagram of the QorIQ Qonverge PSC9131E processor.
The StarCore subsystem consists of an SC3850 DSP core that has six execution units (four data ALUs, and two address units) that operate in parallel to retire six instructions simultaneously per clock. The ALUs support integer and fractional arithmetic, including multiply-accumulate (MAC) and other sophisticated instructions. The core is therefore capable of reading, processing, and writing a continuous stream of data. The subsystem has its own internal L1/L2 caches, an MMU, an interrupt controller, and timers.
The Power Architecture subsystem consists of an e500 core, which is a superscalar processor with multiple execution units that can issue and retire two instructions per clock cycle. It has its own internal L1/L2 caches, an interrupt controller, and timers.
Each core has separate 32 KB instruction and data caches to reduce latency and boost throughput. The Harvard architecture implementation of these caches requires more transistors, but it helps to ensure that the cores receive a continuous stream of data and instructions. The unified L2 caches can be configured so that a portion of them acts as a low-latency L2 memory for time-critical data or variable storage.
Both subsystems would grind to a halt if they could not access memory or peripheral devices rapidly. To minimize this bottleneck, Freescale used a high-performance interconnect known as the Chip-Level Arbitration and Switching System (CLASS). This high-bandwidth, low-latency switching fabric is a fully pipelined device interconnect that provides direct access to the resources of the subsystems and on-chip peripherals.
The DMA engine, which can be programmed by either core, uses the CLASS fabric to manage data transfers. It has four bidirectional channels. Off-chip memory is accessed through a DDR memory controller. The controller supports DDR3/DDR3L devices, and can manage a 32-bit interface at a maximum 800 MHz data rate.
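As a rough back-of-the-envelope check on that memory interface (my own arithmetic, not a Freescale figure), the theoretical peak bandwidth works out as follows:

```python
# Peak DDR3 bandwidth for the PSC9131E's memory controller:
# a 32-bit (4-byte) interface at an 800 MHz data rate.
bus_width_bytes = 32 // 8          # 4 bytes per transfer
data_rate_hz = 800_000_000         # 800 MT/s DDR data rate

peak_bandwidth = bus_width_bytes * data_rate_hz  # bytes/second
print(peak_bandwidth / 1e9)        # -> 3.2, i.e. 3.2 GB/s theoretical peak
```

Real-world throughput would of course be lower once refresh cycles and bus turnaround are accounted for.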
Hardware Gives a Hand
As you can see, the QorIQ Qonverge processor is one busy piece of silicon. Among its many duties are processing various wireless formats and encrypting communications sessions. These wireless and encryption algorithms are complex and require substantial processing power. While they can be executed in software, the QorIQ Qonverge processor has dedicated execution units that can off-load the computational demands of these algorithms from the core subsystems.
The Multi Accelerator Platform Engine for Femto BaseStation Baseband Processing (MAPLE-B2F) unit provides hardware acceleration for baseband algorithms such as channel decoding/encoding, UMTS chip rate processing, and LTE uplink/downlink processing. It also accelerates the computation of Fourier transforms, matrix inversions, CRC algorithms, convolution and filtering operations, Turbo encoding/decoding, and Viterbi decoding. It is a second-generation design that builds upon an established predecessor used in certain StarCore DSPs.
For encryption duties there is the security engine, a cryptographic and assurance acceleration unit. It uses a job queue interface that can schedule multiple cryptographic tasks in parallel, and its multiple accelerators can be shared among different applications. In concert with the DMA engine, this module can use scatter/gather operations to collect data that is distributed throughout memory. The module has hardware accelerators for public key, message digest, ARC4, SNOW 3G f8 and f9, and Kasumi cryptographic operations. It also has accelerators that manage DES, AES, and CRC operations, and it supports a variety of cryptographic authentication schemes.
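To illustrate the scatter/gather idea in the abstract: a descriptor list of (offset, length) pairs lets a DMA-style engine assemble one contiguous buffer from fragments spread through memory. The descriptor format below is purely hypothetical, for illustration only, and does not reflect Freescale’s actual hardware interface:

```python
# Illustrative scatter/gather: fragments of a message are spread
# through a memory region; a descriptor list tells the engine where
# to find each piece and how long it is.
memory = bytearray(b"....HEAD........BODY........TAIL")

descriptors = [(4, 4), (16, 4), (28, 4)]  # hypothetical (offset, length) pairs

def gather(mem, descs):
    """Collect the described fragments into one contiguous buffer."""
    return b"".join(bytes(mem[off:off + ln]) for off, ln in descs)

print(gather(memory, descriptors))  # -> b'HEADBODYTAIL'
```

The hardware version does the same walk over descriptors, but without involving the cores, which is the point of the off-load.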
Note that acceleration capabilities are not limited exclusively to these particular modules. Other modules can accelerate a subset of their functions. For example, the Ethernet controller can off-load and accelerate certain TCP/IP stack operations such as IP header recognition and checksum, plus TCP/UDP checksum and verification.
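For reference, the checksum that the Ethernet controller can off-load is the standard RFC 1071 ones’-complement sum used by IP, TCP and UDP. A software rendering of the same algorithm looks roughly like this (a sketch for illustration, not Freescale code):

```python
def internet_checksum(data: bytes) -> int:
    """RFC 1071 ones'-complement checksum over 16-bit words."""
    if len(data) % 2:
        data += b"\x00"            # pad odd-length input with a zero byte
    total = 0
    for i in range(0, len(data), 2):
        total += (data[i] << 8) | data[i + 1]
        total = (total & 0xFFFF) + (total >> 16)  # fold the carry back in
    return ~total & 0xFFFF
```

A handy property for verification: appending the computed checksum to the data and summing again yields zero, which is exactly the check a receiver (or the off-load hardware) performs.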
Smart Controllers
The PSC9131E has several controllers that manage complex I/O operations concurrently. The Antenna Interface Controller (AIC), as its name implies, handles transactions between the processor and an external Radio Frequency (RF) subsystem. It supports CDMA, WCDMA-DD, LTE-FDD, LTE-TDD, and GSM (receive only) network modes. Data received from the transceiver is reformatted and stored by the AIC into system memory or in the MAPLE-B2F unit. Data to be transmitted is transferred by DMA to the AIC where it frames the data for the proper network format and sends it to the transceiver. The AIC can handle up to a maximum of four data lanes, depending upon the wireless format in use.
The Ethernet controller features two enhanced Gigabit Ethernet interfaces that can operate at speeds of 10 Mbps, 100 Mbps and 1 Gbps. These interfaces are IEEE 802.3, 802.3u, 802.3x, 802.3z, 802.3ac, and 802.3ab compliant. As mentioned previously, the controller can accelerate the identification and retrieval of standard and non-standard protocols present on the Ethernet connection.
The USB controller is USB revision 2.0 compliant and can function as both a host and a device controller. As a host, it supports low-, full-, and high-speed transfer rates. It contains its own DMA engine that reduces the interrupt load on the processor and minimizes the bus bandwidth necessary to service any USB transactions.
In summary, these several controllers provide sophisticated wireless, Ethernet, and USB services, yet without adding a considerable burden to the processor’s operation, especially when it is conducting network/wireless routing.
Ports Aplenty
The PSC9131E provides a number of ports that enable you to connect a large cast of supporting peripherals to the processor. These are:
- Enhanced SPI
- Two DUARTs
- Integrated Flash memory Controller (IFC)
- Two I2C controllers
- General-Purpose I/O (GPIO) interface with 32 bidirectional ports
- Universal Subscriber Identity Module (USIM) interface for communicating with a SIM card
- PWM optimized to generate sound
- Enhanced Secured Digital Host Controller (eSDHC) for interfacing to SD/SDIO/MMC cards
As a unit, QorIQ Qonverge processors represent a fusion of many existing, field-proven Freescale technologies. However, the resulting processor is far greater than the sum of its parts. Since the QorIQ Qonverge processor implements the level-1, -2, and -3 processing layers required for network/wireless communications on-chip, it needs only a handful of external components, such as a power supply, flash memory, DRAM, an Ethernet line driver and an RF transceiver, to implement a stand-alone femtocell or picocell base station. It is designed to replace both the DSP and the application processor at the heart of many such base station designs, as shown in Figure 2. By doing so, the QorIQ Qonverge part can reduce complexity, processing latencies, and the bill of materials for a base station design.
Figure 2. The QorIQ Qonverge-based picocell design (bottom) uses fewer parts than a design based on separate DSP and application processors (top).
A Processor for Many Uses
The QorIQ Qonverge processor isn’t limited to short-range base stations, however. It can also scale up: Multicore variants can support microcell and macrocell base station designs. This allows you to assemble a range of base station designs around one part.
Besides simplifying the base station design, the QorIQ Qonverge processor also allows you to reuse existing software. For example, existing StarCore MSC8156 DSP code and P2020 application code can be migrated to the QorIQ Qonverge processor, since the cores are nearly identical. The same CodeWarrior tool suite for StarCore DSPs and CodeWarrior tools for Power Architecture can be used to write and debug the software. Furthermore, code written for, say, a picocell base station can be reused in microcell and macrocell base station designs. Revising the code for a multicore processor can be tricky, but you can start the process with the knowledge that the application code was stress-tested on smaller base stations. Also, Freescale’s partner, CriticalBlue, has a multicore simulation tool to assist you in this process for Power Architecture-based software. All of this adds up to a comprehensive solution for embedded base station designs.
Turn the lightRadio on [March 8, 2011]
Development hopes to double network capacity while halving power consumption. By Roy Rubenstein.
Mobile operators face significant challenges, given the rapid growth in mobile broadband traffic. They are starting to roll out the latest mobile technology, Long Term Evolution (LTE), as yet another overlay alongside the existing wideband CDMA and GSM networks. Mobile sites are thus being crammed with antennas and basestation equipment.
“The cellular network is 30 years old,” said Tom Gruba, marketing director for wireless activities at Alcatel-Lucent. “You cannot just keep adding more basestations in the network to solve the [data] capacity problem; the business model doesn’t work.” Alcatel-Lucent’s solution is lightRadio, which moves the processing power to the antenna or into the network, like cloud computing. The system vendor points out that the architecture change is industry led; what Alcatel-Lucent claims is that the lightRadio portfolio of products is the first to support the new architecture.
Announced in the run up to Mobile World Congress 2011, lightRadio promises to double network capacity, while halving power consumption. The lightRadio products include a wideband active array antenna that integrates the amplifier and antenna elements, a radio SoC developed with Freescale, and a multimode radio controller platform being developed with HP. Integrating the amplifier alongside the antenna achieves better coupling of the signal to the antenna. Less power is wasted, such that a smaller amplifier can be used.
The wideband active array antenna is implemented as a 6cm cube. The wideband operation covers 400 to 4000MHz, allowing one cube to support 700MHz and 2600MHz bands. “These can be stacked, depending on how much power is needed, and you can have two or three columns to serve two or three frequencies and any technologies you want,” said Gruba.
Being an active design, the antenna boosts cell capacity through beam forming and multiple input, multiple output (MIMO) technology. Combining the amplifier-antenna with the radio chip forms a compact basestation that can be mounted on masts or within buildings. Such a combined baseband/remote radio head takes little space and avoids the need for air conditioned cooling associated with traditional basestations.
LightRadio will also enable a cloud computing style radio network architecture, where the basestation is separated from the antenna-amplifier. Traditionally, the radio amplifier was connected to the baseband via a backplane. The advent of the remote radio head led to the creation of the common public radio interface (CPRI) to connect the amplifier at the antenna with the baseband unit. With a cloud based radio network, basestations from 25 or 30 cell sites could be placed in a facility up to 40km away, with the CPRI signal carried over an optical link.
Alcatel-Lucent estimates the maximum lightRadio bit stream needed to be carried over the CPRI link is 10Gbit/s. Compression technology will reduce this by a factor of three, so operators can avoid installing a dedicated 10Gbit optical link. At the core of the baseband processing is the SoC developed with Freescale.
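A quick sanity check of that claim (my own arithmetic, not Alcatel-Lucent's):

```python
raw_cpri_gbps = 10.0          # estimated maximum lightRadio CPRI stream
compression_factor = 3        # claimed compression ratio

compressed_gbps = raw_cpri_gbps / compression_factor
print(round(compressed_gbps, 2))  # -> 3.33
```

At roughly 3.3 Gbit/s, the fronthaul stream fits comfortably within existing optical links rather than requiring a dedicated 10Gbit installation, which is the cost argument being made.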
“Dimensioning the various aspects of the SoC is critical,” said Preet Virk, Freescale’s director, networking segment. The SoC design uses Freescale’s recently announced QorIQ Qonverge technology that supports designs spanning femtocells to macro basestations. Two devices have been announced – for femtocells and picocells – that are implemented using a 45nm CMOS process. Alcatel-Lucent’s radio IC will be implemented in 28nm CMOS and will be available from 2012.
Freescale is not willing to detail the basestation SoC yet, but the scalable design uses cores and IP blocks that are shipping in Freescale products, such as the e500 Power Architecture core and the StarCore SC3850 DSP as well as baseband acceleration blocks.
“Scalability comes in many forms,” said Barry Stern, marketing manager for baseband DSP and SoC products in Freescale’s wireless access division, networking and multimedia group. “From a few users to hundreds of users; from 1.25 to 20MHz bandwidths and beyond; simultaneous multimode support; and enabling OEMs to use the same software across different basestation designs, saving on development costs.”
Freescale’s femtocell SoC supports 8 to 16 users and uses an e500 core and a DSP core. The picocell SoC supports 32 to 64 users and uses two e500s and two DSP cores. Freescale’s metro and macro cell SoCs will support hundreds of users, requiring multiple DSP and CPU cores. Other features will include several DDR3 memory controllers; baseband acceleration for turbo coding, fast Fourier transforms and MIMO; and interfaces for Ethernet, PCI Express and CPRI, according to Virk.
“The SoC in the cloud is going to give us the ability to do all sorts of new things,” said Tod Sizer, head of Alcatel-Lucent’s Bell Labs’ wireless research domain.
Intercell communication
Having baseband processors concentrated at one location enables intercell communication. One application is Coordinated Multipoint (CoMP), what Alcatel-Lucent calls networked MIMO, which will be a feature of the 3rd Generation Partnership Project’s (3GPP) Release 10 cellular standard.
Currently, only one cell serves a user, even when the user is near the cell edge and within range of adjacent cells. With CoMP, MIMO technology can be used such that different streams are transmitted between the basestations and the user, boosting throughput. And it is this technique, says Alcatel-Lucent, which will double overall capacity.
The cloud like architecture will also enable new uses that benefit energy consumption. “One we are going to see in the coming years is coordination on the basis of energy usage,” said Sizer, citing how, for example, all users could be moved to the 3G network, with the LTE basestations turned off to save power, based on time of day and subscriber requirements. “You have that capability of moving users if you have control of both technologies from a single cloud,” said Sizer.
Power consumption has become a key issue for operators, with the likes of France Telecom looking to reduce the energy consumption in its network by 15% by 2020. In turn, US operator Verizon stipulates that each new piece of equipment must be at least 20% more energy efficient than its predecessor if it is to be deployed. Alcatel-Lucent is developing a virtualised radio controller architecture as part of the portfolio, working with HP to consolidate three generations of radio controllers into one platform. In GSM, the basestation controller (BSC) connects to multiple cell sites, while a radio network controller (RNC) is used in 3G.
“If I make the BSC or RNC a software routine, the software becomes independent of the platform and I can put both functions in one box,” said Gruba. Alcatel-Lucent is basing the design on an ATCA version 2 based general purpose processor design, while HP is providing server and virtualisation expertise to the controller design. Alcatel-Lucent expects to be trialling the wideband active array antenna in the autumn before it becomes commercially available in 2012.
The remaining lightRadio elements will appear from 2012 onwards. Ken Rehbehn, principal analyst at the Yankee Group, says lightRadio is arguably the most important wireless equipment development made by Alcatel-Lucent since its 2006 merger. However, he points out that other vendors are pursuing comparable strategies that might challenge much of the lightRadio vision.
lightRadio: hideous cell towers to get smaller, lose the “hut” [Feb 2011]
Even when they’re disguised like fake trees or church steeples, cell towers are ugly. Most have a hut at the bottom, stuffed with baseband processing gear that does the hard work of creating and decoding, say, an LTE signal. These huts often contain signal amplifiers, big units that push power up the tower to the actual antennas—and half the signal is lost just moving through the tower’s wiring. At the top, rectangular antennas bristle from the tower. One set might be for 2G support, one for 3G, and another for 4G.
Alcatel-Lucent, one of the world’s biggest wireless gear makers, turned to its Bell Labs research division to rethink this aging architecture. First step: apply the “data center” model of centralization to baseband processing and consolidate all that rack-mounted hardware into a few locations per city, each connected to the towers it serves by fiber optic cable.
Right now, a cell tower fault might require a truck roll and a drive through traffic. When the tech gets to the tower site, it might turn out to be at the top of a hotel, and permission to access it must be obtained from the site manager. Put all the processing gear in a single remote location, however, and repairs to it get cheaper and faster.
Clustering the baseband units also makes it easier to do load balancing across a region. When commuters are driving into work, for instance, the baseband cluster can turn its combined energy to handling the signal load coming from towers along the highways and train lines. During the day, processing could handle heavy downtown traffic, while it shifts focus to the suburbs in the evening. Such load-balancing doesn’t produce any additional spectrum or data throughput, but it does mean that a carrier can operate fewer baseband processors, saving the carrier cash.
The third advantage to centralizing the baseband processors is that the interconnection fabric between them can operate at high speeds, fast enough to support a standard called CoMP, or Co-ordinated Multipoint. CoMP, which is currently moving through standardization, relies on the fact that, in many locations, a user’s wireless gadget is in range of multiple towers (the closer one comes to the edge of each cell, the more towers can typically see the device).
This is usually a waste, since multiple towers spend bandwidth contacting the gadget but can’t independently deliver different data. CoMP turns it into a bonus by dividing up requested download data and using all cells in the area to deliver a different slice of it at once—akin to the way BitTorrent operates. The phone then combines the data from all the towers in the proper order. This additive approach to using different towers means that a user’s total throughput can go up substantially, but it requires centralized baseband to function.
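The slicing-and-reassembly idea can be illustrated in miniature. Note this is a conceptual sketch only: real CoMP coordinates radio resource blocks at the physical layer, not application-layer bytes, and the function names here are invented for the example:

```python
# Toy illustration of the CoMP idea: split a download into slices,
# deliver each slice via a different cell, then reassemble in order
# at the handset.
def split_across_cells(payload: bytes, n_cells: int):
    """Round-robin the payload into one slice per cell."""
    return [payload[i::n_cells] for i in range(n_cells)]

def reassemble(slices):
    """Interleave the per-cell slices back into the original payload."""
    out = bytearray(sum(len(s) for s in slices))
    for cell, s in enumerate(slices):
        out[cell::len(slices)] = s
    return bytes(out)

payload = b"coordinated multipoint"
parts = split_across_cells(payload, 3)   # one slice per serving cell
assert reassemble(parts) == payload
```

The BitTorrent analogy in the article holds: each "peer" (cell) delivers a different piece, and only the receiver needs to restore the order.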
Finally, the new lightRadio baseband gear can do software-defined protocols. Upgrading to LTE? Just upgrade the software on the baseband processor. (Traditional rack-mounted baseband processors required dedicated units for each protocol.) A new baseband chip from Freescale makes it possible, but it gets even cooler when used in conjunction with the new wideband antennas.
LightRadio uses a new antenna that, in Alcatel-Lucent’s words, collapses three radios into one. The radios are tiny cubes, roughly 2.5 inches on a side, and each can operate between 1.8GHz and 2.6GHz. They use tiny amps that can be located atop the tower, built into the antenna enclosure, which keeps the amp size down and dramatically cuts down on the power loss.
These radio cubes are stacked in groups of 8 to 10 in order to make an antenna element, and when one cube in the array goes down, the others remain unaffected. (In a traditional system, the whole antenna unit would fail.) The amps cover enough different frequencies that, in many cases, simply changing the software configuration on the baseband unit can control whether each antenna offers a 2G, 3G, or 4G signal.
The antennas also do “beam forming”—fine-grained directional control over the radio signal—in both the horizontal and vertical dimension to better connect with local wireless devices. Alcatel-Lucent claims capacity improvements of 30 percent through the use of vertical beam-forming alone.
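The principle behind beam forming is easy to sketch: steering a linear array toward a given angle amounts to applying a position-dependent phase shift to each element so their emissions add constructively in that direction. The numbers below are illustrative, not lightRadio parameters:

```python
import math

def steering_phases(n_elements, spacing_m, wavelength_m, theta_deg):
    """Per-element phase shifts (radians) that steer a uniform linear
    array's main beam theta_deg off boresight."""
    theta = math.radians(theta_deg)
    return [2 * math.pi * i * spacing_m * math.sin(theta) / wavelength_m
            for i in range(n_elements)]

# 4 elements at half-wavelength spacing, steered 30 degrees off boresight:
phases = steering_phases(4, 0.075, 0.15, 30.0)
# element phases come out to 0, pi/2, pi, 3*pi/2 radians
```

Doing this electronically, per element, is what lets an active antenna re-aim its beam in both azimuth and elevation without any mechanical movement.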
The end result of the system: lightRadio cell towers don’t need huts, they don’t need air conditioners and heaters, big amps, fans, or even local processing gear. Baseband processing moves closer to the data center model and gets cool new capabilities like CoMP and load-balancing. The system’s cost savings come from power (Alcatel-Lucent claims a 50 percent reduction), along with lower construction and site rental fees. The total macro capacity of the system should double while cutting operator costs dramatically.
Though it will take months for any carrier to roll out this or similar gear, advances like lightRadio are crucial as wireless usage continues to soar and smartphones break out of the enterprise and the technorati and into the mainstream. And by making cell infrastructure smaller, cheaper, and less power-hungry, this sort of gear brings wireless networking into reach of more people, especially in rural areas and developing countries.
Alcatel-Lucent’s lightRadio™ portfolio wins NGN magazine leadership award for transforming mobile broadband networks [May 19, 2011]
Alcatel-Lucent (Euronext Paris and NYSE: ALU) today announced that its lightRadio portfolio was recognized as the outstanding new achievement in broadband Internet communications by the leading industry magazine NGN, as part of its NGN Leadership Awards contest. The awards program recognizes outstanding products, services and technologies relating to next generation networks.
“This award underlines the sweeping impact our lightRadio portfolio is having on the wireless communications industry,” said Wim Sweldens, President of Alcatel-Lucent’s Wireless activities. “lightRadio isn’t just redefining the shape of the wireless base station, it also offers a compelling vision for what wireless networks will look like in the future.”
“This award for Alcatel-Lucent’s LightRadio is a great testament to their innovation. They have brought to market a solution designed to solve the most critical issues facing the wireless industry, starting with the quasi impossibility to add new sites to increase capacity and improve coverage,” said Stéphane Téral, Principal Analyst, Mobile and FMC Infrastructure, Infonetics.
lightRadio™ is a new product offering from Alcatel-Lucent that will dramatically reduce operating costs, technical complexity and power consumption in mobile broadband networks. Designed to meet the long-term needs of mobile operators seeking to ensure their networks can handle increasing traffic loads, lightRadio radically shrinks and simplifies today’s base stations.
The lightRadio portfolio is designed to increase network capacity while simultaneously reducing the cost of delivery, on a per bit basis. The overarching goal is to give operators more options and a flexible path forward for the next decade. By increasing the capacity at a reduced cost the operators can pursue a whole new market segment, the mass market. In addition, being able to use the lightRadio cube technology in various forms means Small Cells can leverage the technology and rural villages can get wireless coverage at lower costs helping to cross the digital divide.
lightRadio promises greener, simpler, lighter networks, and the benefits are substantial, including:
- 50% reduction in total cost-per-bit as compared to 3G when adding a comparable amount of capacity
- 50% reduction in energy consumption when compared to conventional ground based solutions
- Small and easily deployable – can be deployed anywhere there is a power source and broadband connection, and faces fewer zoning restrictions
- Nearly invisible – the WB-AAA is two products in one. It’s adding another radio in the same size form factor with no additional lease cost or further pollution of the urban skyline.
The Alcatel-Lucent “lightRadio” product family is composed of the Wideband Active Array Antenna, the Multiband Remote Radio Head, the “lightRadio” Baseband Processing, the “lightRadio” Control, and the 5620 SAM common management. The Wideband Active Array Antenna will be trialed later this year and have broad product availability in 2012. For more information on the lightRadio portfolio please click here.
Bell Labs lightRadio™ Breakthroughs [Feb 7, 2011]
The world of mobile communications moves fast. With new mobile devices, new applications and ever-growing and changing consumer demands, the wireless networks in use today have to evolve. Rather than take an incremental approach to meet these challenges, Bell Labs took a leap and developed a radically new approach to wireless technology. In order to do this, Tod Sizer, head of Bell Labs Wireless Research, challenged his team to think not just “outside the box,” but to think “inside the cube.” In six short months, the team developed a cube-shaped antenna that would fit in the palm of a hand – and was ready to test it with customers.
Tod Sizer, Head of Wireless Research for Alcatel-Lucent Bell Labs, talks about developing the lightRadio antenna module. lightRadio represents a new architecture where the base station, typically located at the base of each cell site tower, is broken into its component elements and then distributed into both the antenna and throughout a cloud-like network. Additionally, today’s clutter of antennas serving 2G, 3G, and LTE systems are combined and shrunk into a single powerful, Bell Labs-pioneered multi frequency, multi standard Wideband Active Array Antenna that can be mounted on poles, sides of buildings or anywhere else there is power and a broadband connection. “There are many different types and sizes of base stations, from very small to very large, depending on where they are located, such as in an urban or rural area,” explained Sizer. “I realized that we needed to design a new and flexible type of antenna array for different environments – including one designed to the smallest possible size – ‘invisible antennas’ – in order to be flexible enough to meet the growing needs of all of our wireless service provider customers.”
A radio antenna element is a component of an antenna system that transmits signals from the wireless base station to a wireless end-user using a mobile phone, smart device or laptop. By reducing the size of the element itself, an antenna array can be scaled to fit any wireless need simply by adding more of these elements to the array.
Bell Labs wireless researchers weren’t daunted by the challenge of building something that was roughly ten percent of its current size. Several wireless research teams in Stuttgart and Ireland focused on different aspects of the problem, combining their unique areas of expertise to quickly resolve a myriad of technical challenges to reduce the antenna element’s size, improve energy efficiency and lower manufacturing expenses. The clever architecture of this new antenna is but one of the innovations critical to realizing Alcatel-Lucent’s unique lightRadio portfolio.
“We believe this unique antenna – as part of the lightRadio solution – will have a significant impact in the wireless space,” concluded Sizer.
Wim Sweldens presents lightRadio, a breakthrough for the mobile industry [Feb 7, 2011]
Wim Sweldens, President, Alcatel-Lucent wireless activities, talks about lightRadio™, a new system that signals the end of the mobile industry’s reliance on masts and base stations around the world. lightRadio represents a new architecture where the base station, typically located at the base of each cell site tower, is broken into its component elements and then distributed into both the antenna and throughout a cloud-like network. Additionally, today’s clutter of antennas serving 2G, 3G, and LTE systems are combined and shrunk into a single powerful, Bell Labs-pioneered multi frequency, multi standard Wideband Active Array Antenna that can be mounted on poles, sides of buildings or anywhere else there is power and a broadband connection. More info: http://www.alcatel-lucent.com/lightradio
Alcatel-Lucent. Cube light Radio [Feb 18, 2011]
Highlights of lightRadio press conference [London, Feb. 7th, 2011]
Presentation of the lightRadio system which will dramatically reduce technical complexity and contain power consumption and other operating costs in the face of sharp traffic growth. This is accomplished by taking today’s base stations and massive cell site towers, typically the most expensive, power hungry, and difficult to maintain elements in the network, and radically shrinking and simplifying them. Conference guests: Stephen Carter, Wim Sweldens, Tod Sizer and Javier Garcia Gomez (Alcatel-Lucent), Lisa Su (Freescale) and Joe Weinman (HP).
AMD 2012-13: a new Windows 8 strategy expanded with ultra low-power APUs for the tablets and fanless clients
February 3, 2012 2:31 pm / 3 Comments on AMD 2012-13: a new Windows 8 strategy expanded with ultra low-power APUs for the tablets and fanless clients
AMD Strategy Transformation Brings Agile Delivery of Industry-Leading IP to the Market [AMD press release, Feb 2, 2012]
At its annual Financial Analyst Day, AMD (NYSE: AMD) detailed a new “ambidextrous” strategy that builds on the company’s long history of x86 and graphics innovation while embracing other technologies and intellectual property to deliver differentiated products.
AMD is adopting an SoC-centric roadmap designed to speed time-to-market, drive sustained execution, and enable the development of more tailored customer solutions. SoC design methodology is advantageous because it is a modular approach to processor design, leveraging best practice tools and microprocessor design flows with the ability to easily re-use IP and design blocks across a range of products.
“AMD’s strategy capitalizes on the convergence of technologies and devices that will define the next era of the industry,” said Rory Read, president and CEO, AMD. “The trends around consumerization, the Cloud and convergence will only grow stronger in the coming years. AMD has a unique opportunity to take advantage of this key industry inflection point. We remain focused on continuing the work we began last year to re-position AMD. Our new strategy will help AMD embrace the shifts occurring in the industry, marrying market needs with innovative technologies and become a consistent growth engine.”
Roadmap Updates Focus on Customer Needs
Additionally, AMD today announced updates to its product roadmaps for AMD Central Processing Unit (CPU) and Accelerated Processing Unit (APU) products it plans to introduce in 2012 and 2013. The roadmap modifications address key customer priorities across form factors including ultrathin notebooks, tablets, all-in-ones, desktops and servers with a clear focus on low power, emerging markets and the Cloud.
AMD’s updated product roadmap features second generation mainstream (“Trinity”) and low-power (“Brazos 2.0”) APUs for notebooks and desktops; “Hondo,” an APU specifically designed for tablets; new CPU cores in 2012 and 2013 with “Piledriver” and its successor “Steamroller,” as well as “Jaguar,” which is the successor to AMD’s popular “Bobcat” core. In 2012, AMD plans to introduce four new AMD Opteron™ processors. For a more in-depth look at AMD’s updated product roadmap, please visit http://blogs.amd.com.
Next-generation Architecture Standardizes and Facilitates Software Development
AMD also provided further details on its Heterogeneous System Architecture (HSA), which enables software developers to easily program APUs by combining scalar processing on the CPU with parallel processing on the Graphics Processing Unit (GPU), all while providing high bandwidth access to memory at low power. AMD is proactively working to make HSA an open industry standard for the developer community. The company plans to hold its 2nd annual AMD Fusion Developer Summit in June 2012.
New Company Structure Strengthens Execution
In conjunction with announcing its restructuring plan in November 2011, AMD has strengthened its leadership team with the additions of Mark Papermaster as senior vice president and chief technology officer, Rajan Naik as senior vice president and chief strategy officer, and Lisa Su as senior vice president and general manager, Global Business Units. These executives will help ensure that sustainable, dependable execution becomes a hallmark of AMD.
Supporting Resources
- Visit the AMD Financial Analyst Day website for webcast replay, presentations, updated roadmap, and more
- Visit AMD Blogs for more details on AMD’s product roadmap changes
- Follow AMD on Twitter at @AMD_Unprocessed
- Like AMD on Facebook
AMD started talking about ‘Trinity’ and ‘Hondo’ last summer. See in Acer repositioning for the post Wintel era starting with AMD Fusion APUs [June 17, 2011]
What AMD could definitely be proud of for 2011 is A “Brazos” Story: The Little Chip That Could (And Then Just Kept On Going) [AMD Fusion blog, Feb 1, 2012]:
In late 2010, AMD shipped its first-ever Accelerated Processing Units (APUs), internally codenamed “Brazos,” which combined the tremendous processing power of graphics and x86 on a single chip.
We had high expectations for the low-voltage “Brazos” APU: great computing, HD, long battery life and DirectX 11 capable graphics, all on a single chip. Yet still we were blown away by the initial industry reception. It was only a year ago we left CES with seven highly sought-after innovation and technology awards for the little product we ultimately named the C- and E-Series APUs, including:
- 2010 PC Magazine Technical Excellence Award
- CES 2011 Design & Engineering Innovations Award
- CHIP China 2010 Highlight Awards
- Computer World China Innovation Award
- Notebooks.com Best Innovation CES 2011
- Popular Mechanics Editors’ Choice Award
- Shopping Guide China Most Advanced Digital Product Award
After CES we should have re-nicknamed “Brazos” the “Little Chip That Could.” And all throughout 2011, “Brazos” kept on chugging. We added the “Best in Show” Award at Embedded Systems Conference and the “2011 Best Choice of Computex TAIPEI Award” to the list of accolades. In the second quarter we sold more than five million C- and E-Series APUs. What a tremendous start to a new way of processing for AMD and the industry.
But “Brazos” kept on impressing, showing up in a variety of form factors – notebooks, netbooks, small desktops and all-in-ones – from top global OEM partners.
So it was no surprise or mistake that we ended 2011 with more than 30 million APUs shipped. It all started with little “Brazos,” which has now earned its place in history as AMD’s fastest ramping platform ever.
John Taylor, Director of Worldwide Product Marketing at AMD
CES 2012 Consumer Showcase Tour [amd, Jan 11, 2012]
AMD Codename Decoder – November 9, 2010 [AMD Business blog]
APU
An APU is an accelerated processing unit, a new generation of processors that combine either low-power or high-performance x86 CPU cores with the latest GPU technology (such as DirectX® 11) on a single die.
Planned for introduction: Q1 2011
…
“Bobcat”
Market: Multiple devices, including notebooks, ultrathins, HD netbooks and small form factor desktops.
What is it? A sub-one watt capable x86 CPU core that first comes to market in the “Ontario” and “Zacate” Accelerated Processing Units (APU) for mainstream, ultrathin, value, and netbook form factors as well as small form factor desktop solutions. “Bobcat” is designed to be an extremely small, highly flexible, out-of-order execution x86 core that easily can be scaled up and combined with other IP in SoC configurations.
Planned for introduction: Q1 2011
“Brazos”
Markets: Value Mainstream Notebooks, HD Netbooks and Small Form Factor Desktops
What is it? “Brazos” is AMD’s 2011 low-power platform, available with two APUs. “Zacate” – currently planned to be marketed as the E Series – is an 18-watt TDP APU for ultrathin, mainstream and value notebooks as well as desktops and all-in-ones. “Ontario” – currently planned to be marketed as the C Series – is a 9-watt APU for netbooks and small form factor desktops and devices. Both “Brazos” platform APUs include a DirectX® 11-capable GPU.
Planned for introduction: Q1 2011
“Bulldozer”
Market: Server and Client
What is it? A multi-threaded high-performance x86 CPU core contained in the “Zambezi” processor for client PCs and “Interlagos” and “Valencia” processors for servers. Included in the “Scorpius” enthusiast desktop PC platform and “Maranello,” “Adelaide,” and “San Marino” server platforms, “Bulldozer” is designed to be a completely new, high performance architecture that employs a new approach to multithreaded compute performance for achieving advanced efficiency and throughput. “Bulldozer” is also designed to give AMD an exceptional CPU option for combining with GPUs in highly scalable, single-chip APU configurations, beginning in 2012 APU designs.
Planned for introduction: Client (1H 2011); Server (2H 2011)
…
“Llano”
Market: Notebooks and Desktops
What is it? Part of the “Sabine” platform, “Llano” is a 32nm APU including up to four x86 cores and a DirectX® 11-capable GPU, primarily intended for performance and mainstream notebooks and mainstream desktops. “Llano” is engineered to deliver impressive visual computing experiences, outstanding performance with low power and long battery life.
Planned for introduction: Mid-2011
…
“Ontario”
Market: Primarily ultrathin notebooks and HD netbooks
What is it? A 9W APU featuring dual or single “Bobcat” x86 cores currently planned to be marketed as the C Series, and primarily intended to serve the low power and highly portable PC markets for netbooks and small form factor desktops and devices.
Planned for introduction: Q1 2011
…
“Zacate”
Market: Notebook/Desktop
What is it? “Zacate” is AMD’s 18W APU designed for the mainstream notebook and desktop market. Zacate will feature low-power “Bobcat” CPU cores and support DirectX 11 technology.
Planned for introduction: Q1 2011
…
More information about 2011 AMD APU past on this blog:
– Acer repositioning for the post Wintel era starting with AMD Fusion APUs [June 17, 2011]
– Supply chain battles for much improved levels of price/performance competitiveness [Aug 16, 2011]
– Acer & Asus: Compensating lower PC sales by tablet PC push [March 29 – Aug 2, 2011]
– CES 2011 presence with Microsoft moving to SoC & screen level slot management that is not understood by analysts/observers at all [Jan 7, 2011]
– Changing purchasing attitudes for consumer computing are leading to a new ICT paradigm [Jan 5, 2011]
AMD 2012 APU, code name “Trinity” [amd, Jan 11, 2012]
AMD started talking about ‘Trinity’ last summer. See in Acer repositioning for the post Wintel era starting with AMD Fusion APUs [June 17, 2011]
Advanced Micro Devices’ CEO Discusses Q4 2011 Results – Earnings Call Transcript [Seeking Alpha, Jan 24, 2012]
We are seeing particularly strong customer interest in our expanded low-power APUs for 2012. The low-power versions of our next-generation chip, the Trinity APU, deliver mainstream performance while using half the power of our traditional notebook processor. This processor fits into an ultrathin notebook design, as thin as 17 millimeters, providing industry-leading visual performance and battery life at very attractive price points. Trinity remains on track to launch at midyear.
…
We achieved record quarter client revenue driven by an increase in supply of Llano APUs. And in Q4 of 2011, APUs accounted for nearly 100% of mobile microprocessors shipped and more than 60% of the total client microprocessors shipped. Microprocessor ASP increased sequentially due to an increase in mobile microprocessor ASP and an increase in server units shipped.
There is no doubt that the customer acceptance of our APU architecture is quite strong. We’ve now shipped over 30 million of these APUs to date. And we’re seeing a strong uptake in terms of that architecture, what it means to the customer. They are looking for a better experience, and I think that’s a key reason why we’ve seen the momentum in our business and the ability to deliver on that. Our focus on execution around the APUs and around Llano is definitely paying off. And I think as we move forward, we should be able to continue to build on that momentum.
…
We’ve actually increased our Llano 32-nanometer product delivery by 80% from the third quarter, and now Llano makes up almost 60% of the mobile microprocessing revenue. … We’re going to continue to build on the strong relationships that we’ve been developing with GLOBALFOUNDRIES as we move forward.
…
The movement to thin and light is nothing new. Customers want mobility. And the idea of ultrathin is something that we’re very focused on. And if you think about it with our APU strategy that I mentioned, with the next-generation product, Trinity APU, we already are well ahead of the pace last year when we set a record-setting year for design wins with the Trinity product in 2012. With that product, we can deliver ultrathin in the range of 17 millimeters. And what’s really important and I think we have to all focus on is ultrathin and mobility, the ability for computing to reach customers across the planet. … And I’ll add that the improvements that we’ve made in Trinity in both our CPU and the GPU are really delivering outstanding results in performance per watt. So as well for the ultrathins being able to hit the 17-millimeter low-profile, we’re also getting a doubling of the performance per watt. So it’s an exciting application of our APU technology.
…
… as you think of the industry trends around consumerization, cloud and convergence, there’s no doubt, as we’ve seen these kinds of inflection points in the industry, there’s always a significant downward pressure in terms of the price points. So if you’re dragging huge asset base along with you and there comes pressure into the market around those price points, that could put pressure into their [Intel’s] — into a business model. … We think the emerging market and the entry — and the high-growth markets around entry and mainstream will be the hottest segment, and I think that’s playing to our hand. We’re going to emphasize this strategy. We want to embrace this inflection point that’s emerging. We want to accelerate it, because shift happens when there’s these inflecting points.
Of course, we see the investment of our competitor, but the fabless ecosystem is not sitting still. And if you look at the investments being made at TSMC, and at a GLOBALFOUNDRIES-and-alliances level, then the numbers are very comparable. GLOBALFOUNDRIES and their partnership models invest about $9 billion this year. TSMC spends around $6 billion, if I recall the number correctly. So in terms of scale and absolute numbers, this is very comparable to what Intel is putting on the table.
… I feel pretty good about where we are in terms of the transition around 32 nm. … And I want to emphasize, we’ve made real progress, but we’re not finished with that. And we need to continue to work every day with those tiger teams we’ve put in place. We’re tracking the test vehicles through the lines to make sure that we’re getting that consistent improvement, because that will reduce our consumption of wafers and give us far more flexibility in our supply chain. So while we have improved by 80% from the third quarter, we’re not all the way there yet … there’s more yield improvements possible on that 32-nanometer line. … And those same techniques and practices that the teams — the tiger teams applied on 32-nanometer, that momentum continues in the 28-nanometer. And so that poises us well going into the coming 2012.
… I think it’s fair to say from the improvements we have seen and the — and our foundry partners that we are not going to be supply-constrained in the first quarter. … I think the progress we have seen on Trinity has impressed us. And of course, all the learnings that have been done on 32-nanometer with the Llano product will be transferred to Trinity. So the start-off pace with Trinity is going to be significantly better from a yield perspective compared to where we were at Llano launch. So that makes us quite optimistic looking forward.
…
Here are also a couple of illustrations highlighting that 2011 APU success, together with the details of the new APU strategy additions, from Lisa Su’s (Senior Vice President and General Manager, Global Business Units) presentation for the 2012 Financial Analyst Day held on February 2, 2012 (see her full presentation in PDF):
APUs BRING LEADERSHIP GRAPHICS/COMPUTE IP TO MAINSTREAM [#10]
2011: AMD first to introduce heterogeneous computing to mainstream applications
“Llano” APU offers nearly 3X the performance in the same power envelope over conventional CPUs (2)
Fully leverages the growing ecosystem of GPU-accelerated apps
Source: AMD Performance labs
(1) Testing performed by AMD Performance Labs. Calculated compute performance or Theoretical Maximum GFLOPS score for the 2013 Kaveri (4C, 8CU) 100w APU uses the standard formula of (CPU Cores x freq x 8 FLOPS) + (GPU Cores x freq x 2 FLOPS). The calculated GFLOPS for the 2013 Kaveri (4C, 8CU) 100w APU was 1050. GFLOPS scores for the 2011 A-Series “Llano” and the 2013 [2012] A-Series “Trinity” were 580 and 819, respectively. Scores rounded to the nearest whole number.
(2) Testing performed by AMD Performance Labs. Calculated compute performance or Theoretical Maximum GFLOPS score (use standard formula of CPU Cores x freq x 8 FLOPS) for conventional CPU alone in 2011 was 210 GFLOPs while the calculated GFLOPs for the 1st Gen APU using standard formula (CPU Cores x freq x 8 FLOPS) + (GPU Cores x freq x 2 FLOPS) was 580 or 2.8 times greater compute performance.
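The footnote formula is simple enough to reproduce. Here is a minimal Python sketch using only the totals AMD publishes above; the helper's example arguments are hypothetical, since the text does not restate per-chip core counts and clocks:

```python
# AMD's theoretical peak compute formula from the footnotes:
#   GFLOPS = (CPU cores x CPU GHz x 8 FLOPS) + (GPU cores x GPU GHz x 2 FLOPS)
def theoretical_gflops(cpu_cores, cpu_ghz, gpu_cores, gpu_ghz):
    return cpu_cores * cpu_ghz * 8 + gpu_cores * gpu_ghz * 2

# Published totals from footnote (2):
cpu_only_2011 = 210  # conventional 2011 CPU alone, GFLOPS
llano_apu = 580      # first-generation "Llano" APU, GFLOPS

# 580 / 210 is roughly 2.8x, matching the "nearly 3X" claim
ratio = round(llano_apu / cpu_only_2011, 1)
print(ratio)  # 2.8
```

The same formula yields the 819 GFLOPS quoted for “Trinity” and 1050 GFLOPS for “Kaveri” once the respective core counts and frequencies are plugged in.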
Related new codenames (from the AMD provided At-a-Glance Codename Decoder [Feb 2, 2012]):
“Trinity” APU (Traditional Notebooks, Ultrathin Notebooks and Desktops)
- “Trinity” is AMD’s second generation APU and improves the power and performance of AMD’s A-Series APU lineup for mainstream and high-performance notebooks and desktops. “Trinity” will feature next-generation “Piledriver” CPU cores and new, DirectX® 11-capable, second generation AMD Radeon™ HD 7000 series graphics.
- New for 2012, AMD will offer a BGA or pin-less format, low power “Trinity” APU specifically designed for ultrathin notebooks.
- Planned for introduction: Mid-2012
“Piledriver” Core Micro Architecture
- “Piledriver” is the next evolution of AMD’s revolutionary “Bulldozer” core architecture.
- The “Trinity” line-up of APUs will be the first introduction of “Piledriver.”
“Kaveri” APU (Notebooks and Desktops)
- “Kaveri” is AMD’s third generation APU for mainstream desktop and notebooks.
- These APUs will include “Steamroller” cores, and new HSA-enabling features for easier programming of accelerated processing capabilities.
- Planned for introduction: 2013
“Steamroller” Core Micro Architecture
- “Steamroller” is the evolution of AMD’s “Piledriver” core architecture.
AMD OPTERON™ FUTURE TECHNOLOGY [#26]
Additional new codename (from the AMD provided At-a-Glance Codename Decoder):
“Excavator” Core Micro Architecture
- “Excavator” is the evolution of AMD’s “Steamroller” core architecture.
APU ADOPTION: RECORD DESIGN WINS, STRONG END-USER DEMAND [#11]
Shipped > 30m APUs to date
11 of the world’s top 12 OEMs shipping AMD APU-based platforms
“Brazos” APUs shipped more units in its first year than any previous mobile platform in AMD history
“Llano” APUs ramped to represent nearly 60% of mobile processor revenue by Q4 2011
Additional new codenames (from the AMD provided At-a-Glance Codename Decoder):
“Southern Islands” Discrete Graphics
- Internal codename for the entire family of desktop graphics ASICs based on Graphics Core Next architecture and utilizing 28nm process technology.
- “Southern Islands” products include “Tahiti” (AMD Radeon™ HD 7900 series), “Pitcairn,” “Cape Verde” and “New Zealand.”
“Brazos 2.0” APU (Essential Desktop and Notebook, Netbook, All-In-One and Small Desktop)
- The “Brazos 2.0” family of APUs will follow “Brazos”, AMD’s fastest ramping platform ever.
- In addition to increased CPU and GPU frequencies, “Brazos 2.0” will offer additional features and functionality as compared to “Brazos”.
- Planned for introduction: H1 2012
“Hondo” APU (Tablet)
- “Hondo” is AMD’s sub-5W APU designed for tablets. “Hondo” will feature low-power “Bobcat” CPU cores and support DirectX® 11 technology in a BGA or pin-less format.
- Planned for introduction: H2 2012
AMD started talking about ‘Hondo’ (as well as ‘Trinity’) last summer. See in Acer repositioning for the post Wintel era starting with AMD Fusion APUs [June 17, 2011]

(3) Projections and testing developed by AMD Performance Labs. The score for the 2012 AMD Mainstream Notebook Platform “Comal” on the “Pumori” reference design for the PC Mark Vantage Productivity benchmark is projected to increase by up to 25% over actual scores from the 2011 AMD Mainstream Notebook Platform “Sabine”. Projections were based on AMD A8/A6/A4 35w APUs for both platforms.
(4) Projections and testing developed by AMD Performance Labs. The score for the 2012 AMD Mainstream Notebook Platform “Comal” on the “Pumori” reference design for the 3D Mark Vantage Performance benchmark is projected to increase by up to 50% over actual scores from the 2011 AMD Mainstream Notebook Platform “Sabine”. Projections were based on AMD A8/A6/A4 35w APUs for both platforms.
(5) Testing performed by AMD Performance Labs. Battery life calculations using the “Pumori” reference design are based on average power draw across multiple benchmarks and usage scenarios. For Windows Idle, calculations indicate 732 minutes (12:12 hours) as a resting metric; 421 minutes (7:01 hours) of DVD playback of a Hollywood movie, 236 minutes (3:56 hours) of Blu-ray playback of a Hollywood movie, and 205 minutes (3:25 hours) using 3D Mark ’06 as an active metric.
Projections for the 2012 AMD Mainstream Platform Codename “Comal” assume a configuration of “Pumori” reference board, Trinity A8 35W 4C – highest performance GPU, AMD A70M FCH, 2 x 2G DDR3 1600, 1366 x 768 eDP Panel / LED Backlight, HDD (SATA) – 250GB 5400rpm, 62Whr Battery Pack and Windows 7 Home Premium.
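Footnote (5) gives each battery-life figure both in minutes and in hours:minutes; the conversion can be cross-checked with a small helper (the figures are AMD's, the helper itself is just illustrative):

```python
def minutes_to_hmm(total_minutes):
    """Split a minute count into an hours:minutes string."""
    hours, minutes = divmod(total_minutes, 60)
    return f"{hours}:{minutes:02d}"

# Battery-life scenarios from footnote (5), in minutes
scenarios = {
    "Windows Idle": 732,      # stated as 12:12
    "DVD playback": 421,      # stated as 7:01
    "Blu-ray playback": 236,  # stated as 3:56
    "3D Mark '06": 205,       # stated as 3:25
}
for name, mins in scenarios.items():
    print(f"{name}: {minutes_to_hmm(mins)} hours")
```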
Additional new codenames (from the AMD provided At-a-Glance Codename Decoder):
“Sea Islands” Graphics Architecture
- New GPU Architecture and HSA Features
- Planned for introduction: 2013
“Kabini” APU (Essential Desktop and Notebook, Netbook, All-In-One and Small Desktop)
- The “Kabini” APU is AMD’s second generation low-power APU and follow-on to “Brazos 2.0.”
- In addition to new “Jaguar” cores, these APUs will be enhanced with new Heterogeneous Systems Architecture (HSA), enabling features for easier programming of accelerated processing capabilities.
- Planned for introduction: 2013
“Temash” APU (Tablet and Fanless Client)
- The “Temash” APU is AMD’s second generation tablet APU and follow-on to “Hondo.”
- In addition to new “Jaguar” cores, these APUs will be enhanced with new Heterogeneous Systems Architecture-enabling features for easier programming of accelerated processing capabilities.
- Planned for introduction: 2013
“Jaguar” Core Micro Architecture
- “Jaguar” is the evolution of AMD’s “Bobcat” core architecture for low-power APUs.
MOBILE MARKET PROJECTIONS [#29] AMD Direction:
Focus on true productivity and user experience in ultra-low power devices
Leadership graphics, web applications and video processing leveraging APUs
Agile, flexible SoC designs
Ambidextrous solutions across ISAs and ecosystems
Fanless, sealed designs
These APU related strategic moves have been summarized by the same John Taylor as Strengthening our Client Roadmap [AMD Fusion blog, Feb 2, 2012]:
Roadmaps signify our plans to customers and business partners, outlining the new products and technologies that we are bringing online. In an ideal world plans would never change. But in reality, change is a certainty in the tech industry – new form factors emerge, technologies and applications shift and consumer tastes remake technology plans.
Like any technology company, AMD desires to anticipate change in the industry. So we course-correct as we work with customers to ensure that we create products that address the optimal blend of timing, features and performance, cost and form factors.
Today at our Financial Analyst Day in Sunnyvale, AMD senior staff detailed how AMD will focus its investments in R&D and marketing going forward, including roadmaps for 2012-2013. As Phil Hughes summarized, the announced roadmaps are designed to extend platform longevity, accelerate time to market and enhance performance and features. These roadmaps strengthen AMD’s ability to make the most of shifting market dynamics, all the while delivering a stand-out experience across device categories through our graphics and video IP. This blog provides some insight into our 2012 and 2013 roadmaps – the words in quotes are the codenames for the particular AMD processor offerings discussed today.
2012 Client Roadmap
AMD’s “Brazos 2.0” Accelerated Processing Unit (APU) family will be used for essential desktop and notebook, netbook, tablet, all-in-one and small desktop form factors. This allows us to address a fast-growing segment of the PC market where we have proven success with the original “Brazos” line-up – the C-Series, E-Series and Z-Series APUs. We will add plenty of new features to the “Brazos 2.0” APU family, including increased CPU and GPU performance, longer battery life, a bevy of integrated I/O options and improvements to AMD Steady Video technology. “Brazos 2.0” is scheduled to hit the market in the first half of 2012.
As we demoed at CES, AMD’s “Trinity” APU for desktop and notebook remains on track for introduction in mid-2012, with plans to pack up to four “Piledriver” CPU cores and next-generation DirectX® 11-capable graphics technology, together delivering up to 50% more compute performance than our “Llano” offerings, including superior entertainment potential, longer battery-life and an even more incredibly brilliant HD visual experience.
New for 2012, AMD will introduce a low voltage “Trinity” APU that will be ideal for the next generation of ultrathin notebooks. This “Trinity” APU matches the experience enabled by the AMD 2011 APU in up to half the TDP. As we said, “Trinity” is on track for introduction in mid-2012.
In 2012 we will also introduce the ultra-low voltage “Hondo” APU for tablets. These low-power (power maxes out at 5W TDP) APUs will have “Bobcat” CPU cores and support DirectX 11 technology in a BGA or pin-less, thin processor package. Look for these in the second half of 2012 – more details to come later.
On the desktop platform side of things, the “Vishera” CPU will replace the “Komodo” CPU for desktop. This change enables accelerated time to market for improved performance and next-generation CPU features while maintaining the existing AM3+ motherboards. The “Vishera” CPU ushers in many exciting updates: it includes 8 “Piledriver” cores and, when compared with the previous generation, provides higher frequencies, improved instruction-per-clock performance, advanced instruction sets (thus increasing application performance), additional DDR3 memory support and next-generation AMD Turbo Core Technology. We plan to launch “Vishera” in the second half of 2012.
2013 Client Roadmap
2013 brings major evolution to the client roadmaps as the vision presented by Rory, Mark and Lisa today begins to manifest – including moving our low power APUs to a system on a chip (SoC) design with the AMD Fusion Controller Hub integrated right into a single chip design.
In the performance APU category our third-generation APU, “Kaveri,” will employ “Steamroller” (the evolution of AMD’s “Piledriver” core architecture) x86 cores for enhanced instructions per clock and power advantages. Applications that take advantage of GPU acceleration will give users an amazing experience thanks to our Graphics Core Next and new Heterogeneous Systems Architecture (HSA) enabling features for easier programming of accelerated processing capabilities.
In the low power category, the “Kabini” SoC APU takes over for “Brazos 2.0.” This second generation low power APU integrates “Jaguar” x86 cores for augmented performance and energy efficiency. These APUs will also benefit from select HSA features and functionality.
We keep on innovating for the ultra-low power space in 2013. Our second generation, ultra-low-power “Temash” SoC APU will follow “Hondo” for tablet and other fanless form factors. This APU will also leverage the “Jaguar” low-power x86 cores and HSA features.
We at AMD strongly believe these roadmap updates help us time new product introductions with customer design phases to hit key sales cycles across a range of form factors and experiences. We are moving with the market and on the path to deliver exceptional productivity and user experience in a wide array of form factors.
John Taylor, Director of Worldwide Product Marketing at AMD
He also provided the following answers to questions regarding how AMD spells out Windows 8 tablet strategy [CNET, Feb 2, 2012]:
…
Q: Before we go to Windows 8, what is your smartphone strategy, if any?
Taylor: The smartphone market is eight, nine, ten, maybe a dozen players. [They have] lower ASPs (average selling price), lower [profit] margins, a different competitive dynamic. So there is no shift on the smartphone strategy.
Q: And Windows 8?
Taylor: But you will see much more focus on tablets, the convertible or hybrid devices that fit between tablets and notebooks, very thin [designs].
Q: What chips exactly will get you there?
Taylor: For tablets, it will decidedly be the Hondo chip. We’re acknowledging that we still have a couple of watts to shave off to really be a more ideal tablet platform (to achieve optimal power efficiency). But we think that Temash gets us much, much closer to that in 2013.
Q: And Windows 8 convertibles?
Taylor: A 17-watt [power consumption] is the lowest that we’ll offer. That’s called Trinity. It will be unmatched in that [17-watt design] space. Discrete graphics-like performance. All types of dedicated video processing capabilities, better battery life than the competition. And all of these ways that we’re driving the new generation of accelerated applications. If you think about the Web apps that are being built for Win 8, using HTML5 and the graphics engine that drives that higher level experience. …
I will add to that the following two illustrations from the AMD Product and Technology Roadmaps [AMD FAD, Feb 2, 2012]:
“Vishera” CPU (Desktop)
- The “Vishera” desktop CPU incorporates up to eight “Piledriver” cores, advanced instruction sets and other performance enhancing additions
- This next-generation CPU will maintain the AM3+ infrastructure.
- Planned for introduction: H2 2012
In addition to the above-described expansion of the original client APU strategy, there is also a naming change: AMD Fusion System Architecture is now Heterogeneous Systems Architecture [AMD Fusion blog, Jan 18, 2012]
Since its introduction to the public in June 2011 at the AMD Fusion11 Developer Summit, the AMD Fusion System Architecture (FSA) has received widespread support and interest from our business partners and technology industry leaders. FSA was the blueprint for AMD’s overarching design for utilizing CPU and GPU processor cores as a unified processing engine, which we are making into an open platform standard. This architecture enables many benefits, including high application performance and low power consumption.
Our software partners are already taking advantage of the power and performance advantage of APU and GPU acceleration, with more than 200 accelerated applications shipped to date. The combination of industry standards like OpenCL and C++ AMP, alongside FSA, is ushering in the era of heterogeneous computing.
Together with these software partners, we have built a heterogeneous compute ecosystem that is built on industry standards. As such, we believe it’s only fitting that the name of this evolving architecture and platform be representative of the entire technical community that is leading the way in this very important area of technology and programming development.
FSA will now be known as Heterogeneous Systems Architecture or HSA. The HSA platform will continue to be rooted in industry standards and will include some of the best innovations that the technology community has to offer.
Manju Hegde and I will be hosting a breakout session on HSA at AMD’s Financial Analyst Day on February 2nd 2012, which will be webcast live here. More information on the latest advances in HSA design will be released at a future date.
Also, if you haven’t already made plans to attend the AMD Fusion12 Developer Summit in June 2012 in Bellevue, Washington, I encourage you to save the date. Leaders from the technology and programming development communities will converge at the summit to discuss Heterogeneous Computing and the next-generation user experiences that are enabled by this platform.
Phil Rogers, corporate fellow at AMD.
From the Analyst Day breakout session presentation I will include the following illustrations here as food for thought and further interest:
For Windows 8 related HSA, “C++ AMP” (indicated on the last illustration) is worth expanding on via Introducing C++ Accelerated Massive Parallelism (C++ AMP) [MSDN Blogs, June 15, 2011]
A few months ago, Herb Sutter wrote about a keynote he was to deliver today at the AMD Fusion Developer Summit (happening these days). He said back then:
“Parallelism is not just in full bloom, but increasingly in full variety. We know that getting full computational performance out of most machines—nearly all desktops and laptops, most game consoles, and the newest smartphones—already means harnessing local parallel hardware, mainly in the form of multicore CPU processing. (…) More and more, however, getting that full performance can also mean using gradually ever-more-heterogeneous processing, from local GPGPU and Accelerated Processing Unit (APU) flavors to “often-on” remote parallel computing power in the form of elastic compute clouds. (…)”
In that sense, S. Somasegar, Senior Vice President of the Developer Division, made the following announcement this morning:
“I’m excited to announce that we are introducing a new technology that helps C++ developers use the GPU for parallel programming. Today at the AMD Fusion Developer Summit, we announced C++ Accelerated Massive Parallelism (C++ AMP). (…) By building on the Windows DirectX platform, our implementation of C++ AMP allows you to target hardware from all the major hardware vendors. (…)”
C++ AMP, as Soma tells in his post, is actually an open specification. Microsoft will deliver an implementation based on its Windows DirectX platform (DirectCompute, as Daniel Moth specifies in a later post a few minutes ago).
Daniel added that C++ AMP will lower the barrier to entry for heterogeneous hardware programmability, bringing performance to the mainstream. Developers will get an STL-like library as part of the existing concurrency namespace (whose Parallel Patterns Library –PPL– and its Concurrency Runtime –ConcRT– are also being enhanced in the next version of Visual C++ – check references at the end of this post for further details) in a way that developers won’t need to learn a different syntax, nor use a different compiler.
Update (6/16/2011): “Heterogeneous Parallelism at Microsoft”, the keynote where Herb Sutter and Daniel Moth introduced this technology with code and graphics demos, is available for on-demand viewing.
Update (6/17/2011): Daniel Moth’s session “Blazing-fast Code Using GPUs and More, with C++ AMP” is available as well! Besides, Dana Groff describes what’s new in Visual Studio 11 for PPL and ConcRT.
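For readers who haven't used the PPL, its core primitive, concurrency::parallel_for, splits an index range across worker threads managed by ConcRT's scheduler. The PPL headers are Visual C++-only, so the following is only a minimal, hand-rolled sketch of the same idea in standard C++; a real implementation adds dynamic chunking and work stealing, which ConcRT handles for you:

```cpp
#include <cstddef>
#include <functional>
#include <thread>
#include <vector>

// Minimal parallel_for: splits [first, last) into one contiguous chunk per
// hardware thread and applies the body to each index. PPL's version adds
// load balancing via ConcRT's work-stealing scheduler; this sketch does not.
void parallel_for(std::size_t first, std::size_t last,
                  const std::function<void(std::size_t)>& body)
{
    const std::size_t n = last - first;
    std::size_t workers = std::thread::hardware_concurrency();
    if (workers == 0) workers = 2;          // hardware_concurrency may be unknown
    if (workers > n)  workers = n ? n : 1;  // never spawn more threads than items

    const std::size_t chunk = (n + workers - 1) / workers;
    std::vector<std::thread> pool;
    for (std::size_t w = 0; w < workers; ++w) {
        const std::size_t lo = first + w * chunk;
        const std::size_t hi = (lo + chunk < last) ? lo + chunk : last;
        if (lo >= hi) break;
        pool.emplace_back([lo, hi, &body] {
            for (std::size_t i = lo; i < hi; ++i) body(i);
        });
    }
    for (auto& t : pool) t.join();
}
```

As with the C++ AMP example, the point of PPL's design is that the loop body stays an ordinary C++ lambda; only the iteration driver changes.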
Pedal to the metal, let’s go native at full speed!
References:
- S. Somasegar’s announcement: http://blogs.msdn.com/b/somasegar/archive/2011/06/15/targeting-heterogeneity-with-c-amp-and-ppl.aspx
- Daniel Moth’s blog post: http://www.danielmoth.com/Blog/C-Accelerated-Massive-Parallelism.aspx
- Herb Sutter’s keynote at the AMD Fusion Developer Summit: http://channel9.msdn.com/Events/AMD-Fusion-Developer-Summit/AMD-Fusion-Developer-Summit-11/KEYNOTE
- Daniel Moth: Blazing-fast Code Using GPUs and More, with C++ AMP (session presented at AMD Fusion Developer Summit): http://channel9.msdn.com/Events/AMD-Fusion-Developer-Summit/AMD-Fusion-Developer-Summit-11/DanielMothAMP
- Announcing the PPL, Agents and ConcRT efforts for Visual Studio 11, by Dana Groff: http://blogs.msdn.com/b/nativeconcurrency/archive/2011/06/16/announcing-the-ppl-agents-and-concrt-efforts-for-v-next.aspx
- AMD Fusion Developer Summit Webcasts: http://developer.amd.com/afds/pages/webcast.aspx
With that in mind, the upcoming 2012 AMD Fusion Developer Summit should bring quite important updates, as promised by the final breakout-session illustration:
[illustration]
More on that: Adobe and Cloudera among Keynotes at AMD Fusion12 Developers Summit [AMD Fusion blog, Feb 3, 2012]
Finally, regarding the ‘ambidextrous’ strategy mentioned in the first sentence of the press release:
- ‘ambidextrous’ generally means ‘very skillful and versatile’, from ‘able to use the right and the left hand with equal skill’
- it is described in the press release as:
… adopting an SoC-centric roadmap designed to speed time-to-market, drive sustained execution, and enable the development of more tailored customer solutions. SoC design methodology is advantageous because it is a modular approach to processor design, leveraging best practice tools and microprocessor design flows with the ability to easily re-use IP and design blocks across a range of products. …
- and detailed in the presentation by Mark Papermaster (Senior Vice President and Chief Technology Officer) for the 2012 Financial Analyst Day held on February 2, 2012 (see his full presentation in PDF) via the following illustrations:
[illustration]
as the Go-to-market approach together with ODM / OEM relationships
[illustration]
specifically highlighting the differentiation with it for the datacenter
[illustration]
related to MDC [Multi-DataCenter] workloads and HSA.
But also mentioning it in more generic terms as:
[illustration]
”Flexible around ISA [Instruction Set Architecture]” and
“Flexible around combination of AMD IP and third party IP”
This probably generated the greatest interest and questions among the participating analysts, prompting even The Wall Street Journal to report it as AMD Will Incorporate Others’ Technology in Its Chips [Feb 3, 2012]:
Advanced Micro Devices Inc., the microprocessor maker whose fortunes have long been closely tied to the same technology as bigger rival Intel Corp., is planning a more flexible future.
The company on Thursday said it may pursue what it calls an “ambidextrous” strategy that would allow it to offer chips that include circuitry developed by other companies as well as its own. One obvious option would be low-power microprocessor technology from ARM Holdings PLC that now dominates chip markets for cellphones and tablet computers.
AMD Chief Executive Rory Read, at a meeting with analysts here and in a subsequent interview, stopped short of saying that AMD would definitely add ARM-based technology to its chips in the future. But he noted that the company is laying the technical groundwork for modular chips that could accept blocks of circuitry developed by ARM as well as other companies.
“We have a relationship with ARM, and we will continue to build on it,” Mr. Read said in an interview. “We will continue to evolve that relationship as the market continues to evolve.”
Such possibilities are a sign of how the exploding market for mobile devices is causing many companies to alter their strategies. The x86 design used by AMD and Intel is the foundation of virtually all personal and most server computers.
But the two companies have struggled to make headway in the mobile-device market, in large part because of the lower power consumption of ARM-based designs. Meanwhile, ARM licensees—which include Qualcomm Inc., Texas Instruments Inc. and Nvidia Corp.—are adding to the pressures by edging toward the PC market, as Microsoft Corp. finishes development of a new operating system that supports ARM and x86 chips.
AMD’s management team, in a meeting with analysts here, took pains to dispute the notion that AMD may become marginalized as ARM-powered competitors enter the PC market. Rather, they argued, AMD’s strength in graphics and microprocessors—and a strategy of customizing chips for large customers—will expand AMD’s opportunities.
Indeed, Mr. Read argued, it is Intel’s outsize influence on the tech industry that will tend to decline. “We will see the breakdown of proprietary control points,” Mr. Read said.
Though Mr. Read didn’t commit to embracing ARM’s designs, others who heard his presentation said the direction is clear. “AMD was very deliberate today about their goal to integrate more third-party intellectual property,” said Patrick Moorhead, a former AMD vice president and now principal analyst at Moor Insights & Strategy. “Nothing they communicated excluded the potential for ARM.”
AMD’s remarks also underscore an industry shift—driven largely by the mobile market—away from separate chips and toward multi-function products that the industry calls SoCs, for systems on a chip, which save space and power in mobile devices and other hardware.
Intel and AMD have begun offering SoCs for laptop computers. But AMD discussed extensive plans to create more such products at a faster rate, using a flexible design scheme that can accommodate technology submitted by other companies.
Mr. Read, who previously served as a senior executive at PC maker Lenovo Group Ltd., has recruited others who also worked at IBM and have experience with chip technologies other than x86.
One is Mark Papermaster, AMD’s senior vice president and chief technology officer, who worked at Apple Inc. and Cisco Systems Inc. after leaving IBM in 2008. Another is Lisa Su, a senior vice president and general manager of AMD’s global business units, who most recently worked at Freescale Semiconductor Holdings Ltd., an ARM user.
Ms. Su gave an updated road map for a series of future chips, including products that AMD expects to be used in tablets that are powered by Microsoft’s forthcoming Windows 8 operating system. But Mr. Read said AMD would likely stay away from trying to sell chips for smartphones soon, characterizing the market as too crowded with competitors.