The low priced, Android based smartphones of China will change the global market
September 10, 2012 9:36 pm
Over the past 12 months or so, China has overtaken the markets previously considered to be in the lead for smartphones (the US, Australia, Brazil, Great Britain (GB), Germany, France, Italy and Spain) to become the overall leading market.
An even more dramatic change is that while high- and moderate-margin products dominated the old, combined lead market of the above countries, in the new lead market of China average retail prices fell in the second quarter of 2012 to 1,560 yuan (i.e. US$246) for #1 Android, with a whopping 82.8% market share, and to 1,170 yuan (i.e. US$185) for #2 Symbian, now holding only a 6% share of the market.
It is notable as well that in China Apple had only a 6% market share vs. 23.7% in the combined old lead markets. According to a recent Reuters video report from Hong Kong (embedded further below in the elaboration of details), we are witnessing:
… commoditization of smartphones … hardware specifications for the handsets have already peaked…
A race to the bottom therefore will present a major challenge for Apple and Samsung, which together have dominated the industry in the last couple of years. If the China trends spread globally, the shift to cheaper handsets will mean tighter margins and slower growth for these industry powerhouses, and new opportunities for little known upstarts like Xiaomi.
Given my previous trend-tracking posts, the change will be even more dramatic, as:
– The best smartphone based on the MediaTek MT6577, both technically and in terms of price, is the MT6577-based JiaYu G3, with a 4.5” IPS Gorilla Glass 2 screen etc., for $154 (factory direct) in China and $183 [Sept 13, 2012]; it is also the best example of how the low priced, Android based smartphones of China will change the global market.
– Lowest H2’12 device cost SoCs from Spreadtrum will redefine the entry level smartphone and feature phone markets [July 26 – Aug 16, 2012]
– Boosting the MediaTek MT6575 success story with the MT6577 announcement – UPDATED with MT6588/83 coming early 2013 (previously Q4 2012) and 8-core MT6599 in 2013 [June 27, July 27, Sept 11-13, Sept 26, Oct 2, 2012]
– Smartphone-like Asha Touch from Nokia: targeting the next billion users with superior UX created for ultra low-cost and full touch S40 devices [July 20 – Aug 12, 2012]
– MediaTek’s ‘smart-feature phone’ effort with likely Nokia tie-up [Aug 15-31, 2012] – Update: China to ship 300 mil. smartphones in ’13: MediaTek head [The China Post, Sept 26, 2012]: … overall shipments in China may reach 200 million in 2012. …
– Update: China market: Dual-core CPUs, 4-inch displays become standards for entry-level smartphones [DIGITIMES, Sept 17, 2012]:
Local brands in China have made upgrades to the specifications of their entry-level smartphones for the CNY1,000-1,500 (US$158-237) segment making dual-core 1GHz processors and 4-inch displays the industry standards, according to industry sources.
Prices of the previous mainstream models with single-core CPUs and displays below 4-inch sizes for the CNY1,000 segment in the first half of 2012 are now expected to drop to CNY500-800, the sources added.
China Unicom has led the purchase of the upgraded dual-core, 4-inch display smartphones recently, and its suppliers are all China-based vendors including Huawei Technologies, ZTE, Lenovo, Coolpad, TCL, Hisense, K-Touch and Wanlida, the sources revealed, adding that those makers will source chipset solutions from Qualcomm or MediaTek.
First-tier international players did not participate in China Unicom’s procurement on concerns of pricing and hardware specifications, the source asserted.
However, the pace of hardware upgrading may start slowing down as telecom companies in China are mulling reducing their subsidies to smartphone subscribers, while smartphone makers are also trying to maintain their profit margins, commented the sources.
The next round of competition will shift from hardware to software including product design, user’s interface and also smart audio recognition, the sources noted.
Neither Apple nor Samsung has reacted to these challenges yet. Nokia also played it safe with its recent announcement:
– Unique differentiators of Nokia Lumia 920/820 innovated for high-volume superphone markets of North America, Europe and elsewhere [Sept 6, 2012]
We may expect a fundamental reorganisation of the market in the next two quarters.
Meanwhile read through the details included below and make your own, hopefully more fine-tuned conclusions and predictions:
See: Kantar: Windows Phone has overtaken RIM Market Share in USA, “Key 8 Countries”
[WMPoweruser, Sept 3, 2012]
Note that in terms of mobile data traffic the market share is quite different. For North America (U.S. and Canada) Chitika Insights, the independent research arm of online ad network Chitika, released the following web usage market share report [Sept 5, 2012]:
Remark: iPads and other tablets are included here as well!
Relative to all that, China is quite a different story:
Monthly 3G phone shipments reach 21.64 million, domestic handset share over 70% – 3G手机月出货量达2164万部 国产手机份额超七成 [Sohu IT – 搜狐IT, Sept 10, 2012]
According to data published by the Telecommunications Research Institute of the Ministry of Industry and Information Technology … [I have compiled the data from the translated Chinese text into the table below:]
China sees soaring smartphone market in Q2 [Xinhua, Sept 3, 2012]
Beijing: China’s smartphone market saw its sales volume soar to 38.19 million units in the second quarter, according to a report released Monday by market researcher Analysys International.
The figure represented a 22.5 per cent increase compared with that of the previous quarter and a sharp rise of 127.1 per cent over the corresponding period in 2011, said the report.
Nearly 67 million mobile phones were sold in China in the second quarter, the report said, representing a 1 per cent decrease from the previous quarter and a 2 per cent decrease from the corresponding period in 2011.
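The Analysys growth percentages quoted above let us back out the implied earlier-period volumes; a quick sketch (using only the figures as quoted in the report):

```python
# Back out the implied prior-period volumes from the Analysys figures
# quoted above: 38.19M units in Q2 2012, up 22.5% quarter-on-quarter
# and 127.1% year-on-year.
q2_2012 = 38.19  # million smartphones sold, per Analysys International

q1_2012 = q2_2012 / 1.225   # implied previous quarter
q2_2011 = q2_2012 / 2.271   # implied year-ago quarter

print(f"Implied Q1 2012: {q1_2012:.1f}M units")   # ~31.2M
print(f"Implied Q2 2011: {q2_2011:.1f}M units")   # ~16.8M
```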
Stellar growth sees China take 27% of global smart phone shipments, powered by domestic vendors [Canalys press release, Aug 2, 2012] – Android is the clear platform of choice, accounting for 81% of Chinese shipments
Shanghai, Palo Alto, Singapore and Reading – Canalys published its final Q2 2012 country-level shipment estimates to clients yesterday. Results show that China saw phenomenal growth of 199% year-on-year and 32% over the previous quarter. In total, more than 42 million smart phones were shipped into the channel in China in Q2 2012, representing the second consecutive quarter of record breaking volumes in a single country market. China accounted for 27% of the 158 million global smart phone shipments, compared to 16% for the United States.
Notably, growth in China was heavily driven by domestic vendors, while international vendors struggled to keep pace.
While Samsung maintained its overall leadership position in China with a 17% market share, this reduced sequentially as volumes were flat and as several local vendors closed the gap. ZTE, Lenovo and Huawei were the second-, third- and fourth-placed vendors, ahead of Apple, making up a third of the market. They achieved growth of 171%, 2,665% and 252% year-on-year respectively. Collectively, domestic Chinese vendors shipped 25.6 million units, representing a growth of 518% and 60% of the market. By comparison, international vendors grew by a more modest 67% to 16.7 million units. Apple fell to fifth place in China. While its shipments were up 102% year-on-year, they were down 37% compared to Q1 2012.
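As a sanity check on the Canalys numbers quoted above, the domestic vendors' 25.6 million units at a 60% share and China's 27% of the 158 million global shipments should both point at the same China total:

```python
# Two independent routes to China's Q2 2012 smart phone total,
# using only figures quoted in the Canalys release above.
total_from_domestic = 25.6 / 0.60   # domestic vendors: 25.6M units = 60% share
total_from_global = 158 * 0.27      # China = 27% of 158M global shipments

print(f"{total_from_domestic:.1f}M vs {total_from_global:.1f}M")
# Both land just above the "more than 42 million" stated in the release.
```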
‘The rise of the domestic tier-one brands has been aided by a number of factors. Their reactiveness to market demands and deep understanding of local consumer behavior and preferences have been key in helping them surpass international peers in the fast-evolving Chinese market. Local tier-one vendors have worked hard in recent quarters to greatly improve their brand resonance among consumers and to expand and enhance their relationships and influence within operators,’ said Canalys Research Director for China, Nicole Peng. ‘But the tier-two vendors — the likes of Oppo, K-Touch and Gionee — have also stamped their mark, boosting smart phone shipments into tier-three and tier-four cities, predominantly through the open channels. As feature phone vendors, they already have established partnerships and strong brand awareness. These domestic vendors are making significant progress transitioning their portfolios and customer bases to be more focused on smart phones.’
Nokia and Motorola both lost significant ground in China, with Nokia’s volumes down 47% on Q2 2011. ‘Among the international vendors, only HTC managed an outstanding performance in mainland China. Its shipments grew 389% year-on-year to reach 1.8 million units for the quarter,’ said Jessica Kwee, Canalys Research Analyst. ‘Its success this quarter is heavily based on the strong performance of Desire V series devices, designed with the local China market in mind, underscoring the importance of tailoring propositions to local consumer preferences.’
Android has become a major growth driver in China, running on 81% of the smart phones shipped in China in Q2 2012.
On a global basis, Android continued to grow in significance, surpassing 100 million quarterly smart phone shipments for the first time and reaching two-thirds share of the market. ‘Growth in Android volumes of 110% far outpaced growth in the overall market of 47% year-on-year, heavily driven by Samsung, which saw Android volumes of over 45 million, contributed to by a full and broad portfolio of products, from its high-end flagship Galaxy S III down to its aggressively priced Galaxy Y and Galaxy Mini. Its sponsorship of the London Olympics and subsequent product placements are sure to attract new customers to ensure that Q3 delivers a strong performance,’ commented Pete Cunningham, Canalys Principal Analyst.
Samsung retained its gold medal position in the global smart phone market with a 31% share, followed by Apple and Nokia once again. Huawei and ZTE were unable to push in on the global top five with shipments of their own branded devices. HTC moved up to fourth place, though, just ahead of RIM, which shipped 8.5 million units in the calendar quarter.
Analyst contacts
To speak with any analyst quoted in this release, please contact the appropriate Canalys office: Nicole Peng, Jessica Kwee (Canalys APAC), Pete Cunningham (Canalys EMEA). Alternatively, you can speak with other members of Canalys’ global team of mobile analysts: Chris Jones (Canalys Americas), Rachel Lashford (Canalys APAC), Tim Shepherd (Canalys EMEA).
About Canalys
Canalys is an independent analyst firm that strives to guide clients on the future of the technology industry and to think beyond the business models of the past. We deliver smart market insights to IT, channel and service provider professionals around the world. Our customer-driven analysis and consulting services empower businesses to make informed decisions and generate sales. We stake our reputation on the quality of our data, our innovative use of technology, and our high level of customer service.
Smart phone and pad forecasts show varying OS fortunes [Canalys press release, Sept 10, 2012] – China and Android influence smart phone landscape, the US and Apple dominate pads
Shanghai, Palo Alto, Singapore and Reading – The latest product announcements by leading smart phone and pad vendors will help drive consumer demand to new heights, according to Canalys. It forecasts that in 2016, global annual smart phone shipments will be around 1.2 billion units, meaning a CAGR (Compound Annual Growth Rate) of 19.5%. It predicts pad shipments in the same year will hit 207 million – a CAGR of 26.8%.
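For readers unfamiliar with CAGR, the forecast above can be checked with the standard formula. Assuming a 2011 base year (Canalys does not state the base in this excerpt, so treat the 5-year horizon as my assumption):

```python
# CAGR formula: cagr = (end / start) ** (1 / years) - 1
# Canalys forecasts ~1.2bn smart phone shipments in 2016 at a 19.5% CAGR.
shipments_2016 = 1_200  # million units
cagr = 0.195
years = 5               # assumed horizon: 2011 -> 2016

implied_base = shipments_2016 / (1 + cagr) ** years
print(f"Implied base-year shipments: {implied_base:.0f}M units")  # ~492M
```

The implied ~492M base is close to industry estimates of 2011 global smartphone shipments, which makes the assumed horizon plausible.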
Apple’s latest unveiling is attracting extraordinary interest and competitors have also made several major announcements in the past week, including Windows 8 devices from Nokia and Samsung; new Android smart phones from Sony, Motorola and Samsung; and Amazon’s enhanced Kindle Fire pads. With these big vendors attracting the headlines, Canalys has issued a timely reminder that the trends across pads and smart phones in various countries will be markedly different.
In smart phones, Canalys expects Asia Pacific to remain the largest region by volume, with annual shipments reaching 594 million by 2016. China will account for almost half of all shipments in the region and nearly a quarter of the world’s smart phones in 2016. This equates to only 10 million less than is forecast to ship in the whole of the Americas in that year.
Canalys managing director for Mobile and APAC, Rachel Lashford, said, ‘The latest, in-depth research for our dedicated Smart Phone Analysis China service reveals there will be a substantial increase in the number of first-time smart phone users in China over the next 12 months, while feature phone shipments will continue to decline. Smart phone sales will move beyond tier-one and tier-two cities.’
China’s domestic feature phone vendors are rapidly moving their businesses to smart phones, supported by low-cost solutions from chipset providers, such as MediaTek, Spreadtrum and Qualcomm’s QRD.
‘We anticipate strong demand from local Chinese vendors selling in both operator and open channels,’ said Nicole Peng, Canalys Research Director for China. ‘Chipset vendors are reporting growing momentum in 2.5G (EDGE) smart phone solutions. For less developed areas where 3G coverage is limited, 2.5G smart phones have advantages in cost and battery life. They are becoming popular with consumers, especially where prices are already close to those of feature phones (around RMB500, US$78). The tier-three and tier-four cities are feature phone vendors’ traditional strongholds. Local vendors will use their long-standing relationships with open channels and their established infrastructure to distribute smart phones, with or without operator subsidies, over the next few years.’
In terms of percentage growth, Canalys expects Latin America to move fastest, with a CAGR to 2016 of 27.3%. It forecasts good double-digit growth in all countries, but Brazil and Mexico will account for more than half of all shipments in the region.
Globally, Canalys expects Android to remain dominant, with 57% of the smart phones shipped in 2016 running the OS (up from 49% in 2011). It expects Apple’s share of this much larger market to remain similar to today, at around 18%. Microsoft is expected to make inroads over the coming years.
In the pad market, however, the OS picture will be quite different. Canalys expects Apple to take a little under half of the market in 2016. The plethora of Windows 8 pads that will be introduced over the next few years are predicted to bring Microsoft’s share to around 17%. Competitively priced Android pads, such as Google’s Nexus 7 and Amazon’s Kindle Fire models will have an impact in terms of volumes, but Android’s share is forecast to remain relatively stable at 35%, unless vendors make radical improvements to the overall user experience. In contrast to smart phone market trends, the US is expected to dominate pad shipments, with the volume more than doubling to 88 million units in 2016. China is expected to be the second largest country market, with shipments of around 20 million.
‘Pads are the fastest growing consumer electronics products in history and are forecast to represent 29% of total PC shipments in 2016. But the market remains dominated by a single vendor. Other PC and smart phone vendors are currently finding it hard to weaken Apple’s position,’ said Canalys Analyst Tim Coulling. ‘The only product that most would consider a big hit is the Kindle Fire, brought to market by Amazon – an Internet retailer. Tight integration of hardware, software and services is a prerequisite for competing in the pad market, even at low price points, and fragmentation among other pad vendors’ offers helps Apple maintain its position.’
Analyst contacts
To speak with any analyst quoted in this release, please contact the appropriate Canalys office: Rachel Lashford, Nicole Peng (Canalys APAC), Tim Coulling (Canalys EMEA). Or contact another member of Canalys’ global analyst team: Chris Jones (Canalys Americas), Jessica Kwee, Pin-Chen Tang (Canalys APAC), Pete Cunningham, Tim Shepherd, Tom Evans (Canalys EMEA).
Analysys data: 2012Q2 China Android smartphone market share reaches 82.8% [Analysys International release, Sept 5, 2012], translated from the Chinese:
Analysys International: according to the recently published EnfoDesk Analysys think-tank 2012 second-quarter China mobile phone terminal market monitoring report, Android’s share of China smartphone sales (not including grey-market and shanzhai devices) rose from 76.7% in the previous quarter to 82.8% this quarter, a net gain of 6.1 percentage points. Symbian’s share continued its free fall, from 11.8% in the previous quarter to 6%. iOS, meanwhile, edged back up slightly to 6%.
2012Q2 OS smartphone market penetration in China (not including grey-market and shanzhai devices)
In the second quarter average smartphone prices continued to fall: Android dropped from 1,670 yuan [i.e. US$263] in the previous quarter to 1,560 yuan [i.e. US$246] this quarter, while Symbian fell from 1,320 yuan [i.e. US$208] in the previous quarter to 1,170 yuan [i.e. US$185] this quarter.
2012Q2 China Android and Symbian Smartphone price
(not including grey-market and shanzhai devices)
For more data on the mobile Internet, please visit http://data.eguan.CN/yidonghulian
For more content, please visit http://www.enfodesk.com/SMinisite/maininfo/regapply-cf-17.html
Or call customer service: 4006-515.
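The bracketed US$ figures in the translated release above all imply a mid-2012 exchange rate of roughly 6.34 yuan per US dollar (my approximation, not an official rate); reproducing the conversions:

```python
# Reproduce the yuan -> US$ conversions used in the quote above,
# assuming ~6.34 CNY per USD (an approximate mid-2012 rate).
CNY_PER_USD = 6.34

for cny, usd in [(1670, 263), (1560, 246), (1320, 208), (1170, 185)]:
    print(f"{cny} yuan ~= US${cny / CNY_PER_USD:.0f} (quoted: US${usd})")
```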
Analysys data: 2012Q1 China Android smartphone market share increased to 76.7% [Analysys International release, June 6, 2012], translated from the Chinese:
“Analysys Web video” Analysys International: according to EnfoDesk Analysys think-tank monitoring data on the traditional retail mobile phone market (the Q1 2012 China mobile terminal market quarterly monitoring report), Android’s share of handset sales in the Chinese smartphone market has now risen for 5 consecutive quarters.
Analysis:
The Q1 2012 China mobile terminal market quarterly monitoring data show that at the end of Q1 2012, Android smartphones held a 76.7% share of the smartphone terminal market, an average quarterly share gain of about 10 percentage points. At the same time, as the smartphone market continues to mature, the average price of an Android smartphone has steadily fallen to 1,670 yuan [i.e. US$263, down from 2,300 yuan, i.e. US$363, a year earlier].
Taking the state of traditional offline mobile phone sales channels into account, the EnfoDesk Analysys think-tank believes that Android’s continuing share gains benefit from its open-source nature, which has attracted numerous manufacturers, and from the growth of China’s smartphone market and 3G business over the past two years. Based on manufacturers’ current market performance and its impact, the EnfoDesk Analysys think-tank concludes:
1. The ecosystem is now application-driven, and Android’s openness attracts new industry participants, such as Internet companies, whose entry depresses product prices; the increasingly intense competition drives prices down further and threatens traditional vendors’ bargaining power in the channel.
2012Q1 China smartphone sales share
2. These trends exacerbate Android fragmentation. Traditional manufacturers, trying to overcome the homogenization of smart devices, do secondary development on top of Android, which forces applications to be adjusted for each variant and gradually raises developers’ costs.
Smartphone price quarterly changes of 2011Q1-2012Q1 Android system
3. Sales in this period are dominated by domestic brands’ low-end products, and these companies continue to invest in 3G product lines. Meanwhile, as veteran international brands’ market share continues to decline, they are shortening their product lines and focusing R&D on 4G products. With the advent of the 4G era, the mobile terminal market will be reshuffled. (Analysys International)
For more data on the mobile Internet, please visit http://data.eguan.CN/dianzishangwu
For more content, please visit Enfodesk Analysys Thinktank
http://www.enfodesk.com/SMinisite/maininfo/regapply-cf-17.html
Or call customer service: 4006-515.
Related reading:
2011Q2: China’s massive increase in Android share, Symbian’s tumble
Is sun setting on smartphone profit miracle? [ReutersVideo YouTube channel, Aug 16, 2012]
… in 2 years the low-end has blown up …
China smartphone sales by price tier (Source: Jefferies Research)

                                  Q1 2010    Q1 2012
<1,500 yuan [<US$237]              17.7%       60%
1,500-3,000 yuan [US$237-473]      51.5%       24%
>3,000 yuan [>US$473]              30.8%       16%
Cynthia Meng, China/HK TMT Equity Research, Jefferies Hong Kong:
[00:49] Next year it’s going to be about who is going to provide the best value for my money from a consumer point of view, from a telco point of view, because we think that hardware specifications for the handsets have already peaked. [01:03]
Narrator, xxx Gordon in Hong Kong:
In other words the oversized screens and quad-core processors of your precious Samsung [Galaxy] S III will soon be standard and achievable in handsets in China. [01:13] … commoditization of smartphones …
[02:11] A race to the bottom will present a major challenge for Apple and Samsung, which together have dominated the industry in the last couple of years. [02:19] If the China trends spread globally, the shift to cheaper handsets will mean tighter margins and slower growth for these industry powerhouses and new opportunities for little known upstarts like Xiaomi. [02:26]
The Chinese View: VIDEO: STUDIO INTERVIEW: CHINA’S SMARTPHONE MARKET [CCTV News – CNTV English, Sept 3, 2012]
iPhone Ranked Seventh in China’s Smartphone Market — Watch Out, ZTE [AllThingsD.com, Aug 24, 2012]
Apple’s iPhone has been gaining a lot of traction in China recently. As Apple CEO Tim Cook said during the company’s third-quarter earnings call, greater China accounted for two-thirds of Apple’s revenue in the Asia-Pacific region during the period.
“In terms of iPhones in general in mainland China, we were incredibly pleased with our results,” Cook said. “We were up over 100 percent, year over year.”
That’s an impressive achievement. But Apple still has a lot of work to do in China before the iPhone claims the same levels of market penetration it enjoys in the U.S. In China, the iPhone has captured about 7.5 percent of the smartphone market, compared to rival Samsung, which has claimed more than 20 percent, according to IHS iSuppli. Despite its popularity in the country, the iPhone is still ranked seventh in the Chinese smartphone market.
Why? Two reasons. First, Apple doesn’t yet offer a truly low-end smartphone that appeals to price-conscious Chinese consumers. (To be clear, China Telecom is offering the iPhone fully subsidized, but it requires subscribers to sign a contract that ties them to a two-year $62 per month plan.) Second, and more importantly, the iPhone doesn’t yet support Time Division Synchronous Code Division Multiple Access (TD-SCDMA), China’s homegrown wireless standard. And until it does, China Mobile, the world’s largest wireless carrier, can’t offer it to its 688 million or so subscribers.
“Among all the international smartphone brands competing in China, Apple is the only one not offering a product that complies with the domestic TD-SCDMA air standard,” IHS iSuppli’s Kevin Wang said in a statement. “For Apple, this is a huge disadvantage, as TD-SCDMA represents the fastest-growing major air standard for smartphones in China, with shipments of compliant phones expected to rise by a factor of 10 from 2011 to 2016.”
In other words, if Apple wants access to the massive addressable market that China Mobile has to offer, it’s going to have to offer a lower-end iPhone variant designed specifically for TD-SCDMA, something it has been loath to do in the past, and hasn’t given any indication that it’s willing to do in the future. As Cook said during Apple’s last earnings call, the company feels that its business is strongest when it focuses on making the best products it can, not the most inexpensive ones.
“I firmly believe that people in the emerging markets want great products, like they do in developed markets,” Cook said. “And so we’re going to stick to our knitting and make the best products. And we think that if we do that, we’ve got a very, very good business ahead of us. So that’s what we are doing.”
Breakingviews: Apple v. Samsung [ReutersVideo YouTube channel, Aug 27, 2012]
Apple Should Take The $199 Chinese Smartphone Seriously [Seeking Alpha, Sept 6, 2012]
At a time when China is set to overtake the U.S. as the world’s largest smartphone market, little-known Chinese firms are prepared to battle it out for market dominance with the maker of the game-changing iPhone, Apple (AAPL). As per the predictions of IDC and Gartner, China’s smartphone shipments could hit 140 million this year, exceeding those in the United States.
There are a number of Chinese brands offering similar capabilities, nominally, as the iPhone at half the price, most of them using a forked version of Google’s (GOOG) Android. The names include ZTE Corp., Lenovo Group, and other small private firms like Xiaomi, Gionee, and Meizu Technology. Even cheaper smartphones are offered by Alibaba Group, Shanda Interactive, and Baidu (BIDU) for fewer than ¥1,000 (~$150 U.S.).
Xiaomi Technology, founded just two years ago, has emerged as a serious potential threat to the likes of Apple and Samsung in the smartphone arena. According to its CEO, the company sold more than 3 million phones with revenues close to $1 billion for the first half of 2012. Its latest offering, a successor to its popular MiOne (MI) smartphone, the MI2, costs less than half the price of the iPhone 4S, but exceeds its specifications. Xiaomi not only tries to mimic the iPhone’s specifications, but has also been able to charge fans ¥199 (~$31) to attend the Beijing launch of the phone, the same way Apple followers would pay to see Steve Jobs showcasing new products. The Xiaomi conference was attended by more than 1,000 people, with the proceeds going to charity. The MI2, which is expected to hit the markets in October, will have a quad-core Qualcomm (QCOM) S4 Pro SoC, an 8-megapixel camera, and a voice assistant similar to Apple’s Siri, and is priced at ¥1,999 ($310). This is no cheap knock-off, but rather a serious piece of hardware packed with the latest technology.
The fascinating part of Android’s rise here is that Microsoft (MSFT) will likely see more profit from many of these phones than Google will due to the licensing agreements many of them have made to avoid patent issues with Redmond. Reports are spotty, but Microsoft collects anywhere from $5 to $15 per Android license and has deals with at least half of the phones sold. Moreover, it is very possible it makes more money than Google does.
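The licensing claim above is easy to turn into a back-of-the-envelope range. Taking the article's $5-$15 per covered handset and coverage of roughly half of units shipped, and applying them to the ~100 million quarterly Android shipments Canalys reports elsewhere in this post (the combination of these figures is my illustration, not a published estimate):

```python
# Rough range for Microsoft's quarterly Android licensing take,
# under the article's assumptions.
android_units = 100_000_000   # quarterly Android shipments (Canalys, Q2 2012)
covered_share = 0.5           # "deals with at least half of the phones sold"
fee_low, fee_high = 5, 15     # US$ collected per covered handset

low = android_units * covered_share * fee_low
high = android_units * covered_share * fee_high
print(f"Estimated quarterly take: ${low/1e6:.0f}M to ${high/1e6:.0f}M")
```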
In the coming years it is expected that Apple’s market share may flatten out or even dip, as it has this year, but market share is not Apple’s goal; it has always been about margins — selling a premium product at extremely high margins to those with the resources to not care about the upfront cost. Estimates from IDC place the sub-$200 smartphone at 40% of the shipments, while devices costing more than $700 made up 11% of the market, which is where Apple plays and why it still controls most of the profits generated by the industry. China and India make up 40% of new smartphone activations.
This huge difference in shipments is mainly due to the limited purchasing power of an average Chinese person, which is around ¥800-¥1,500 ($130-$240). By contrast, the iPhone comes with a price tag of around $800, the equivalent of two months of earnings of an urban Chinese person (in an area that has around 670 million people).
According to a report from Gartner, Apple’s market share by volume has been sliding and iOS‘ share of the mobile operating system space is expected to slip to third place by 2016 below Android and Windows Phone. The Gartner report is, however, very controversial as Windows Phone has not proven anything to this point, although Nokia’s (NOK) sales of its Lumia 610 and Asha line of proto-smartphones are keeping its brand alive while it searches for the killer phone. Even in its second-largest market, iPhone sales slipped for the April-June quarter due to inventory adjustments after the huge launch of the iPhone 4S.
Apart from these estimates, Apple also suffers on various fronts in China. The iPhone is backed by China Telecom and China Unicom, but the country’s and the world’s leading telco China Mobile (with about 655 million subscribers) has still not supported it. Apple and China Mobile are still working on the details of China Mobile’s implementation of CDMA, which requires Apple to build a specific phone for its network.
Responding to the competition and the difference between the iPhone and the local offerings, Apple recently slashed the price of the iPhone 3GS below $200. While an entry-level Apple phone is something that the market will absorb, part of Apple’s appeal is the status it confers, and a 3GS is simply not a strong enough status symbol to drive sales. Mix that in with Chinese preferences for buying from Chinese companies and this market becomes a whole lot harder for Apple to maintain, not its sales per se — it can manipulate prices to maintain sales — but its extreme margins. The latest earnings call highlighted this as it sold a lot of lower-end iPads and iPhones in Asia, which pushed its results and future guidance under 40% net margins.
Companies like Lenovo, ZTE, and Huawei are gaining because they are Chinese and are providing good products at reasonable prices. Lenovo, in particular, is pushing its smartphone and PC strategy both up and down the value chain, similar to Samsung’s approach. It is working very well for Lenovo, whose revenues were up 40% in the second quarter when everyone else was complaining of softening business.
Apple’s problems are the standard problems for a company on top of the world; everyone will nibble away at it in various little ways. How it responds to this is key.
The recent lawsuit victory over Samsung and its pressing of the legal attack smacks of a company that is frightened. Why should it fear Samsung? And if it doesn’t, why did it go after Samsung and restrict consumer choice, a clear breach of its branding compact with its fans? Is it trying to push Samsung into Windows Phone 8’s arms? All of these things point to further margin erosion for Apple and a slowing of its titanic growth without a new market to push into. As things stand now, staking a new position in Apple requires believing none of these issues matter.
It points to Apple becoming a value trap at some point in the future. Not every country, especially China, will grant Apple an injunction against knockoff competition; quite the opposite is true. Many investors are sitting on capital gains so large they can’t sell, and the dividend will pay them well enough to stay in even if the price goes nowhere. But new investors should be very careful in light of the market dynamics.
Microsoft adding staff, R&D in China mobile push [Associated Press, Sept 6, 2012]
BEIJING (AP) — Microsoft Corp. will hire more than 1,000 additional employees in China this year and boost research and development spending by 15 percent as it tries to catch up with Apple and Google in the fast-growing mobile Internet market, executives said Thursday.
The announcement adds to intensifying competition in wireless Internet in China, where nearly 400 million people surf the Web using mobile phones and other devices. Microsoft is promoting its Windows 8 mobile operating system but came late to the market and trails Apple Inc. and Google Inc., whose Android system is widely used in China.
“We respect that we have two players in the market which have a strong role, and we feel ready to attack and have different offers to basically change the game plan on that one,” said Microsoft’s CEO for China, Ralph Haupter, at a news conference.
The new employees will be in addition to Microsoft’s workforce of 4,500 in China and will be spread across research and development, marketing and customer service, Haupter said.
Research spending in China will rise by 15 percent over last year’s $500 million, according to another executive, Ya-Qin Zhang, Microsoft’s Asia-Pacific chairman for research and development. He said the current research staff of 3,000 would be expanded by about 15 percent.
Global technology companies and local rivals are spending heavily to gain a foothold in mobile Internet in the world’s most populous online market as Chinese users shift quickly to the new technology.
This week, Chinese search engine Baidu Inc. released its own new mobile browser to compete with Google and Apple and announced it will open a cloud computing center.
China had 538 million people online at the end of July, up 11 percent from a year earlier, according to the China Internet Network Information Center, an industry group. The share that uses wireless devices grew twice as fast, rising 22 percent to 388 million, or 70 percent of the total.
Android dominates the Chinese smartphone market, used on 76.7 percent of phones in the second quarter of this year, according to Analysys International, a research firm. Apple’s iPhone dominates the higher end of the market.
Microsoft plans to recruit more local partners to develop mobile applications specifically for China, said Haupter. He said the company believes it has an advantage in doing that because developers can draw on their experience working on other Microsoft products.
Zhang said Microsoft’s six development centers in China that now spend about 80 percent of their time working on products for global markets will focus more on creating offerings tailored to Chinese customers.
Microsoft also plans to expand its cloud computing business in China, the executives said. Zhang said about 100,000 commercial customers now use its private cloud computing service and a service for use by the public is being developed.
Microsoft Names New Leaders in Key International Markets [Microsoft press release, April 13, 2012]
… Ralph Haupter, currently serving as area vice president (AVP) for Microsoft Germany, has been promoted to corporate vice president and named CEO for Microsoft GCR. Haupter is replacing Simon Leung who has decided to leave Microsoft for personal and family reasons. Gordon Frazer, currently serving as managing director (MD) for Microsoft U.K., has been named chief operating officer (COO) for Microsoft GCR. He is replacing Michel van der Bel, who will assume the role of MD for Microsoft U.K. Haupter and van der Bel will report to Jean-Philippe Courtois, president of Microsoft International, and Frazer will report to Haupter. …
…
Haupter is a seven-year veteran of Microsoft, having delivered excellent and sustainable results in growth and profitability and repeatedly proving his ability to build and grow high-performing, diverse organizations. He previously served as head of the partner division for Europe, Middle East and Africa and general manager (GM) of Microsoft’s Small and Midmarket Solutions & Partners Group for Western Europe, both based in Paris, and served as COO for Microsoft Germany before becoming the German AVP. Before that, he worked for IBM both in Germany and internationally.
Frazer is a 16-year veteran of Microsoft, having served as the GM for Microsoft South Africa for four years and most recently as the Microsoft U.K. MD for the past six years. He brings a tremendous amount of operational expertise to the Microsoft GCR team from his various roles across both developed and emerging markets. His leadership in managing the full breadth and depth of Microsoft’s business in the U.K. will serve as a strong asset in helping take Microsoft China’s operations to the next level of efficiency and growth.
…
Leading the New Era, Winning the Future—Microsoft Announces Development Strategy in China [Microsoft China press release, Sept 6, 2012]
Partnering for an Innovative, Competitive, and Talented China
New leadership team in Greater China
(third from left is COO Gordon Frazer; fourth is CEO Ralph Haupter)
September 6, 2012, Beijing – Microsoft China today announced its new strategy and commitment to partnering with the country for an innovative, competitive and talented China by further enhancing and accelerating investments. In the new fiscal year, Microsoft will recruit more than 1,000 staff in China, 50% of whom will be college graduates. Microsoft’s annual R&D investment will exceed $500 million, and the company will explore local markets in more provinces and deepen its engagement in industrial informatization.
Over two decades of growth, Microsoft China has continued to penetrate deeply into increasingly important local markets. Ralph Haupter, Corporate Vice President, Chairman & CEO Microsoft Greater China Region, said: “Since entering China 20 years ago, Microsoft has grown steadily in China and acquired a deeper understanding of the Chinese market. Our new strategy reflects our perception, emphasis and commitment to the China market. In this new era, China and the entire Greater China Region will become the source of global innovations. Through comprehensive devices and services combined with cloud computing, Microsoft is working closely with the Chinese government, partners, customers and the academic world, entering this new era by leveraging our advantages.”
Haupter stressed that this year is a big year for Microsoft, with the introduction of many new products and technologies, and also a year where Microsoft China is making a great effort to further develop the market. “Our new leadership team in Greater China has helped develop a new strategy for customers and partners, deepening cooperation with governments of all levels to strengthen innovation in China. The team will popularize new technologies and explore new markets,” Haupter said.
Through continuous investment of innovation resources and the growing scale of its partnerships in China over the years, Microsoft Asia-Pacific R&D Group has become Microsoft’s largest R&D base outside of the United States, with the most complete functions and an innovation chain covering basic research, technology incubation, product R&D and industry cooperation. Chinese R&D teams have made great contributions to Microsoft products launched this year, such as Windows Server 2012, Windows 8, the new Office, SQL Server 2012 and Surface. Ya-Qin Zhang, Corporate Vice President and Chairman of Microsoft Asia-Pacific R&D Group, said: “We are lucky to be in an era where globalization is deepening, the IT revolution is emerging and China is rising. Microsoft’s continuous exploration in natural human-machine interfaces, mobile Internet and cloud computing will help us win the future and contribute to China’s sustainable development.”
Samuel Shen, COO of Microsoft Asia-Pacific R&D Group, said Microsoft’s software outsourcing business was now worth more than $200 million per year. In the future, Microsoft will continue to work closely with local communities through programs such as the Internet of Things, Big Data, cloud computing, cloud-based smart cities and the Microsoft Accelerator for Cloud Computing, accelerating the vision of “Innovation in China, Innovation for the World”.
According to Microsoft’s new strategy in China, Microsoft is committed to cooperating with the Chinese government and industry, aligning with China’s priorities and partnering for an Innovative, Competitive, and Talented China. Gordon Frazer, Vice President and COO of Microsoft Greater China Region, said that over the next five years, Microsoft China will expand its footprint in China, deepen cooperation with governments at all levels and with partners, improve customer support and foster talent on a broad scale:
Expand Microsoft’s footprint in local markets: Over the next five years, Microsoft will expand its presence in over 20 cities across 15 provinces by expanding local teams, enhancing local management, working closely with local governments, making contributions to local informatization, building cloud-based smart cities, and providing cloud-based solutions for e-government, city management and citizen services.
Accelerate local partner ecosystems and expand service coverage: Microsoft will deepen customer services, deliver joint services and solutions with partners, and engage in further convergence of informatization and industry upgrading to improve the core competency of Chinese enterprises. By the end of this year, Microsoft will set up its second technical support center in China to enhance support for Chinese customers and partners, share best practices and knowledge of supporting global customers to help them accelerate the adoption of new technologies and share with them the experience of providing cloud services to customers in Asia. Microsoft will also drive partners’ development through many forms: system-grade innovation support for OEMs, software engineering assistance for software outsourcing companies and innovative design references for hardware manufacturers.
Foster talent on a large scale: Over the next five years, Microsoft will hire more talent in China to better serve and support its partners in China, foster talent for the Chinese software industry and improve the skills of Chinese youths.
China to Overtake United States in Smartphone Shipments in 2012, According to IDC [IDC press release, Aug 30, 2012]
Top Five Smartphone Markets and Market Share for 2011, 2012, and 2016 (based on shipments)
| Country | 2011 Market Share | 2012 Market Share | 2016 Market Share | 2011–2016 CAGR |
| PRC | 18.3% | 26.5% | 23.0% | 26.2% |
| USA | 21.3% | 17.8% | 14.5% | 11.6% |
| India | 2.2% | 2.5% | 8.5% | 57.5% |
| Brazil | 1.8% | 2.3% | 4.4% | 44.0% |
| United Kingdom | 5.3% | 4.5% | 3.6% | 11.5% |
| Rest of World | 51.1% | 46.4% | 46.0% | 18.1% |
| Total | 100.0% | 100.0% | 100.0% | 20.5% |

Source: IDC Worldwide Mobile Phone Tracker, 2012 Q2 Forecast Release, August 30, 2012
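As a quick sanity check, each country’s CAGR column can be derived from its market-share change plus the 20.5% total-market CAGR alone, since a country’s shipments equal its share times the total market. Here is a minimal sketch using only numbers from the IDC table (the function name is mine; small rounding differences against IDC’s published figures are expected):

```python
def implied_cagr(share_2011, share_2016, total_cagr=0.205, years=5):
    """Back out a country's shipment CAGR from its market-share change
    and the total market's 20.5% CAGR over the same five years."""
    shipment_growth = (share_2016 / share_2011) * (1 + total_cagr) ** years
    return shipment_growth ** (1 / years) - 1

# PRC: shares 18.3% -> 23.0% imply roughly the 26.2% CAGR IDC reports
print(f"{implied_cagr(0.183, 0.230):.1%}")
# USA: shares 21.3% -> 14.5% imply roughly the 11.6% CAGR IDC reports
print(f"{implied_cagr(0.213, 0.145):.1%}")
```

This also makes the table’s story concrete: China’s share peaks and then declines toward 2016, yet its shipments still grow at roughly 26% a year because the whole market is expanding.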
Strong end-user demand and an appetite for lower-priced smartphones will make China (PRC) the largest market for smartphones this year, overtaking the United States as the global leader in smartphone shipments. According to the International Data Corporation (IDC) Worldwide Quarterly Mobile Phone Tracker, China will account for 26.5% of all smartphone shipments in 2012, compared to 17.8% for the United States.
“Looking ahead, the PRC smartphone market will continue to be lifted by the sub-US$200 Android segment,” said Wong Teck-Zhung, senior market analyst, Client Devices, IDC Asia/Pacific. “Near-term prices in the low-end segment will come down to US$100 and below as competition for market share intensifies among smartphone vendors. Carrier-subsidized and customized handsets from domestic vendors will further support the migration to smartphones and boost shipments. Looking ahead to the later years in the forecast, the move to 4G networks will be another growth catalyst.”
“Regionally, we expect smartphone demand to flow down to lower-tier cities,” added James Yan, senior market analyst for Computing Systems Research at IDC China. “After going through a period of sustained high growth, top-tier cities are likely to see decelerating smartphone growth rates. In contrast, secondary cities are expected to experience accelerated smartphone growth, with strong demand for low-cost models as well as high-end models, which are desired as status symbols.”
“The fact that China will overtake the United States in smartphone shipments does not mean that the U.S. smartphone market is grinding to a halt,” said Ramon Llamas, senior research analyst with IDC’s Mobile Phone Technology and Trends program. “Now that smartphones represent the majority of mobile phone shipments, growth is expected to continue, but at a slower pace. There is still a market for first-time users as well as thriving upgrade opportunities.”
“In addition to China and the United States, several other countries will emerge as key markets for smartphone shipment volume over the next five years,” said Kevin Restivo, senior research analyst with IDC’s Worldwide Mobile Phone Tracker program. “High-growth countries such as Brazil and Russia will become some of the most hotly contested markets as vendors seek to capture new customers and market share.”
Top Five Markets for Smartphone Shipments
As it becomes the leading country for smartphone shipments this year, the PRC smartphone market will continue to grow, primarily on demand for lower-cost handsets. While this bodes well from a volume perspective, it also means lower average sales values (ASVs), thinner margins, and increased competition from all players. Over the course of the forecast, China’s share of the global smartphone market will decline somewhat as smartphone adoption accelerates in other emerging markets.
Smartphone shipments into the United States will increase as users upgrade their devices and feature-phone users switch over to smartphones. Furthermore, a combination of lower-priced models, expansion of 4G networks, and the proliferation of shared data plans will encourage continued smartphone adoption. Smartphones are already the device of choice at the major carriers, and regional and prepaid carriers are following suit and competing with alternative service plans.
With smartphone penetration in India currently among the lowest in Asia/Pacific, the market has tremendous untapped growth potential. Low-end smartphones offering dual-SIM capability and local apps and priced around US$100 will rapidly bring this market to life. Although 3G data plans are currently too expensive for the majority of consumers in India, IDC expects the popularization of 3G, and in later years 4G, to drive smartphone uptake as operators roll out more affordable data plans and generous subsidies while expanding offerings to tier 2 and tier 3 cities. The affordability of service plans will be another important key to smartphone adoption in India.
Smartphone growth in Brazil will be bolstered by strategic investments by mobile operators, smartphone vendors, and regulators. Operators’ focus on increasing ARPU will drive greater demand for smartphones while smartphone vendors will look to reap greater profitability from offering such devices. The Brazilian government, meanwhile, will offer tax exemptions for smartphones and protect local manufacturing against foreign vendors. These factors, combined with solid end-user demand, will drive smartphone volumes in the coming years.
The United Kingdom has been one of the fastest growing smartphone markets in Western Europe, driven by the high operator subsidies and long-term post-paid contracts. Over the forecast period, smartphone shipments will continue to increase due to the introduction of LTE and a new range of services that will appeal to heavy smartphone users. In addition, price erosion on HSPA devices will also attract feature phones users. Growth rates will slow in the later years of the forecast as penetration plateaus and operators seek out alternative subsidy models.
Unique differentiators of Nokia Lumia 920/820 innovated for high-volume superphone markets of North America, Europe and elsewhere
September 6, 2012 8:30 pm / 4 Comments on Unique differentiators of Nokia Lumia 920/820 innovated for high-volume superphone markets of North America, Europe and elsewhere
Updates: Is Nokia A Value Buy At $2.65? [Seeking Alpha, Nov 13, 2012]
AT&T (T) announced it is offering Nokia’s (NOK) flagship phone, the Lumia 920, for $99 with a two-year contract. In addition, AT&T is offering the Lumia 820 for $49.99. By comparison, Apple’s (AAPL) iPhone 5 costs $199 with the same two-year contract and Samsung’s (SSNLF) Galaxy Note II costs $299. Even comparing previous iPhone models to the Lumia 920 (all with a two-year contract), the iPhone 4S costs $99.99 and the iPhone 4 $49.99 in some areas. That means a Lumia 920 costs the same as an iPhone 4S, and the Lumia 820 costs the same as the ancient iPhone 4. On top of this good news, AT&T also announced that it will add a wireless charging pad for free for those who purchase a Lumia 920 early; be warned, though, that this is a limited-time offer and AT&T most likely has a cap on the free charging pads.
The Lumia 920, with no contract commitment, costs $449.99, whereas the iPhone 5 and the Samsung Galaxy Note II cost $649.99. With no commitment to a two-year plan there are no carrier subsidies, and the $200 price difference is very promising for Nokia. In just its second generation of Windows Phone devices, Nokia was able to match, if not beat, the competition in both software and hardware while building its phone for roughly $200 less.
I personally love the strategy Nokia is moving forward with. The company is temporarily cutting profit margins on its Lumias in order to establish a larger customer base. … Brand loyalty is always prevalent, but in the U.S. it is in overdrive, and customers need an obvious reason to switch operating systems. With a much lower price point many will be willing to give the Lumia 920 a shot, and those on a tight budget who never even considered buying a newer smartphone and would have opted for older models such as the iPhone 4 now have a new option in the Lumia 820, which costs less than $50. Assuming Nokia can gain a decent market share in the U.S., say around 4-5%, that will get its foot in the door and make it relevant as an alternative to the other operating systems. …
And a summarized report on how the Lumia 920 went on sale November 9 through AT&T:
Nokia’s Lumia 920 Sold Out Many Places Over The Weekend, But There Weren’t That Many To Begin With [Business Insider, Nov 13, 2012]
End of updates
Although the prices of the Lumia 920/820 will be announced later, along with selected country availabilities, their effective retail prices should be in the current US$450-600 range of what I am considering “high-volume superphones”. A detailed comparison of the Lumia 920 with the leaders in that market is here (source: The Verge). Note that the lead market for that segment is North America and Europe, and this will remain so for the foreseeable future (unlike the segment below that, where China and the BRIC countries lead the market).
What was announced by Nokia should therefore be evaluated in terms of unique Nokia differentiators introduced for the current state of that market. Otherwise Nokia will be misunderstood like here.
Note as well that Samsung GALAXY S III Reaches 20 Million Sales Milestone in Record Time [Samsung Mobile press release, Sept 6, 2012]: “…in just 100 days since its debut in May 2012. … a new record …”
Update as of Sept 11, 2012: The Nokia Lumia 900 LTE (4G) list price went down to $499.99; with a 2-year AT&T contract it costs only $9.99, and without a contract but AT&T-locked it is $299.99. It has received 437 reviews with an average score of 4.6 out of 5.0, the highest average customer score of all high-end smartphones sold by Amazon.
Update: Nokia Lumia 900 Buy now – Nokia – India [Sept 14, 2012]: list price Rs 32,999 i.e. US$ 608.
Update as of Sept 13, 2012: … according to the personal micro-blog of Huang Guoqiang, Marketing Director of Nokia China, Nokia will launch a China Mobile customized version of the Lumia 920, supporting the TD-SCDMA standard. … Via: WinP.cn
Nokia Lumia 920 & 820 Announcement – Nokia and Microsoft Press Conference on September 5 in New York [nokia YouTube channel, Sept 7, 2012]
The major specs are as follows (the differentiators are highlighted in bold; Nokia ClearBlack with high brightness mode, Sunlight Readability Enhancements and color boosting are marked in this lesser way because, although they were not present in any of the previous Lumias, no detailed explanations were given for them):
| | Nokia Lumia 920 | Nokia Lumia 820 |
| Display: | 4.5 inch Nokia PureMotion HD+ WXGA 1280×768 IPS LCD; Super Sensitive Touch for nail & glove use; Nokia ClearBlack with high brightness mode; Sunlight Readability Enhancements, luminance 600 nits; color boosting and Corning® Gorilla® Glass | 4.3 inch ClearBlack AMOLED WVGA 800×480; Super Sensitive Touch for nail & glove use; Nokia ClearBlack with high brightness mode; Sunlight Readability Enhancements and color boosting |
| Battery: | 2000mAh with integrated Qi wireless charging (built-in) | 1650mAh with support for Qi wireless charging (via a Wireless Charging Shell) |
| Processor: | 1.5GHz Dual Core Snapdragon S4 | 1.5GHz Dual Core Snapdragon S4 |
| Main camera: | 8.7MP with Nokia PureView advanced optical image stabilization and Carl Zeiss optics; full 1080p HD video capture at 30fps | 8MP Auto Focus with Carl Zeiss optics; dual LED flash; full HD 1080p video capture at 30fps |
| Front camera: | 1.2MP with 720p HD video | VGA |
| Memory: | 1GB RAM | 1GB RAM |
| Storage: | 32GB mass memory with 7GB free SkyDrive storage | 8GB mass memory with up to 32GB microSD memory card support and 7GB free SkyDrive storage |
| Upload speed: | HSUPA Cat 6 – 5.76 Mbit/s; LTE Cat 3 – 50 Mbit/s | HSUPA Cat 6 – 5.76 Mbit/s; LTE Cat 3 – 50 Mbit/s |
| Download speed: | EGPRS MSC 12 – 236.8 kbit/s; HSDPA Cat 24 – 42.2 Mbit/s; LTE Cat 3 – 100 Mbit/s | EGPRS MSC 12 – 236.8 kbit/s; HSDPA Cat 24 – 42.2 Mbit/s; LTE Cat 3 – 100 Mbit/s |
| Exclusive new apps: | Nokia City Lens; Nokia Smart Shoot; [Nokia] Cinemagraph; Angry Birds Roost (for 3 months); and more | Nokia City Lens; Nokia Smart Shoot; [Nokia] Cinemagraph; Angry Birds Roost (for 3 months); and more |
Joe Belfiore on the Nokia Lumia 920 [nokia YouTube channel, Sept 5, 2012]
So let’s see the unique screen and camera technologies and other innovations behind the new, Windows Phone 8 based Lumia devices:
- An addition to the innovative Nokia ClearBlack display technology: Nokia Pure Motion HD+ [nokia YouTube channel, Sept 5, 2012]
The great pixel run. A film about the best display. Nokia PureMotion, making pixels faster!

Building on the innovations of the ClearBlack technology, PureMotion introduces new outdoor-viewing innovations in mobile displays. In addition to the very low reflectance, which greatly improves dark-tone rendering in ambient light, PureMotion adds a high luminance mode for the backlight LED driving and image contrast enhancement, on top of a superb optical stack design. Together they improve the overall contrast, and thus brightness and sunlight readability. In an extremely bright environment the Lumia 920 PureMotion display uses its backlight luminance reserve and becomes the smartphone WXGA (1280×768) display with the highest peak luminance. For the user, high luminance mode is fully automatic, driven by data from the ambient light sensor. The adaptive image contrast enhancement compensates for the loss of contrast caused by the unavoidable ambient light reflections inside the display-touch-window optical stack. It enhances display readability by mathematically altering the color and contrast of the user interface graphics, optimizing them dynamically for any ambient-light viewing condition. Nokia stated at the New York event that the Lumia 920 screen is perfectly readable even in a desert. Equally important, these sunlight readability enhancements are fully automatic for the user.
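To make the “fully automatic high luminance mode” concrete, here is a toy sketch of a mapping from ambient-light-sensor readings to backlight luminance. The 600 nit peak is the figure quoted for the Lumia 920; the lux threshold and indoor maximum are my own assumptions, not Nokia’s actual algorithm:

```python
def backlight_nits(ambient_lux,
                   normal_max_nits=400.0,   # assumed typical indoor maximum
                   boost_nits=600.0,        # quoted Lumia 920 peak luminance
                   sunlight_lux=10000.0):   # assumed bright-outdoor threshold
    """Map an ambient-light-sensor reading (lux) to a backlight level (nits).

    Below the sunlight threshold the backlight scales with ambient light;
    at or above it, the luminance reserve is engaged for sunlight readability.
    """
    if ambient_lux >= sunlight_lux:
        return boost_nits
    return normal_max_nits * min(1.0, ambient_lux / sunlight_lux)
```

The point of such a scheme is that the user never toggles anything: the sensor reading alone decides when the (power-hungry) luminance reserve is spent.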
More information: PureMotion Technology White Paper [Nokia, Sept 4, 2012]
as well as The leading ClearBlack display technology from Nokia [this same ‘Experiencing the Cloud’ blog, Dec 18, 2011 – May 8, 2012]
- Another direction of Nokia PureView technology with the advanced Nokia Optical Image Stabilisation (OIS) using so-called floating lens technology:
OIS on Nokia Lumia 920 [nokia YouTube channel, Sept 5, 2012]

This is the real comparative shot, since the more comprehensive marketing video below gave the misleading impression that the bicycle shot was done by a Lumia 920. “In an effort to demonstrate the benefits of optical image stabilization (which eliminates blurry images and improves pictures shot in low light conditions), we produced a video that simulates what we will be able to deliver with OIS. … we should have posted a disclaimer stating this was a representation of OIS only. This was not shot with a Lumia 920. At least, not yet. We apologize for the confusion we created.” Nokia said in a separate “An apology is due” post. See: http://conversations.nokia.com/2012/09/06/an-apology-is-due/

PureView: The next innovation [nokia YouTube channel, Sept 5, 2012]
The latest innovation in PureView technology: http://nokia.ly/Q7qTtR changing the way we capture the world around us!

The fundamental elements of PureView announced in the first phase were pixel oversampling, enabled through the use of a high performance sensor, high performance Carl Zeiss optics and Nokia proprietary image processing. As part of the original announcement Nokia outlined its intention to reuse the core elements, high performance optics, sensor and image processing, in different combinations over time. In this second phase of PureView development, these fundamental enabling elements are used again, but in different forms. In place of the 808 PureView’s 41MP sensor lies the latest generation BSI (backside illumination) sensor with a total of 8.7MP. The new optics are again developed in conjunction with Carl Zeiss: Nokia’s most challenging opto-mechanical design to date. The hardware has also been designed to accommodate further future developments in software image processing.

Then comes Nokia’s new OIS system which, based on lab tests, can cater for around 50% more movements per second than conventional OIS systems, up to around 500 movements every second! OIS works by detecting camera movement using a gyroscope, a highly accurate sensor used to detect the degree and direction of movement. But that is pretty much where the similarity between Nokia’s OIS system and broadly comparable OIS systems ends. Rather than a single lens element being shifted to compensate for camera shake (as in most OIS systems), Nokia’s OIS system moves the entire optical assembly in perfect synchronisation with the camera movement, or to be more precise, the unintended camera shake. The benefit of this approach is that the amount and form of camera movement that can be compensated for is much greater.
In addition, the position of the lens assembly is monitored in real time, even while it is moving to its calculated position, allowing it to be continuously updated regardless of how random the camera movement is. This checking operates at a rate up to 5x higher than typical OIS systems, approximately 300 times faster than the average human reaction time to an expected event. Nokia calls this approach “floating lens” technology. Adding up all the advantages of Nokia’s OIS system, camera shake can be compensated for down to lower lighting levels than with conventional OIS systems, ultimately enabling true low light photography as well.

The above videos demonstrate PureView advanced optical image stabilization for HD video recording on the move (except the closing part of the second video). And the following one was the example of low light, still image photography which without PureView would have been either blurry (because the lens must stay open longer, so camera shake comes into effect) or almost dark (when the exposure is the “usual” one, so there is insufficient light):
More information: PureView imaging technology white paper 2 [Nokia, Sept 4, 2012] as well as The 41 MP Nokia 808 PureView meeting the vanishing world challenge [this same ‘Experiencing the Cloud’ blog, April 4, 2012] and the PureView related content of the MWC 2012 day 1 news [Feb 27, 2012]: Samsung and Nokia [this same ‘Experiencing the Cloud’ blog, Feb 28 – March 8, 2012]
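To make the “floating lens” idea concrete, here is a minimal sketch (my own illustration, not Nokia’s firmware; function names and the focal length are assumptions) of a roughly 500 Hz loop that integrates the gyroscope’s angular rate into an accumulated shake angle and shifts the optical assembly against it:

```python
import math

def ois_shifts(gyro_rates_dps, dt=0.002, focal_length_mm=26.0):
    """For each ~2 ms control step (about 500 updates per second, the rate
    the text cites), integrate the gyro's angular rate (degrees/second)
    into an accumulated shake angle and return the opposing shift of the
    optical assembly in millimetres."""
    angle_deg = 0.0
    shifts_mm = []
    for rate_dps in gyro_rates_dps:
        angle_deg += rate_dps * dt  # integrate the unintended rotation
        # Move the whole assembly against the apparent image motion;
        # the displacement at the sensor is roughly f * tan(theta).
        shifts_mm.append(-focal_length_mm * math.tan(math.radians(angle_deg)))
    return shifts_mm
```

For example, a steady 1°/s drift sustained over one second would require the assembly to travel about 0.45 mm in the opposite direction by the final step; the higher the update rate, the smaller and smoother each individual correction.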
Note as well that such “professional level” camera functionality is available to everyday users and does not require any specific skills, as the Nokia presenter emphasized at the press conference with a photo he had taken himself a couple of days earlier, in cloudy Helsinki weather, while visiting an American football match with his two sons, using no special camera, just his Lumia 920:
More demos of Nokia Lumia 920 Camera image stabilization and low light [LAPTOP Magazine, Sept 5, 2012]:
- Exclusive new apps:
– Nokia City Lens: see your surroundings differently—Augmented Reality using the phone’s camera viewfinder
Looking for a restaurant? Fancy a quick drink? With Nokia City Lens, just holding your phone up reveals the area’s shops, restaurants and businesses. You can then read reviews. Book a table. And find your quickest route.
More information:
Nokia City Lens comes out of beta [Conversations by Nokia official blog, Sept 10, 2012]

Nokia City Lens http://nokia.ly/QeAOiK instantly connects you to all of the places you’re looking for—and even more importantly—gets you there exactly when and how you want to. Now available on Windows Phone Marketplace. Just landed in town and looking for a good restaurant? Interested in checking out the local museum or theater? Time to hit the nearest transit station to catch a ride uptown? No longer is finding your chosen destination a hassle—whether you’re in a new city or your hometown. Now you can simply launch Nokia City Lens on your phone to easily find all the places you want to go. Nokia City Lens instantly reveals what you’re looking for on your phone’s camera display, no matter if it’s down the street or just around the corner. You simply tap your chosen destination on your screen to conveniently access walking directions, make a reservation, or learn more detailed information about the locale.

See also the “Nokia City Lens” (Beta) related parts within Less focus on feature phones while extending the smartphones effort: further readjustments at Nokia [this same ‘Experiencing the Cloud’ blog, June 25 – Aug 9, 2012] as well as The Where Platform from Nokia: a company move to taking data as a raw material to build products [this same ‘Experiencing the Cloud’ blog, April 7, 2012] in its entirety. Note that with the Where platform Nokia partners will introduce other location based services in the coming weeks (Groupon was mentioned as an example by Elop), so the augmented reality service of City Lens is just one example (even from Nokia). Note as well that, according to Elop, Nokia City Lens (and the Where platform) will come to the Lumia 900 (and likely others), which is not upgradable to Windows Phone 8.
– Nokia Smart Shoot: it’s always better when everyone is smiling—Smart Shoot will take multiple photos with a single click so that you can edit effortlessly to make the perfect shot
With Smart Shoot, just one click of the camera takes a whole series of photos. Then merge them into one, so everyone in the group looks their best. Eyes open, big smiles, perfect shot.
Smart Shoot is also a Nokia-specific WP8 “lens” (filter) that aids picture editing, as in the removal of unwanted moving figures demonstrated at the press conference:
– Cinemagraph (based on another Nokia-specific WP8 “lens”): a still photograph augmented with minor, repeated movements for additional meaning and fun (see Wikipedia), like making the flag wave and the girl bend to give a kiss in the photos demonstrated as well:
about which the LAPTOP Magazine shot the following explanatory video there:
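Conceptually, the cinemagraph effect is easy to express in code: freeze one base frame and let only a small masked region keep looping through the captured frames. The sketch below is my own illustration of that idea, not Nokia’s lens; frames are flat lists of pixels for simplicity.

```python
def cinemagraph(frames, mask, t):
    """Return frame t of a cinemagraph: pixels from the frozen base
    frame everywhere, looping 'live' pixels only where mask is True."""
    base = frames[0]
    live = frames[t % len(frames)]
    return [live[i] if mask[i] else base[i] for i in range(len(base))]
```

For a two-frame capture with only the middle pixel masked, the output stays identical to the base frame except that one pixel, which cycles through the recorded motion.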
– Angry Birds Roost, exclusive to Nokia Lumia for 3 months
Launching on the Nokia Lumia 920 and Nokia Lumia 820, the new and exclusive Angry Birds Roost experience for Nokia Lumia will enable fans to dive into the wonderful world of Angry Birds, with access to hundreds of walk-through videos, news, ringtones and more, brought together in one unique and original hub experience. A first for Rovio and the Windows Phone platform, the Angry Birds Roost will offer fans an exclusive ‘Bird Cam’ feature. Users can take a picture on their Nokia Lumia, add their favorite Angry Birds characters to the picture, and then share it on Facebook, Twitter, MMS or via email. Additionally, Angry Birds Roost will have tons of wallpapers and wide Live Tiles on the Lumia start screen.
– and more: Exclusive apps coming to Nokia Lumia [Conversations by Nokia official blog, Sept 10, 2012] – Red Bull app for Nokia Lumia, enhanced Vimeo app experience, new Bloomberg app for Nokia Lumia, enhanced Bloomberg Hub experience for the Nokia Lumia 920 and Nokia Lumia 820, StyleSaint app experience for forward thinking fashion fans, new Groupon app, new YouSendIt app for the Nokia Lumia 920 and Nokia Lumia 820, enhanced MICHELIN app, and WhatsApp Messenger (a cross-platform mobile messaging app)
- 4. Super Sensitive Touch for nail & glove use built on ClearPad Series 3 mobile touch solution for premium high-end smartphones from Synaptics [Synaptics press release, Sept 5, 2012]:
Synaptics Inc. (NASDAQ: SYNA), a leading developer of human interface solutions, today announced that the new Nokia Lumia 920 and Nokia Lumia 820 will be the first smartphones in the world to use a new advanced multi-touch experience based on the Synaptics ClearPad capacitive touchscreen sensing technology. Synaptics ClearPad Series 3, the premier mobile touch solution for premium high end smartphones, has raised the bar for high performance touch on the Lumia 920 and Lumia 820 with the introduction of support for gloves and fingernails. Previously, people were unable to use their smartphone touchscreens with gloved fingers or long fingernails, requiring them to remove their gloves, or awkwardly position their fingers with long nails in order to operate their phones. For the first time ever, ClearPad Series 3 technology instantly optimizes the touch experience by automatically detecting the presence of skin, gloved fingers, or fingernails, giving users a seamless multi-touch experience regardless of input methods. [See more in the press release.] MiniPC Pro recorded this presentation on the press event:
Nokia Super Sensitive Touch Demonstration on the Lumia 920 [minipcpro YouTube channel, Sept 5, 2012]
Nokia Lumia 920 Sensitive Touch – http://www.mobilegeeks.com – We are taking a look at the Nokia Super Sensitive Touch feature of the Lumia 920 which lets you operate your smartphone while wearing gloves.
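The core trick is that a gloved finger or a fingernail produces a much weaker capacitive signal than bare skin, so the touch controller can adapt its detection threshold instead of rejecting weak signals as noise. The sketch below is my own simplified illustration of that classification step; the threshold values are made up and have nothing to do with Synaptics’ actual tuning.

```python
SKIN_LEVEL = 100   # typical raw reading for bare skin (made-up units)
GLOVE_LEVEL = 30   # much weaker reading through a glove or a fingernail
NOISE_FLOOR = 10   # anything below this is treated as no touch at all

def classify_touch(signal):
    """Classify one raw capacitive reading as skin, glove/nail, or none."""
    if signal < NOISE_FLOOR:
        return "none"
    if signal >= (SKIN_LEVEL + GLOVE_LEVEL) / 2:
        return "skin"
    return "glove/nail"
```

A real controller does this per sensing node and adapts continuously, but the principle is the same: accept weak-but-consistent signals rather than discarding everything below the bare-skin level.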
- 5. Qi wireless charging:
– built into Nokia Lumia 920
– added to Nokia Lumia 820 via a Wireless Charging Shell:
With a Wireless Charging Shell, you can charge your Nokia Lumia 820 using any of our Wireless Chargers. Just rest it on a charging plate or Fatboy Pillow to boost the battery. Magic. Pick a colour. Snap it on. And you’re good to go.
and then supported by a number of accessories:
– Nokia Wireless Charging Plate: Place your phone on a Charging Plate and watch its battery go up. There’s no need to align it carefully – the charger will work as soon as it senses your Lumia on top.
– Nokia Wireless Charging Pillow by Fatboy: When you go to bed, your phone can too. Just rest it on its Fatboy pillow to start charging. They even come in bright colours to match your Lumia.
– Nokia Wireless Charging Stand: Want to keep using your Nokia Lumia while it charges? Place it on our Wireless Charging Stand so the screen’s easy to see. You can even pre-set the stand to open apps (via NFC) on your phone – like an alarm, Skype or SkyDrive.
– JBL PowerUp Wireless Charging Speaker for Nokia: It has NFC built-in, so one tap is all it takes to connect your phone. Then while you enjoy the music with high-quality sound, just rest your phone on top of the speaker to charge its battery. Simple.
Introducing Wireless Charging from Nokia [nokia YouTube channel, Sept 5, 2012]
You can click on one of accessory “tiles” towards the end in order to watch a video related to that. There are two additional accessories not shown in the table before the video above:
– Nokia Luna Bluetooth Headset with Wireless Charging: Comes with Nokia’s Always Ready technology, which lets you connect the headset to the phone without any fuss. Just take it out of its cradle and it will take care of the rest: powering up, connecting and call answering. It also comes with wireless charging. Place your headset on a wireless charger and it charges up. Simple and easy, with no need for wires.
– JBL PlayUp Portable Speaker for Nokia: Get the music you love wherever you go – and now without the wires. Our new range of portable JBL speakers and Monster headsets don’t just have clear audio and deep bass – they’re also NFC enabled with Bluetooth connectivity. Just tap your phone against them and your phone connects. Then stream your music seamlessly.
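All of these chargers speak the Qi standard. At a high level, Qi power transfer only starts after a short handshake between pad and phone; the toy state machine below is my own simplification of the publicly documented Qi flow (ping, identification and configuration, power transfer), not a real implementation, and the event names are invented for illustration.

```python
# Simplified Qi handshake phases (my own sketch of the publicly
# documented flow; event names are illustrative, not from the spec).
TRANSITIONS = {
    ("ping", "device_detected"): "identification",
    ("identification", "id_packet_ok"): "configuration",
    ("configuration", "config_ok"): "power_transfer",
    ("power_transfer", "end_power"): "ping",
}

def next_phase(phase, event):
    """Advance the handshake; unknown events leave the phase unchanged."""
    return TRANSITIONS.get((phase, event), phase)
```

During the power-transfer phase the phone keeps sending control packets so the pad can adjust its power level, which is why a Qi pad charges a 920 and an 820 shell alike without any configuration by the user.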
Giving up the total OEM reliance strategy: the Microsoft Surface tablet
June 19, 2012 3:09 pm / 11 Comments on Giving up the total OEM reliance strategy: the Microsoft Surface tablet
Follow ups:
– Microsoft Surface: its premium quality/price vs. even iPad3 [Oct 26, 2012]
– Microsoft Surface: First media reflections after the New-York press launch [Oct 26, 2012]
Updates #2: As a result of this sudden turn of direction 9 months ago, the OEMs that had previously cooperated closely with Microsoft are now (March 2013) working with the company only in the most cautious way:
Brand vendors cautious about Microsoft when it comes to hardware design [DIGITIMES, March 25, 2013]
Notebook brand vendors have turned cautious about revealing their new products’ industrial designs for next-generation Windows as they are concerned that Microsoft may use their designs for the benefit of its new Surface products, according to sources from the upstream supply chain.
The sources noted that the brand vendors have already lost their trust in Microsoft and the software giant’s strategy of pushing Surface tablets is starting to impact itself.
Although Microsoft has sold only about 1.5 million Surface tablets so far, the company continues to expand its branded products into the retail channel and has even established an online store for ordering the devices.
To avoid design leakage, many brand vendors have hidden their important designs and will only showcase prototypes of their new mobile devices during Computex 2013 to minimize the risk.
China market: Microsoft to launch Surface Pro, say Taiwan makers [DIGITIMES, March 29, 2013]
Microsoft, following the launch of the 10.6-inch Surface RT in the China market, will launch the 10.6-inch Windows 8 Surface Pro there on April 2 at a retail price of CNY6,500 (US$1,045) for the 64GB version and CNY7,300 for the 128GB version, according to sources with Taiwan’s supply chain.
Surface RT is priced at CNY3,688-4,488 plus CNY800 for a touch cover, the source indicated.
According to earlier estimates by market observers, Surface RT and Surface Pro shipments to the global market should have reached one million units and 500,000 units respectively since their launch, but the actual combined volume for the two models so far is estimated at only about one million units, the sources said.
Given that Microsoft has not placed additional orders for Surface RT, an estimated one million units of Surface RT remain in inventory, the sources indicated.
Microsoft has talked with partners about developing second-generation Surface models, but those partners have generally been conservative, the sources noted, adding that Microsoft is inviting notebook and chip vendors to co-develop tablets based on Windows-ARM platform but those vendors have been reluctant.
Updates #1:
Microsoft Surface : Assembly in China [NIDA ISM YouTube channel, June 23, 2012]
Annual Report for the Fiscal Year Ended June 30, 2012 [Microsoft Corporation, July 19, 2012]
ITEM 1A. RISK FACTORS [p. 14]… our Surface devices will compete with products made by our OEM partners, which may affect their commitment to our platform. …
Microsoft’s radical new business plan is hidden in plain sight [Ed Bott on ZDNet, July 30, 2012]
Microsoft is reimagining its entire business model, and they’ve laid out the details for anyone to inspect. You just have to read between the boilerplate sections in the company’s most recent 10-K.
…
In the Sinofsky regime, Microsoft isn’t interested in hobbies or side projects. The company’s motto is “Go big or go home.” Earn a billion dollars. Get a billion users. Don’t think small.
I expect a massive marketing push behind Surface, and I would be shocked if we don’t see more PC hardware from Microsoft in the next 12 months.
Deal with it, OEMs.
Microsoft plans to pick up the pace. Dramatically.
Microsoft has a reputation for being too slow to respond. This year’s 10-K contains a new section that suggests that’s all about to change:
Many of the areas in which we compete evolve rapidly with changing and disruptive technologies, shifting user needs, and frequent introductions of new products and services. Our ability to remain competitive depends on our success in making innovative products that appeal to businesses and consumers. [emphasis added]
…
Microsoft unveils Windows 8 OEM licensing charges [DIGITIMES, July 11, 2012]
Microsoft has released licensing rates for OEM Windows 8, including US$60-80 for Windows 8, US$80-100 for Windows 8 Pro (with Office) and US$50-65 for Windows RT (with Office), according to Taiwan-based notebook supply chain makers.
Microsoft also confirmed the launch schedule of Windows 8 at the end of October with the RTM version of Windows 8 to be released in the first week of August for testing.
Sources from notebook players pointed out that the supply chain is placing high hopes on Windows 8 and expect the operating system to help resurrect consumer demand for traditional notebooks; however, due to remaining uncertainties, most players are still taking a conservative attitude about the launch.
Sources also noted that Windows 8 is unlikely to significantly boost PC demand before 2013. The new operating system will increase hardware costs, since some components need additional functions such as touchscreens for the operating system to perform fully; together with the operating system’s licensing costs, these added expenses are expected to push Windows 8-based products’ end prices to a rather unfriendly level.
However, as the notebook supply chain gradually shifts production to touchscreen models and costs start to drop, the sources expect demand for Windows 8-based products to see an obvious increase starting in mid-second-quarter 2013.
Steve Ballmer, Jon Roskill, Kurt DelBene, and Tami Reller: Worldwide Partner Conference 2012 Day 1 Keynote [Microsoft, July 9, 2012]
Steve Ballmer: …
… there’s over 1.3 billion Windows systems on the planet. We’ve sold over 630 million Windows 7 licenses. … In the next 12 months, most forecasts would be for 375 million — 375 million new Windows PCs to be sold. That’s bigger than any phone or any other single device ecosystem. It is a stunning number. And all of those represent new opportunities as they move to Windows 8. …
But Surface is just a design point. It will have a distinct place in what’s a broad Windows ecosystem. And the importance of the thousands of partners that we have that design and produce Windows computers will not diminish. We have a mutual goal with our OEM partners to bring a diversity of solutions, Windows PCs, phones, tablets, servers, to market. And what we seek to have is a spectrum of stunning devices, stunning Windows devices. So, every consumer, every business customer can say, “I have the perfect PC for me.”
And we’re excited about the work from our OEMs. We may sell a few million, I don’t know how many, of the 375 million, but we need partners to have that diversity of devices. We’re excited about the work our OEM partners are doing on Windows 8, and we’d really like to show more of that today to you and everybody collected here, Rich. …
Tami Reller, Chief Financial Officer and Chief Marketing Officer, Windows and Windows Live Division: …
Today, as we sit here, more than 50 percent of enterprise desktops are running Windows 7. …
Windows 8 is on track to RTM, or release to manufacturing, the first week of August. (Applause.) And Windows 8 will reach general availability at the end of October. (Applause.)
General availability means that new Windows 8 PCs will be available to buy and upgrades will also be available starting late October. …
Microsoft OEM head change related to Surface, say Taiwan makers [DIGITIMES, July 4, 2012]
Microsoft has announced the replacement of Steven Guggenheimer with Nick Parker, originally vice president of OEM Sales and Marketing, for the position of corporate vice president for OEM Division. The personnel shuffle is related to Microsoft’s plans to launch Surface tablet PCs, representing Microsoft’s long-term business model of stepping into hardware, Taiwan-based supply chain makers have guessed.
The personnel change has caused worries among Taiwan-based PC vendors and ODMs, because it signals that Microsoft’s launch of Surface is not a short-term promotion for Windows 8 but marks a new “software + hardware” business model which is expected to bring troubles for hardware partners, the sources analyzed.
As Microsoft will step into the hardware business, it is naturally no longer concerned about the long-term close relations established by Guggenheimer with hardware partners and therefore has decided to change his position, the sources claimed.
Microsoft Surface chassis suffers low yields [DIGITIMES, July 9, 2012]
Microsoft reportedly planned originally to adopt a unibody magnesium-aluminum chassis for its Surface tablet PCs but, constrained by chassis makers’ limited capacity, has instead turned to a magnesium chassis treated with MegVapor technology [presumably DIGITIMES’ rendering of Microsoft’s VaporMg] to give the device an exterior similar to a traditional metal chassis. However, the method’s rather low yield rate has greatly hampered Microsoft’s attempts to mass produce its new tablet PCs, according to sources from the upstream supply chain.
Microsoft has not confirmed the rumors.
The sources pointed out that before Microsoft launched Surface, the company inquired with several metal chassis makers about their available capacity and revealed to these makers that its orders for Surface tablet PCs would go as high as five million units before the end of 2012; however, the chassis makers had to decline because of a lack of capacity.
Although Microsoft’s current chassis design for Surface gives the device an exterior and sturdiness similar to traditional magnesium-aluminum, while offering several color choices, the drawback of the design is that the device will be heavier.
The sources also pointed out that the chassis is supplied by a China-based supplier, but since the company is a second-tier maker, its low yield rates are causing Microsoft to pay a lot of attention to the supplier’s manufacturing process hoping for improvements.
Samsung Said To Plan Windows RT Tablet For October Debut [Bloomberg, July 7, 2012]
… The decision to support Windows RT follows Samsung’s earlier announcement that it will back another version of Windows. … Samsung’s Windows RT tablet will feature Qualcomm Inc. (QCOM)’s Snapdragon processor …
Apple led the tablet market at the end of the first quarter, with 11.8 million units shipped, or a 58 percent share, according to researcher IHS iSuppli Inc. Samsung was second, with 11 percent, followed by Amazon.com Inc., which had 5.8 percent. …
HP, Dell to launch 10.1-inch Windows RT tablet PCs in 4Q12 [DIGITIMES, July 6, 2012]
Hewlett-Packard (HP) and Dell will launch 10.1-inch Windows RT tablet PCs equipped with processors developed by Texas Instruments and Qualcomm respectively in the fourth quarter of 2012, according to supply chain makers.
In addition to the two US-based brand vendors, Lenovo, Toshiba and Asustek Computer are all preparing to release Windows RT-based tablet PCs.
Meanwhile, although Acer is preparing to release Windows 8-based tablet PCs, the company currently has no plans to launch Windows RT-based models in 2012, while Sony and Samsung Electronics are turning conservative about developing Windows RT-based tablet PCs, according to the two firms’ current component supply status.
The sources pointed out that both Windows 8- and Windows RT-based tablet PCs are expected to be priced starting from US$599 and could go as high as US$1,000, while the machines’ major competition will be Apple; however, the sources hope the tablet PC competition will no longer revolve around price and instead attract demand from enterprise users and consumers that are used to the Windows operating system and its strong software compatibility.
End of updates
Surface by Microsoft [surface YouTube channel, June 19, 2012]
#1 excerpt:
Microsoft Announces Surface: New Family of PCs for Windows [Microsoft press release, June 18, 2012]
Two models of Surface will be available: one running an ARM processor featuring Windows RT, and one with a third-generation Intel Core processor featuring Windows 8 Pro. From the fast and fluid interface, to the ease of connecting you to the people, information and apps that users care about most, Surface will be a premium way to experience all that Windows has to offer. Surface for Windows RT will release with the general availability of Windows 8, and the Windows 8 Pro model will be available about 90 days later. Both will be sold in the Microsoft Store locations in the U.S. and available through select online Microsoft Stores.
…
Contributing to an Expanded Ecosystem
One of the strengths of Windows is its extensive ecosystem of software and hardware partners, delivering selection and choice that makes a customer’s Windows experience uniquely their own. This continues with Surface. Microsoft is delivering a unique contribution to an already strong and growing ecosystem of functional and stylish devices delivered by original equipment manufacturers (OEMs) to bring the experience of Windows to consumers and businesses around the globe.
…
Suggested retail pricing will be announced closer to availability and is expected to be competitive with a comparable ARM tablet or Intel Ultrabook-class PC. OEMs will have cost and feature parity on Windows 8 and Windows RT.
Microsoft’s unique contribution to an already strong and growing ecosystem is well demonstrated by the following images provided by Microsoft (the accompanying text was also provided by Microsoft):

Conceived, designed and engineered entirely by Microsoft employees, and building on the company’s 30-year history manufacturing hardware, Surface is designed to seamlessly transition between consumption and creation, without compromise.
Surface features a built-in kickstand that lets you transition Surface from active use to passive consumption.
The 3 mm Touch Cover represents a step forward in human-computer interface. Using a unique pressure-sensitive technology, the Touch Cover senses keystrokes as gestures, enabling you to touch type significantly faster than with an on-screen keyboard. It will be available in a selection of vibrant colors.
#2 excerpt:
Microsoft Announces Surface: New Family of PCs for Windows [Microsoft press release, June 18, 2012] (data highlights are mine to denote the essential differences)
| | Surface for Windows RT | Surface for Windows 8 Pro |
| --- | --- | --- |
| OS | Windows RT | Windows 8 Pro |
| Light(1) | 676 g | 903 g |
| Thin(2) | 9.3 mm | 13.5 mm |
| Clear | 10.6” ClearType HD Display | 10.6” ClearType Full HD Display |
| Energized | 31.5 W-h | 42 W-h |
| Connected | microSD, USB 2.0, Micro HD Video, 2×2 MIMO antennae | microSDXC, USB 3.0, Mini DisplayPort Video, 2×2 MIMO antennae |
| Productive | Office ‘15’ Apps, Touch Cover, Type Cover | Touch Cover, Type Cover, Pen with Palm Block |
| Practical | VaporMg Case & Stand | VaporMg Case & Stand |
| Configurable | 32 GB, 64 GB | 64 GB, 128 GB |
(1), (2). Actual size and weight of the device may vary due to configuration and manufacturing process.
The product introduction/overview part of the event keynote:
Steven Sinofsky
[President, Windows and Windows Live Division] Today when you have your tablet and you want to be entertained, you have to hold it. You’re always sitting in an awkward position, or perhaps you have to choose from a seemingly endless variety of add-on stands and cases that solve a relatively simple problem but do so by adding weight and adding thickness.
What if I just want to watch a movie or listen to music and do something else? We think that this should be an integral part of the design. We think that a stand should be integral. So we built a stand into the device.
This stand is made of the same VaporMg as the rest of the case. And it’s completely integrated into the device. The hinge design is like that of the finest luxury car and when not in use it just fades away. No extra weight, no extra thickness, no separate add on. It’s integrated just like the software and the hardware integrated into Surface.
And then once you have this kickstand you can sit back and enjoy a truly hands free experience. You could go and just put the Surface on a table, lay back and watch a movie. And that’s really what entertainment should be about with the Surface. But you know Surface is designed to be mobile. We designed Surface to be rugged and move around but with VaporMg and Corning Gorilla Glass 2.0 you do not need to worry at all, but we know many people preferred to have some sort of cover. A cover that helps to just act like an easy on/off switch at least.
So Surface has a cover. We designed the cover to be an integral element of the PC. We built a magnetic connector into the device to hold it very securely.
So let me attach the cover. Click (you heard that, it’s solid), click, close the cover; it’s integrated into the device. It’s made from a fine northwest pola? tech. It feels great in your hand, like a book; it just fits there. And when we looked at the whole Surface and the cover, we challenged ourselves to do more. This cover is just 3 mm. Combined with Surface, they are just over 12 millimeters; that’s less than 0.5 inch. And we said, why not do something with this surface? Why shouldn’t we just take this surface and make it a full multi-touch keyboard?
This Touch Cover is not just a full multi-touch keyboard; it also has a modern trackpad with left and right buttons. It even has the keys for the Windows 8 Metro Style UI. This keyboard, combined with the kickstand, forms the hallmark of hands-on creativity. On average, typing on it is twice as efficient as typing on glass. And it’s certainly more comfortable. Now of course the innovative on-screen keyboard in Windows is still there, and you can mix and match. The choice is really going to be yours. Just put them on the table and you’ve got a great stand.
Let me go over here and show you a different Surface. This Surface is connected to an external HDMI display; that’s built into the device. I’m going to go here, and now I’ve got the Touch Cover connected. Now with front- and rear-facing cameras on this device, I can record videos. I’m going to start the camera application. So now I can go here and I can tilt this around and angle it so I can see it. This camera is angled at 22 degrees, so that at this angle everybody at the table has their head perfectly framed in the picture; or when I’m sitting in my seat, I can do a Skype call and I am perfectly framed. But this device also has Windows on it and Office on it. So I go into the desktop and I see Word running.
Now what is really neat is that, using the multitasking capabilities, I could dock the camera out there, and now I can record a video or an interview and take notes, or I could record myself and read from my notes. And that integration is really cool. In fact, I could even use the USB port and plug in an external speaker and microphone, even though it has dual-array mics and dual speakers built in, and I could get super high quality recording. And so that’s a quick look at Surface.
Now there is so much more to show you today. Now imagine, if you will, that we took all of those capabilities of Surface and built them so that you could use all the applications that you’re familiar with. You could use Photoshop or you could use other applications. Those applications would be built using the latest Intel Core processor. So, in addition to the Surface that we’re releasing today for Windows RT, which works on an NVIDIA ARM processor, we’re also working on a Surface for Windows 8 Professional. I would like to introduce Mike Angiulo now, who’s going to come up on stage and show us a little bit of the next generation of Surface.
Mike Angiulo
[corporate vice president of Windows Planning, Hardware and PC Ecosystem]Thank you very much Steven. I’m proud to introduce you to another member of the Surface family. This is Surface for Windows 8 Pro. The Windows ecosystem has always been about choice. And for the millions of professional desktop users out there, people who use their PC everyday to design and to create things, this is a great choice for you. It shares the same design principles that Steven was talking about. It’s a stage for Windows. It shows the same pride in craftsmanship. It’s less than 2 pounds and less than 14 millimeters, it’s a full PC.
Now this also has a ClearType display. Steven’s PC had a ClearType HD display. This is a ClearType Full HD display, and what that means is three things: a combination of a very specific pixel geometry, rendering, and an optical bonding process that together create the effect that your eye can’t distinguish between the individual pixels at normal viewing distances, in this case 17 inches, less than arm’s length.
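That “can’t distinguish pixels at 17 inches” claim is easy to sanity-check with the common one-arcminute rule of thumb for visual acuity. The arithmetic below is my own, not Microsoft’s, and it assumes the Pro’s 10.6-inch Full HD panel means 1920×1080.

```python
import math

def ppi(width_px, height_px, diagonal_in):
    """Pixels per inch of a panel from its resolution and diagonal."""
    return math.hypot(width_px, height_px) / diagonal_in

def retina_threshold_ppi(distance_in, arcmin=1.0):
    """PPI above which one pixel subtends less than `arcmin` of
    visual angle at the given viewing distance."""
    return 1.0 / (distance_in * math.tan(math.radians(arcmin / 60.0)))

surface_pro_ppi = ppi(1920, 1080, 10.6)   # about 208 PPI
threshold = retina_threshold_ppi(17.0)    # about 202 PPI at 17 inches
# 208 > 202, so the claim is plausible under the one-arcminute rule.
```

So the panel sits just above the one-arcminute threshold at 17 inches, which is consistent with what Angiulo says on stage.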
This ClearType display also reduces Z-height [the thickness of the display stack] and conserves battery power. It has some of the other high performance features you saw too. It’s got that 2×2 antenna technology, a first in tablets: it has dual high performance antennas and receivers so that you get the best Wi-Fi performance possible no matter how you hold it. It also has a chassis that’s built out of that same durable and elegant VaporMg, which enables features like the 0.7-millimeter-thin kickstand, less than a millimeter. It’s got the same compatible accessory spine that Steven had, so if you take a Touch Cover like he had, it just clicks in, it clicks in the same. It has that same design and feeling because the entire Surface family of products was designed together. Even closed like this, this is still less than 17 millimeters, and this PC has specs that rival those of the finest Ultrabooks that have ever been announced. And it delivers the power and the flexibility that you would expect of a high-end PC. This PC is powered by Intel’s third-generation Core i5 processor, the Ivy Bridge processor.
This is their 22-nanometer process, which results in a CPU that’s faster and a GPU that has double the 3D graphics throughput, all while using less power than today’s Core i5s. With that power comes a unique design challenge: how do you design a PC that you might be holding in any number of ways, or have a cover on the front and the back, and still integrate active cooling? There is no obvious place to put a vent, so here is our solution. This is called perimeter venting. You see this groove that goes all the way around the outside of the case; there is a good shot of it up on the screen. This allows air to be uniformly distributed across the entire PC when necessary, in a way that you never block it with your hands. In fact you never even feel it, which makes the PC really comfortable to hold, and that is really helpful when doing things like flipping back your keyboard and taking notes with digital ink.
Surface for Windows 8 Pro supports digital inking. Windows apps of all kinds can support inking. So here is what I’ve done: I can go back to the desktop and show you what I launched. I launched the Windows Reader, and this is a PDF file of one of Steven’s blog posts. So you can see I can pan and zoom. What I can really do here is ink: I’m going to come in and write “this is great.”
Now what you’ll notice is that when I ink and then zoom in, the ink stays smooth. That’s because it’s being sampled at 600 DPI; that’s sub-pixel accuracy for ink. What that does is keep your handwriting very smooth, and hopefully yours is a little better than mine.
One of the neat things about this, too, is that as I’m inking, the tip of the pen almost feels like it’s writing exactly on the screen. Since this screen is optically bonded, we eliminated the layers in between the thin cover glass and the screen. So it feels like you’re inking right on the page. The distance between the stylus and where I see the ink is only 0.7 millimeters. That’s the thinnest and closest distance of any tablet PC, any inking tablet ever.
Now one of the other things that’s going on here is as I am moving my hand, you see the page is not moving underneath my hand. That’s because Windows has palm block technology. This Surface has two digitizers. It has one for touch and a separate one for digital ink.
And what happens is that when I bring the pen close to the screen, Windows sees the proximity of the pen and stops taking touch input, so my hand doesn’t mess up what I’m writing. And when I’m done with the pen, you can see the little magnetic charging connector there; it just clicks in. So that’s one of the cool things about Surface for Windows 8 Pro and inking.
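The two-digitizer palm-block behavior Angiulo describes amounts to a simple event filter: while the pen digitizer reports the pen in proximity, events from the touch digitizer are dropped. The toy model below is my own illustration, not the Windows implementation, and the event names are invented.

```python
def filter_events(events):
    """events: list of (kind, data) pairs with kind one of
    "pen_near" | "pen_away" | "pen" | "touch".
    Drops touch events while the pen is in proximity (palm block)."""
    pen_near = False
    out = []
    for kind, data in events:
        if kind == "pen_near":
            pen_near = True
        elif kind == "pen_away":
            pen_near = False
        elif kind == "touch" and pen_near:
            continue                  # palm blocked: ignore the touch
        else:
            out.append((kind, data))
    return out
```

A resting palm between `pen_near` and `pen_away` never reaches the app, while pen strokes and ordinary touches pass through unchanged.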
The apps that I’ve been showing you look really great in the native resolution of the screen, the 1080p resolution. But if you want to unlock the highest possible resolutions that Ivy Bridge supports, even higher resolutions than are possible via HDMI out, we have DisplayPort. So now with DisplayPort, I can take this PC, I can dock it, and I basically have a full professional workstation with the power of a desktop PC.
I have one here that’s plugged in and synced up to the show monitor, and this kind of PC is powerful enough to run big applications: applications like Photoshop, Autodesk SolidWorks, enterprise applications that require a TPM [Trusted Platform Module] chip. In this case, I’m going to copy some higher-res photos onto the PC and edit them in Adobe’s Lightroom. So I’m copying onto the desktop, and what you’ll see here is the five-second copy. That’s a whole gigabyte. That’s a whole gigabyte of pictures. They just copied in five seconds.
Surface has support for really fast USB 3.0 and the new USB SuperSpeed drives; a gigabyte file copy in five seconds is five times faster than USB 2.0, which makes sense with this PC because people will be using it to do big jobs, whether you’re editing big photos like this, or you’re dealing with big video files, or, in Steven’s case, a big job might be typing a super-long blog post that you may have read. Surface is up for the task.
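The throughput numbers in that demo check out as rough arithmetic. This is my own calculation; the roughly 40 MB/s effective USB 2.0 rate is a commonly cited real-world figure, not something stated in the keynote.

```python
GB_IN_MB = 1000                 # decimal units, as marketing figures use

usb3_demo_rate = GB_IN_MB / 5   # 1 GB in 5 s = 200 MB/s observed on stage
usb2_effective = 40             # MB/s, typical real-world USB 2.0 throughput
speedup = usb3_demo_rate / usb2_effective
# speedup comes out at 5.0, matching the "five times faster" claim.
```

Note the 200 MB/s demo rate is well below USB 3.0’s 5 Gbit/s signaling rate; the flash drive, not the bus, is the bottleneck in such a copy.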
Now let’s say you are in fact doing one of those big typing jobs. You’ve seen already that Steven talked a little bit about Touch Cover and the improvements it makes for typing. Let’s say you’re a really fast touch typist, or maybe you just prefer the feel of tactile keys.
Well, we’ve got another Surface choice for you. This is Surface Type Cover. It shares the same full-pitch layout as Touch Cover. But what we’ve done is we’ve taken a key switch that has 1.5 millimeters of travel and built it into the thinnest possible package. So you can touch type; I can touch type on this as fast as I can on any keyboard. It’s fully compatible with Windows; you see the shortcut keys here. It has a full modern trackpad with clicking buttons, and this completes the Surface family of products. I’d like to pull the whole Surface family together, all in one place.
Panos, would you join us with the colors of Touch Cover? Surface for Windows RT, Surface for Windows 8 Pro and a handful of the Touch Cover colors that we’re going to have at launch. That’s the complete Surface family.
Thanks, Steven. Now, that’s how we feel too, Panos especially. Panos Panay is the leader of the team that created Surface and has some great stories and some more detail about the product and how it came to be. It’s all yours.
Panos Panay
[General Manager, Microsoft Surface] Thank you.
Super cool – super cool. Thank you. Thank you for having me. I’m unbelievably humbled right now and flattered to be up here. But truthfully I’m recognizing an entire team that’s back in Redmond right now waiting to see your blog posts, to see what you have to say. We have a team full of designers, development engineers, manufacturing engineers, hardware testers, all working on these products right now as we speak.
Before I get into what I’m going to talk about today, I’m just going to show you a little [bit more about the design,] …
#3 excerpt:
Microsoft Announces Surface: New Family of PCs for Windows [Microsoft press release, June 18, 2012]: the 1st image was provided by Microsoft, the next two are from the Microsoft-provided video recording
Advances in Industrial Design
Conceived, designed and engineered entirely by Microsoft employees, and building on the company’s 30-year history manufacturing hardware, Surface represents a unique vision for the seamless expression of entertainment and creativity. Extensive investment in industrial design and real user experience includes the following highlights:
- Software takes center stage: Surface sports a full-sized USB port and a 16:9 aspect ratio – the industry standard for HD. It has edges angled at 22 degrees, a natural position for the PC at rest or in active use, letting the hardware fade into the background and the software stand out.
- VaporMg: The casing of Surface is created using a unique approach called VaporMg (pronounced Vapor-Mag), a combination of material selection and process to mold metal and deposit particles that creates a finish akin to a luxury watch. Starting with magnesium, parts can be molded as thin as .65 mm, thinner than the typical credit card, to create a product that is thin, light and rigid/strong.
- Integrated Kickstand: The unique VaporMg approach also enables a built-in kickstand that lets you transition Surface from active use to passive consumption – watching a movie or even using the HD front- or rear-facing video cameras. The kickstand is there when needed, and disappears when not in use, with no extra weight or thickness.
- Touch Cover: The 3 mm Touch Cover represents a step forward in human-computer interface. Using a unique pressure-sensitive technology, Touch Cover senses keystrokes as gestures, enabling you to touch type significantly faster than with an on-screen keyboard. It will be available in a selection of vibrant colors. Touch Cover clicks into Surface via a built-in magnetic connector, forming a natural spine like you find on a book, and works as a protective cover. You can also click in a 5 mm-thin Type Cover that adds moving keys for a more traditional typing feel.
The product design part of the event keynote:
Panos Panay
[General Manager, Microsoft Surface]… [I’m just going to] show you a little bit more about the design, show you a little bit more about the culture of how these products were built. So I think it might be interesting for you to hear that. I really want to share with you more of our team. So just watch this video really quick and I’ll be right back.
[Video Playback]
You’re going to get to meet a lot of the people you just saw on the video in just a few minutes. They’re actually backstage right now, preparing to show you more details of the product and give you a few minutes to put your hands on it, talk a little bit about the design.
Let me start by doing that, to just give you a quick preview of what you might see backstage in a few minutes. You’ve heard Steven and Mike both say this was built as the stage for Windows 8. That was part of our core vision for the product. It was very important for us that the hardware fade into the background on this product, so the Windows software could rise to the surface and give you the best experience possible. When the hardware fades away, what comes to the surface is that entertainment PC experience when you’re using the device. Note the chamfered angles on the sides of this product; they’re chamfered at 22 degrees. That does two things. One, it’s a physical manifestation of the actual stage itself: you can see how it falls away, just as we intended the hardware to do. But two, it actually sits perfectly comfortably in your hands.
And let me call out something: I say “perfectly” a lot, I say “perfect” a lot. As part of our team culture, because we had so many parts of the design that had to be detailed and simple and right, we always strove for perfection on every sub-component of this product, and that includes this chamfered angle.
What it does is sit in your hand very comfortably, in a way that when you hold it, it feels airy. Most importantly, you can use it all day in comfort. It’s really important, when you talk about the hardware fading into the background, that the hardware is not in your way as you accomplish what you want to do. It’s meant to move you forward, which we think this product does.
Now, when we talk about hardware fading into the background, another thing that’s super important is seamless lines throughout the product. When you look at this product, you’ll see lines going throughout it, every line calculated, every line built, formed perfectly on the device.
But there is one challenge. Our vision for the product, beyond being a stage for Windows, was also that we had to bring creativity and productivity to folks such as yourselves.
It’s the opportunity to transform this device, to transition it to the state of getting things done. Putting this kickstand in the product flies right in the face of seamless lines and getting it perfect. But we really spent a lot of time here. We knew that if we did not get the kickstand perfect, this device would not work. We could not take any chances. Take a look at the three hinges that you see within this device. This is a really simple example of the details of the product. These are three custom-made hinges, and mind you, there are over 200 custom parts, built from the inside out, to make this product come to life.
But these hinges, they were spec’d, just as Steven told you, to feel and sound like a high-end car door. When you close the device, the kickstand just goes away. It’s not in your way. When you need the device, it’s there, just in time. You want to get something done, just open it, and it feels great.
The spec we created was around sound. We iterated over and over again in our anechoic chamber. This is a critical point: we really wanted to get the sound right, so you get that full feeling, that emotional attachment to your product, when you open this kickstand and close it. It makes it yours; it goes away when you don’t need it and it’s there when you do.
Now, we’ve talked about VaporMg a few times. Let me bring VaporMg to life just a little bit here, so you can understand a little bit more about what we did. VaporMg is essentially what lets us take our product design and create life out of it. You can see the breakdown behind me; let me just explain a few things that we have going on.
I’m holding up my room key; it feels weird to hold up my room key. But if you look at this quickly, what you’ll see is 0.77 millimeters of thickness. This is an important point. If you can’t see it, that’s all right; it’s the same as a credit card. Pull yours out: your credit card is likely somewhere between 0.75 and 0.85 millimeters thick. It’s just an illustrative point. VaporMg is a process where we start with an ingot of magnesium and melt it down to a molten state. We then injection-mold the magnesium in tools, and we’re able to actually mold the intricate details that are needed for Surface. We mold down to 0.65 millimeters of thickness in any given part. 0.75 … [he means the credit card thickness just mentioned], we mold to 0.65. This is important to understand, because for us to get to the design we needed for this product, to get the kickstand integrated seamlessly and hold this line throughout the product, we had to be able to mold to those tolerances.
Every micron matters within Microsoft Surface. We’ve actually stacked up every part, designing from the inside out, so tightly in the product and so cleanly that if you stuck even a piece of tape in the middle of the device, it would bulge out. That tells you how strong this product is, how much strength comes with it, how light it feels in your hands; all those parts play into each other.
The best part about VaporMg is not just that we can mold at 0.65 millimeters and get intricate details like the radii that go around the product. The best part is the smoothness of the finish that comes out of the tools. After approximately 152 steps to get the VaporMg looking just as you see it now, you find that the surface finish on this product (and, as Mike says, the pride in craftsmanship) is perfect, it’s seamless. It screams watch-quality finish, and when you put it in your hands, it feels elegant. When you touch it, you’re going to want to hold it, I promise you.
Now, I’m proud of VaporMg and I’m proud of the team for the product they’ve built, but nothing stirs me more, nothing gets me more excited, than Touch Cover. I really want to walk you through Touch Cover for just a few moments. This is an important technology that came out of our group. I’m going to walk you through it in two ways: the first is the experience, and the second is the technology.
Let’s do the experience first. I explained what we tried to do with Touch Cover from the get-go. You’ll notice I’m going to connect my blue Touch Cover now. I just click it in, as you would expect. The Surface turns blue along with my Touch Cover, and you have a beautiful integration of hardware and software. My Surface knows what is connected to it. I can now bring to life the vision that is Touch Cover for this product: the vision that lets you produce content when you want it, how you want it, as fast as you’ve always done it. That’s what this product was designed for.
Let me give you one more second on this, on a little bit of the experience. The thing that was so critical for us in creating Touch Cover was that it had to be 3 millimeters thin, which is essentially at odds with any other keyboard you’ve used, and still deliver a great typing experience. It also had to be a cover you wanted to connect, something you always had with you, something that gave you confidence, just like the kickstand, to bring this product to life.
We designed flex magnets into this product, a combination of alignment and clamping magnets. You can actually never miss connecting this device; you can’t miss, we force you to not miss. We do that to give you confidence. You close it, and it feels like a book; we designed this organically, like a book. We wanted it to feel just like that. What has more covers on it than books themselves? This spine feels like a book. When you put it in your hand and you walk away with your product, you’ll hold it like a book. When you carry it with your books, it will feel like another book; it’s just light enough and it feels just perfect.
Now, that said, I think you’re going to fall in love with Touch Cover. I know I have. I mean, I’m seriously in love with it; outside of my wife, Touch Cover is number two. It’s very important to me. Now, I never want to take Touch Cover off, and I’d argue that you don’t need to and never have to.
You saw Mike move his Touch Cover to the back. Now, when he did that, I’m sure every single one of you thought, wait a minute, how do you move it to the back? Well, Touch Cover is pretty smart; it has an accelerometer built into it. The moment you fold it back, we know you folded it back, we know you’re not using it, and it’s turned off for you.
So you never have to take it off and underneath your fingertips, it feels great. So now you’ve got a comfortable device with Touch Cover that’s yours, it’s personalized to you. You saw the beautiful colors that we have coming to market and essentially what’s brought to you is an experience like none other with Touch Cover and Surface together.
Now, I showed you the experience, but I want to show you the technology, because it really is important that you understand it. Quite frankly, we have a bit of a mad scientist, who many of you know, named Stevie Bathiche. Stevie actually invented Touch Cover. Given that we have 30 years of input experience with mice and 15 years creating keyboards, we really understand how to create a great typing experience. We also knew that if we brought you Touch Cover and Touch Cover wasn’t any good, boy, what a deal-breaking moment that would be. But through Stevie and his work we’ve evolved this technology to the point where we can bring you a typing experience that’s amazing. There are actually seven layers squeezed in, pressed right into Touch Cover, to keep it 3 millimeters thin. Now, that’s super thin, but critical for you to have a great experience when folding it back.
Let me explain to you how the technology works, just ever so slightly and quickly. What you’re going to see is I’m going to put my hands down on this machine here. What you’re seeing is Surface for Windows RT, and my hands are down on Touch Cover. You’ll notice that my hands are lying flat on Touch Cover right now, yet nothing is happening. If this were a capacitive screen, like the phone you might have in your pocket or some other device, the keyboard would fire the moment you put your fingers down, and it would look something like that.
Now, that’s me actually pressing on Touch Cover, and it knows the grams of force coming off my fingertips onto Touch Cover. Why is this critical? When you type at touch-type speed, you have to find your home position and rest your hands. To do that, your keyboard can’t fire when you put your hands down. It’s comfortable, you can rest your hands, and note, as I put pressure on the J key, how the pressure goes up as I push harder, and as I release, the pressure comes off.
It’s actually measuring every gram of force coming off my fingertips, and as I start to type, it knows how many keys I’ve hit. This keyboard actually scans its keyboard matrix 10 times faster than any keyboard you use today, I guarantee. It is super fast and brings great opportunity for you to be productive and get stuff done.
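The behavior Panay demonstrates (resting fingers don’t fire keys; deliberate presses do) can be sketched as a simple force-threshold model. This is purely illustrative: the threshold value, the sample format, and the function are our own invention, not Microsoft’s actual Touch Cover firmware:

```python
# Toy model of a pressure-sensing keyboard: a key "fires" only when the
# measured force crosses a threshold, so fingers resting on the keys are
# ignored. The threshold value below is invented for illustration.
REST_THRESHOLD_GRAMS = 40

def detect_keystrokes(force_samples):
    """Given per-scan force readings (in grams) for one key, return the
    scan indices where a keystroke registers (rising edges only)."""
    strokes = []
    pressed = False
    for i, grams in enumerate(force_samples):
        if grams >= REST_THRESHOLD_GRAMS and not pressed:
            strokes.append(i)      # force crossed the threshold: fire once
            pressed = True
        elif grams < REST_THRESHOLD_GRAMS:
            pressed = False        # key released; ready to fire again
    return strokes

# A resting hand (~15 g) produces nothing; deliberate presses (~60 g) fire.
samples = [15, 14, 16, 60, 75, 62, 18, 15, 55, 20]
print(detect_keystrokes(samples))  # prints: [3, 8]
```

The faster the matrix scan rate Panay mentions, the finer the force samples such a model has to work with.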
Obviously, I have a lot of pride in this product. I hope you’ll love it. I can’t wait for you to get your hands on it back there, and I really mean that. Steven, thanks for having me up here today.
Steven Sinofsky
That was a moment for our team, for sure. I do want to talk a little bit about availability and pricing information and things like that, which I know people want to know. As I said, there will be much more information available on the web shortly. Surface for Windows RT will be available in both a 32- and a 64-gigabyte model and will be priced like comparable tablets that are based on ARM. Surface for Windows 8 Professional will come in 64-gigabyte and 128-gigabyte storage models and will have a retail price comparable with competitive Ultrabook-class PCs. Additional specifics on pricing and packaging will be announced as we get closer to retail availability.
Now, of course, retail availability for the Surface PCs: for the Windows RT PC, it will be at the time of Windows 8 general availability, and for Windows 8 Pro, about three months later. Surface will be available through Microsoft’s physical stores here in the U.S. and through select online outlets of the Microsoft Store as well.
So welcome everybody to Surface. I just want to invite Steve Ballmer back up on stage one more time and thank you, thank you very much.
Steve Ballmer
I want to thank Steven and Mike and Panos and their team. This has been an unbelievable journey. We’ve invested significantly, as you can see, in talent, in time, in capital to bring the Surface to market. I was asked in the last few days, why now? We took the time to really get Surface and Windows 8 right, to do something that was really different and really special.
We’re very proud; very, very proud of the Surface just like we’re very proud of Windows 8. Because of Windows 8, because of Windows 8 the Surface is a PC, the Surface is a tablet, and the Surface is something new that we think people will absolutely love. We really want those of you here to have a chance to see and touch the Surface and talk with some of the people who are involved in designing the product.
We have several stations set up next door where you can see the work that went into the creation of the Surface, and we hope you’ll stay and join us for that. Today has been fun for us to put on for you, very, very exciting, and I want to thank you all for being part of today’s event. Thanks.
The justification part of the event keynote (the general introduction, i.e. the first part of the event): how and why Microsoft’s decades-long hardware innovation history has now been expanded by PC/tablet-level innovation, and why, after the Windows 8 innovation, Microsoft needed a matching innovation in hardware as well.
Steve Ballmer
Well, good afternoon and welcome. I certainly want to thank everybody for joining us for today’s event. The past several years have seen great change in the industry and great innovations coming from Microsoft. We’ve helped usher in the new era of cloud computing, we’ve embraced mobility, we are redefining communications and attempting to transform entertainment. In all that we have done, Windows is the heart and soul of Microsoft, from Windows PCs to Windows Server to Windows Phone and Windows Azure. Windows has proven to be the most flexible general-purpose software ever created, spurring on an ecosystem of unrivaled success.
When Microsoft was founded, our vision was bold and broad: a computer on every desk and in every home. And while we are certainly optimists to the core, Windows has exceeded even our most optimistic predictions. It now powers well over 1 billion PCs, from desktops to laptops to ATMs to NASA workstations and more: in homes, in businesses, in schools and in governments literally around the world.
With Windows 8 we’ve re-imagined the Windows product. We re-imagine Windows from the chipset to the user experience, to power a new generation of PCs that enable new capabilities and new scenarios. We approached the Windows 8 product design in a forward-looking way. We designed Windows 8 for the world we know, in which most PCs are mobile and people want access to information and the ability to create content from anywhere anytime.
People want to do all of that without compromising the productivity that PCs are uniquely known for: from personal productivity applications to technical applications, business software and literally millions of other applications that are written for Windows and work perfectly on Windows 8. We are incredibly gratified by the enthusiastic response to Windows 8 from our partners, our OEM partners, thousands of developers and literally millions of consumers who’ve downloaded our previews.
Excitement is high, with the new x86 and ARM SoC support, the new Metro user interface and the new Store all getting very broad interest.
Today, we want to add another piece, another bit of excitement and another piece to that Windows 8 story.
At our foundation, Bill Gates and Paul Allen made a bet, a bet on software. At the same time, it was always clear that our unique view of what software could do would require us to push hardware, sometimes in ways that even the makers of the hardware themselves had yet to envision. That’s the nature of the dynamic between hardware and software, pushing and pulling each other forward. In fact, our number one revenue product the year I joined Microsoft, 1980, was actually a hardware product, something known as the SoftCard. Let’s just take a little bit of a look back at the role of hardware at Microsoft.
[Video Playback]
We believe that any intersection between human and machine can be made better when all aspects of the experience, hardware and software, are considered and working together. Let’s just take the mouse as an example.
To be successful, Windows 1.0 really needed a mouse, so we built one. Early reviews of mice were not very positive, as people struggled to understand their real value. In fact, it was so new that Canadian Customs quarantined the Microsoft mouse at the border for four weeks, thinking that it was alive.
Our most successful hardware product has been the Xbox, and with Kinect we’ve created a whole new user experience. And now developers are pushing Kinect, doing more exciting and even cooler things for both the game console and for Windows PCs. This combination of hardware, software and peripherals in the Xbox case works together to deliver an absolutely amazing experience.
We see that sort of combination working also today in our PC ecosystem. We believe in the strength of that ecosystem, of software and hardware companies that work together to deliver selection and choice that makes your Windows experience uniquely your own. Those partnerships are essential to the re-imagination of Windows. We’ve worked with the component companies, Intel, AMD, NVIDIA, Qualcomm and Texas Instruments.
Of course, the ultimate landing point of this PC experience is through our partnerships with OEMs: HP, Dell, Asus, Acer, Samsung, Sony, Lenovo, Toshiba and many, many more. They will deliver more PCs to market in 2013 than in any previous year; IDC estimates that number at over 375 million Windows PCs. That will ensure that software developers and content creators have a larger number of new systems to target with their Windows 8 applications than any other non-phone platform.
However, with Windows 8 we did not want to leave any scene uncovered. Much like Windows 1.0 needed the mouse to complete the experience, we wanted to give Windows 8 its own companion hardware innovation. What is this innovation? It’s something new, it’s something different, it’s a whole new family of computing devices from Microsoft.
[Video Playback]
This is the new Microsoft Surface. It embodies the notion of hardware and software really pushing each other. People do want to create and consume, they want to work and they want to play, they want to be on their couch, they want to be at their desk and they want to be on the go. Surface fulfills that dream. It is a tool to surface your passion, to surface your ideas, to surface your creativity and to surface your enjoyment. I really want you to take the time today to get to know Microsoft Surface. So let’s now learn more from Steven Sinofsky and the Microsoft Surface team.
Steven Sinofsky
Just as we’ve re-imagined Windows we also have a vision for re-imagining the tablet.
We see a tablet that is designed the way Windows has been designed. We see a tablet that represents a unique vision, with a seamless expression of entertainment and creativity. A tablet that works and plays the way you want to, a tablet that’s a great PC, a PC that’s a great tablet, a new type of computing: Surface.
Surface is a stage for Windows. Surface is designed for the software experience to take center stage. Surface is super thin, at 9.3 millimeters. It’s just thin enough for this full-size USB port, for peripherals or just charging your phone while you’re at the hotel. The edges are beveled away at 22 degrees, so the PC itself fades into the background. It feels natural in your hands.
Surface is the first PC with a full magnesium case. Through a unique process, the liquid metal is formed into an ultra-rigid, yet ultra-light frame. It is incredibly strong, and it’s airy at under 1.5 pounds, just 676 grams, and it’s finely balanced. We didn’t stop there; the case is one of a kind. It’s made with a physical vapor deposition process, which results in permanent scratch and wear resistance for Surface. This VaporMg case is a first of its kind, and it accentuates the unique feel of Surface.
Surface is, of course, great for entertainment. It has access to all of the Windows apps for music, for video, for Xbox and gaming. You can see here I’m running Internet Explorer. I can browse smoothly, see great pages using ClearType and have a great experience with browsing. Its 10.6-inch, optically bonded, widescreen display is custom designed for Surface. And of course, people play games. I can go and play any of the interesting games that are in the Windows Store, and those games can use all the sensors that are within Windows as well. Surface works for all of those games.
Movies and entertainment look great as well. Excuse me just a second. Surface looks great for entertainment as well. In fact, I’m going to show here, for the first time, a very exciting new application. This is the Netflix application designed specifically for Windows 8. With the widescreen you get 30% more viewing area and no banding or letterboxing like you traditionally see.
I’m happy to show this new Netflix application … [, give you an early look how it’s designed specifically for Windows 8 with semantic zoom. And Netflix will have this ready at the Windows 8 launch. I can go here and start a movie and see it stream straight to my Surface PC. Just like you would expect.
Now to stream so well Surface needs great Wi-Fi. Surface is the first tablet to incorporate dual 2×2 MIMO antennas. That means it provides the very best Wi-Fi reception of any tablet today. Surface is incredibly great for Windows and for entertainment PC. And we are just getting started.] …
More information:
– Surface Website
– On-Demand Keynote: Microsoft Surface Event
– Broll: Product imagery of Microsoft Surface
– Broll: Images from Microsoft Surface Event
– Product & Event Images
– See it in Action
And remember, this leading-edge Microsoft Surface family, leading edge even against Apple’s market-leading offerings, is definitely just the tip of the iceberg. Consider this Channel 4 report, which shows the kind of future that could come from Microsoft, as seen back at the beginning of last year:
– Touching, waving at and talking to the future with Microsoft [Channel 4 News YouTube channel, Feb 8, 2011]
(Note, towards the end of the video, that Panos Panay appears as being simply from Microsoft Surface.) Additional information:
– Benjamin Cohen, the reporter in the video, had this detailed blog post about that visit
– Steve Clayton, Microsoft’s not-so-long-ago appointed ‘Next at Microsoft’ storyteller, also had this detailed blog post about that visit
Note that Microsoft shares started to rise already last Friday (obviously based on expectations once the invitations to a ‘mystery event’ were sent out). Nevertheless, from $29.34 to this Tuesday’s closing price of $30.71, that was only a 1% growth. Interestingly, during the same period Apple’s share price had a 1% growth as well, although Apple made its series of announcements a week earlier, on Monday last week (June 11, 2012):
– Apple Introduces All New MacBook Pro with Retina Display
– Apple Updates MacBook Air and Current Generation MacBook Pro with Latest Processors and New Graphics
– Mountain Lion Available in July From Mac App Store
– Apple Previews iOS 6 With All New Maps, Siri Features, Facebook Integration, Shared Photo Streams & New Passbook App
which resulted in only 0.5% growth.
So the stock market evaluated Microsoft Surface against the above Apple introductions and found them on an equal level from a business-growth perspective, even though Apple’s closing price yesterday was $587.31, i.e. 19x higher. In terms of market capitalisation Microsoft remains at the same 47% of Apple’s, so from a business-competition point of view the announcement of Microsoft Surface is not changing the positions as far as the opinion of the overall business world is concerned. INTERESTING!
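For readers who want to reproduce the ratios above: share price alone says nothing about company size; market capitalisation is price times shares outstanding. A quick check (the share counts below are our rounded mid-2012 assumptions, not figures from this post):

```python
# Share price alone says nothing about company size; market capitalisation
# is price times shares outstanding. The share counts below are rounded
# mid-2012 assumptions of ours, not figures from the post.
msft_price, aapl_price = 30.71, 587.31
msft_shares = 8.4e9    # assumed ~8.4 billion MSFT shares outstanding
aapl_shares = 0.94e9   # assumed ~0.94 billion AAPL shares (pre-2014 split)

price_ratio = aapl_price / msft_price      # ~19x, as stated above
msft_cap = msft_price * msft_shares        # ~$258 billion
aapl_cap = aapl_price * aapl_shares        # ~$552 billion
cap_ratio = msft_cap / aapl_cap            # ~0.47, i.e. ~47%
print(round(price_ratio, 1), round(cap_ratio, 2))   # prints: 19.1 0.47
```

With those assumed share counts, both the 19x price ratio and the ~47% market-cap ratio quoted above fall out of the arithmetic.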
Meanwhile, the earlier Microsoft Surface product has been renamed Microsoft PixelSense in order to avoid confusion:
About Microsoft PixelSense [Microsoft page for the press on the PixelSense microsite, June 18, 2012]
The Samsung SUR40 with Microsoft PixelSense is an innovative product that responds to touch, natural hand gestures and real world objects placed on the display, providing effortless interaction with information and digital content in a simple and intuitive way. With a large, 360-degree, 4-inch thin horizontal user interface, the Samsung SUR40 offers a unique gathering place where multiple users can collaboratively and simultaneously interact with content and each other. In addition, the SUR40 provides businesses with unique value in delivering information and services in a more friendly way allowing better engagement with their customers. The Samsung SUR40 with Microsoft PixelSense is targeted for companies across a variety of industries including retail, hospitality, health care, and public sector.
The Samsung SUR40 with Microsoft PixelSense is a major advancement in computing that moves beyond the traditional user interface to a more natural way of interacting with information. The four key attributes that make this experience unique are:
Multiuser experience. The horizontal form factor makes it easy for several people to gather together, providing a collaborative, face-to-face computing experience.
Massive multitouch contact. The Samsung SUR40 with Microsoft PixelSense recognizes many points of contact simultaneously, not just from one finger, as with a typical touch screen, but up to dozens and dozens of items at once.
Direct interaction. Users can actually “grab” digital information with their hands and interact with content through touch and gesture, without the use of a mouse or keyboard.
Object recognition. Users can place physical objects on the display to trigger different types of digital responses, including the transfer of digital content.
At CES 2011, Microsoft unveiled the designed for touch experience featuring Microsoft PixelSense technology, which gives LCD panels the power to see without the use of cameras.
This experience comes to life in the Samsung SUR40 with Microsoft PixelSense, which incorporates significant technological advancements designed to enhance the user experience.
The Samsung SUR40 with Microsoft PixelSense features key hardware and software technology advancements informed by feedback from users around the world.
Microsoft PixelSense™. Microsoft PixelSense allows a display to recognize fingers, hands, and objects placed on the screen, enabling vision-based interaction without the use of cameras. The individual pixels in the display see what’s touching the screen and that information is immediately processed and interpreted.
PixelSense technology delivers an innovative user experience built on the principles of direct interaction using touch and objects. The Microsoft Surface 2.0 SDK allows application developers to take advantage of capabilities of PixelSense technology.
Thin form factor with multiple configuration options. The Samsung SUR40 is four inches thin, which makes it easy to use as a table, hang on the wall with the VESA mount, or embed in walls or custom enclosures. There are standard leg supports available, or customers can design and attach their own leg supports.
High definition large format display. The 40-inch, stunning high-definition screen (1920 x 1080 resolution) enables enhanced multiuser and multitouch experiences.
News about Microsoft PixelSense activities is available on the Microsoft PixelSense blog and via Microsoft PixelSense on Twitter.
Also these two videos appeared on a new Microsoft® PixelSense™ YouTube channel [June 18, 2012]:
The Power of Microsoft® PixelSense™
Samsung SUR40 with Microsoft® PixelSense™
Now some first reactions from the event attendees:
Microsoft Surface: a closer look [TheVerge YouTube channel, June 18, 2012]
See also this article: Microsoft Surface with Windows RT hands-on pictures and video [Joshua Topolsky from The Verge, June 18, 2012]
Microsoft Surface tablet demo June 18, 2012 Event in SF [SlashGear YouTube channel, June 18, 2012]
See also these articles, same date, on SlashGear (the first ones are kind of liveblogging):
– Microsoft Surface Tablet Hands-on by Vincent Nguyen
– Microsoft Surface re-introduced as a handheld tablet by Chris Burns
– Microsoft Surface cover doubles as built-in keyboard by Cory Gunther
– Microsoft Surface for Windows 8 Pro revealed by Chris Burns:
This tablet introduced its own “Perimeter Venting” so as not to get too hot [in fact to solve the problem of cooling with a tablet which can be used in both portrait and landscape modes], works with Pen input (with digital ink, explained in a different post), and has a display that’s just 0.7mm from the glass that covers it. The Microsoft Surface for Windows 8 Pro has two digitizers, one for ink, one for touch, and has a bit of magnetization for its pen so no holes or clips are needed.
– Microsoft Surface to feature digital ink stylus support by Cory Gunther
At the event, the presenters put it best: “This surface has two digitizers. One for touch, one for digital ink.” All stylus or pen input is converted into digital ink, and the new Surface tablet is extremely responsive and accurate.
The distance between the screen (digitizer) and the stylus is only 0.7 mm, which allows it to be highly accurate, making you feel like the ballpoint of a pen is actually writing on the “surface”. Surface will see the proximity of a stylus and stop recognizing hand inputs.
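The pen-over-touch priority described above can be sketched as a simple input arbiter: while the pen digitizer reports the stylus in proximity, touch contacts (such as a resting palm) are suppressed. This is an illustrative model only, with invented event names, not Microsoft’s actual firmware logic:

```python
# Illustrative sketch of dual-digitizer input arbitration: while the pen
# digitizer reports the stylus in proximity, touch contacts (e.g. a resting
# palm) are ignored so that only digital ink is produced.
class InputArbiter:
    def __init__(self):
        self.pen_in_proximity = False
        self.events = []  # accepted input events, in arrival order

    def on_pen_proximity(self, in_range: bool):
        self.pen_in_proximity = in_range

    def on_pen_contact(self, x: float, y: float):
        # Pen input is always accepted and converted to digital ink.
        self.events.append(("ink", x, y))

    def on_touch_contact(self, x: float, y: float):
        # Touch is suppressed whenever the stylus hovers over the screen.
        if not self.pen_in_proximity:
            self.events.append(("touch", x, y))

arbiter = InputArbiter()
arbiter.on_touch_contact(10, 10)   # accepted: no pen nearby
arbiter.on_pen_proximity(True)     # stylus approaches the glass
arbiter.on_touch_contact(50, 60)   # suppressed: likely a resting palm
arbiter.on_pen_contact(52, 61)     # accepted as ink
```

The point of the sketch is the priority rule, not the data model: a single proximity flag is enough to make palm rejection fall out of ordinary event handling.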
– Microsoft Surface Windows RT confirmed with NVIDIA’s Tegra processor by Cory Gunther
NVIDIA has just issued a rather short note confirming that their Tegra processor will be under the hood and powering the smooth and fluid Windows 8 RT model. They didn’t specify which Tegra processor as expected, but we are speculating it will be the quad-core Tegra 3 KAI platform, or the Tegra 3+ that was detailed as coming soon [… an upgraded Tegra 3 called T3+, with code-names Wayne and Grey splitting off in the third quarter of 2012 with LTE. Grey specifically will have access to LTE data speeds, with Tegra and Icera hardware being part of this sector for NVIDIA] in a lot more than just Android devices.
– Microsoft Surface Touch Cover vs Type Cover hands-on by Chris Burns
These keyboards bring on a fair stab at what 3rd party manufacturers have been attempting for the iPad and a host of Android tablets now for several years. The keyboards on both units aren’t going to bring you a perfect replacement for a notebook computer if you’re attempting to match the laptop-bit of the equation, but if you’re the sort of person to work on a desk, you might be in business.
– Microsoft Surface could debut MagSafe-data hybrid hook-up by Chris Davies
The four-pin port is on the right lower edge of the new tablets, and seemingly matches up with the MagSafe-like connector detailed in a patent application from the company. If so, that could mean a single hook-up for recharging the Surface and synchronizing it with other devices.
Microsoft’s patent application followed in the footsteps of Apple’s magnetic charger system – which allows the cord to break away easily if someone trips over it, rather than yanking your laptop off the desk – but added in a data connection. With just one port, the Surface could be hooked up to both a charger and other external hardware, with an optical data link used for maximum speed potential.
…
The potential for such a connection is vast. Microsoft has been coy about external device support for Surface, only mentioning the USB and video-output ports, but with this proprietary port it could be used with a docking station to add in an optical drive, wired network connection and more.
We’ve been waiting for just such a strategy from Apple for some time, and indeed the Cupertino company has an optical data MagSafe patent application of its own. More on Microsoft Surface in our hands-on here.
Microsoft Surface Tablet: Hands-on [laptopmag YouTube channel, June 18, 2012]
See also these articles:
– Microsoft Surface Tablet Hands-on: The Future of Windows is Here [Video] [Michael A. Prospero from LAPTOP Magazine, June 18, 2012]
– iPad vs Microsoft Surface: Tablet Specs Compared [Kenneth Butler from LAPTOP Magazine, June 18, 2012] (data highlights are mine to denote the essential differences)
Microsoft to Unveil a New Tablet – Good or Bad Idea? [The Wall Street Journal YouTube channel, June 18, 2012]
See also this article: Microsoft Unveils Surface Tablet to Rival iPad [Shira Ovide from The Wall Street Journal, June 18, 2012]
… Al Hilwa, an analyst at IDC, said the combination of PC and tablet features makes Surface a “true converged” device. “A Swiss Army knife of a tablet?” …
The computer makers’ business is dependent on Microsoft, so they may not publicly express annoyance at Microsoft’s trading on the hardware makers’ turf. But at least some hardware executives are fuming privately at Microsoft’s decision.
Microsoft’s move to make its own tablet “comes with consequences, which is complicating choices for consumers and complicating relations with third-party manufacturers,” said Sarah Rotman Epps, an analyst with Forrester Research Inc.
Microsoft “Surface” Tablet Announced, Powered by Windows 8 [Eric Savitz for ForbesVideo YouTube channel, June 18, 2012]
See also these articles:
– Microsoft: Live From Hollywood! Introducing Microsoft Surface Tablet (Updated) [Eric Savitz from Forbes, June 18, 2012]: a live blog of the event
– Microsoft Announces Surface, Its New Windows 8 Tablet [Kelly Clay from Forbes, June 18, 2012]
…
As no one does keyboards better than Microsoft, yet another keyboard is also available for Surface that features a full trackpad with clicking buttons. Though Surface is slightly heavier than the iPad and has 25% less battery life (31.5 Watt hours compared to the iPad’s 42.5 Watt hours), Surface is truly one of the most powerful and lightweight mobile PCs we have seen.
It’s clear that Surface is designed for current Windows users, and according to NetMarketShare, Windows XP, Vista, and 7 combine for 93% of all desktops. For these users – especially those in the corporate environment – there is a hesitation to switch to another platform, even just for mobile use. As a result, Surface could be a game-changer in the tablet industry. Not only does it feature key capabilities that Apple has yet to integrate (such as a keyboard), but Surface will undoubtedly make it easier for current Windows users to transition from home to office and in between. While a price has yet to be set, it’s expected to be extremely competitive compared to other tablets, ensuring that Surface is a device that many current Windows users will want to own.
Other notable first reports:
1. WIRED magazine [June 18, 2012]:
– Liveblog: Meet ‘Surface,’ Microsoft’s New Windows 8 Tablet
– Microsoft May Be Late to Tablet Fight, But Has the Cash to Keep Sparring
– Microsoft Dives Head-First Into Mobile Hardware With Two 10.6-Inch Tablets
…
Surface is much, much more than a new tablet platform. It’s also Microsoft’s first fully branded computing device — an ambitious new development direction after years of making only simple computer peripherals. And Surface is also a challenge to every hardware partner in Microsoft’s OEM stable.
“It’s a bold move on the part of Microsoft,” says Gartner analyst Michael Gartenberg. “This is a real change in strategy for them, and it’s certainly a vote of no confidence for their partners. This shows how high the stakes are. There is competitive pressure from Apple that is clearly a threat to their business. Steve Ballmer seemed to be channeling Steve Jobs on stage, saying hardware and software have to be designed together.”
…
As for pricing, Microsoft isn’t saying, but Gartenberg weighs in:
“I’m guessing somewhere between $600 and $1000 — Microsoft was very vague. This is the problem you encounter when you launch something so far ahead of delivery,” he said. “For a launch like this, it’s all about the details. Everything about this event, the mysterious invitations, the presentation — Microsoft is trying to be Apple. But the only company that has successfully been like Apple is Apple.”
2. engadget [June 18, 2012]:
– Live from Microsoft’s mystery press conference in Los Angeles! by Dana Wollman
– Hands-on with Microsoft Surface for Windows RT, Touch Cover and Type Cover (update: video!) by Dana Wollman:
… (Microsoft has only said that the ARM chip is made by NVIDIA. No one ever said it’s a Tegra 3 SoC, but that is naturally our best bet.) …
Based on remarks by Steve Ballmer and others during the presentation, it sounds like a lot of thought went into the two keyboards, so we wouldn’t be surprised if a large focus group of touch typists were able to prove Redmond’s engineers right. But having played with both, we don’t imagine this being like settling in with a new laptop or Transformer-style dock. You might have to re-learn how to type (or at least teach your brain to fuhgeddaboutit and trust your fingers to land where they’re supposed to.) …
Even after some brief handling, we feel impressed, almost sobered by what Microsoft’s managed to produce after vowing to take the Windows 8 hardware-software package into its own hands. Surface for Windows RT is well-made, polished, durable and carefully engineered. And yes, that’s sobering news: Microsoft’s own OEM partners, everyone from ASUS to Acer to HP, should feel a tinge of defensiveness. If Redmond’s mission until now has been to showcase all the possible form factors for Windows 8, it may have just taken a step in the opposite direction by upstaging everybody else.
– Microsoft reveals its own Windows 8 tablet: meet the new Surface for Windows RT by Jon Fingas:
Not unlike Apple’s last two generations, there’s a magnetically attached cover, but it’s more than just a protector: here, it includes a full multi-touch keyboard and trackpad.
– Microsoft one ups other tablet ‘smart’ covers with Surface’s Touch Cover and Type Cover by Terrence O’Brien:
… right now we’re pretty enamored with Microsoft’s Touch Cover for the newly announced Surface. See, it works almost exactly like that other “smart” tablet shield, but this one actually earns its smart moniker. When you peel the plastic shroud back it turns into a fully functional keyboard and touchpad. Obviously, being a thin plastic sheet, the cover is relying on touch for key presses, not the actual depression of mechanical switches. …
Perhaps one of the more interesting features, though, is their ability to force Win 8 to color coordinate with your chosen shade of folio. Click the blue Touch Cover onto the Surface and the background switches to a soothing shade of azure. There’s even an accelerometer inside those 3mm-thin soft covers — which is an impressive feat of engineering. The Touch Covers can easily distinguish between you simply resting your hands on the keyboard and actually typing, which should help minimize accidental key presses.
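The resting-versus-typing distinction engadget describes can be modeled as a simple per-key pressure threshold. Microsoft has not published the actual sensor logic, so the numbers and function below are invented purely for illustration:

```python
# Hypothetical sketch: a pressure-sensing keyboard registers a keypress only
# when the force on a key rises clearly above the level of fingers merely
# resting on the cover (roughly 0.2 in these arbitrary units).
PRESS_THRESHOLD = 0.6  # deliberate key strike (arbitrary units)

def detect_keypresses(samples):
    """samples: list of (key, pressure); returns the keys judged as typed."""
    typed = []
    for key, pressure in samples:
        if pressure >= PRESS_THRESHOLD:
            typed.append(key)
        # pressures below the threshold are treated as resting hands
    return typed

samples = [("a", 0.21), ("s", 0.18), ("d", 0.75), ("f", 0.19), ("j", 0.82)]
print(detect_keypresses(samples))  # only the hard strikes on d and j register
```

A real implementation would adapt the threshold per user and per key, but the core idea is the same: resting produces a low, steady signal, while typing produces sharp peaks.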
– Microsoft announces Surface for Windows 8 Pro: Intel inside, optional pen input by Donald Melanson
– Microsoft Surface tablets: the differences between Windows RT and Windows 8 Pro models by Darren Murph
3. CNET [June 18, 2012]:
– Microsoft breaks tradition with Microsoft Surface tablets
– Surface touches the right keys, but not a complete picture
– Who is the Microsoft Surface for, exactly?
– Five key takeaways from Microsoft’s Surface event:
… 1. Don’t confuse this with the table thing [i.e. the old Surface now called Microsoft PixelSense]. … 2. This isn’t just aimed at the iPad and Android tablets [as it can work like a PC, complete with a full version of Windows]. … 3. This thing is high-tech. … 5. This is just the start [as Microsoft is positioning Surface as the beginning of a family]. …
– Why Microsoft built its own tablet — think Apple and Xbox
…
The tablet and ultraportable form factors are especially fertile ground in terms of growth and innovation. A recent Online Publishers Association study found that 31 percent of the U.S. Internet population (74.1 million users) own tablets, up from 12 percent in 2011. By 2013, the study projected that 47 percent of the U.S. Internet population (117.4 million users) would own tablets.
At this juncture, Google’s Android platform (including Amazon’s Kindle) and Apple’s iOS are splitting the market. Apple’s continuation of its firm grip on hardware and software integration is working exceedingly well, as evidenced by the company’s incredible financial success.
…
Google gives its Android platform to partners for free, which leads to some fragmentation and a fraction of the profits Apple is generating. Like Microsoft, Google plans to introduce its own branded tablet this month. Microsoft expects that it can generate some buzz and give Windows users a legitimate alternative to Apple’s iOS and Google’s Android, as well as incent its developer community to build native apps for its platform.
Note: In the above argumentation CNET relied on the “A Portrait of Today’s Tablet User – Wave II” U.S. findings from the Online Publishers Association (OPA), released the same day, particularly the one represented on the following slide:
for which the accompanying OPA press release stated the following:
… Tablet adoption has significantly increased in the past year; 2012 saw 31% of the U.S. internet population owning tablets (74.1M users), up from 12% (28.3M users) in 2011. Furthermore, by the year 2013 this figure is expected to increase with a projected 47% of the U.S. internet population (117.4M users) owning tablets.
Of these tablet users, the Android platform has drawn level with iOS, in large part because of the strong sales of the Kindle. 52% of tablet owners have an iOS operating system, while 51% use an Android powered tablet (percentages do not add up to 100% because tablet owners own/use more than one type of tablet). This is a drastic change from 2011, which saw 72% of tablet owners use some form of the iPad while only 32% used an Android tablet. …
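As a quick sanity check on the OPA figures quoted above, each ownership percentage and user count implies roughly the same underlying U.S. internet population, with modest projected growth by 2013:

```python
# Back out the implied U.S. internet population from each OPA data point:
# owners = population * share  =>  population = owners / share
data = [
    (2011, 28.3e6, 0.12),   # 12% owned tablets in 2011
    (2012, 74.1e6, 0.31),   # 31% in 2012
    (2013, 117.4e6, 0.47),  # projected 47% in 2013
]
for year, owners, share in data:
    population_m = owners / share / 1e6
    print(f"{year}: implied internet population of about {population_m:.0f}M")
```

The three data points imply populations of roughly 236M, 239M and 250M respectively, so the study’s numbers are internally consistent.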
4. AllThingsD [June 18, 2012]:
– Microsoft’s Surface Tablet Takes On Apple’s iPad liveblog by Ina Fried
– Microsoft Launches New Surface Tablets With Windows 8 by Bonnie Cha
– Microsoft CEO Steve Ballmer on Where Microsoft’s New Surface Tablet Fits in PC Ecosystem by Ina Fried
In a brief chat after the event, Microsoft CEO Steve Ballmer said that PC makers have known for an unspecified period of time that Microsoft would be doing its own hardware.
Ballmer noted that there will be a lot of PCs sold that will be made by companies other than Microsoft.
“If you look at the bulk of the 375 million machines that get sold (next year), they probably aren’t going to be Surfaces,” Ballmer said. “On the other hand, we could have a sizeable business.”
“It’s an important companion to the whole Windows 8 story,” Ballmer said. “It’s an important piece. It’s not the only piece.”
While Microsoft kept the details of Surface tightly limited to a small group of Microsoft employees working on the project, Ballmer said PC makers weren’t totally taken by surprise.
“Our PC partners knew in advance we were announcing something today in this space,” Ballmer said.
So how did they feel about it? “No comment.”
Ballmer said Microsoft’s goal is that Surface “gives people a full range of things to think about, sort of primes the pump for more innovation around Windows 8, (and) brings new technology to the Windows PC platform.”
Just how closely to the vest has Microsoft been keeping Surface? Ballmer said he has not personally been using a prototype on a regular basis.
“We wanted to keep things under wraps,” Ballmer said. “I’m out in public a lot.”
5. Boy Genius Report (BGR) [June 18, 2012]:
– Live from Microsoft’s tablet event! by Brad Reed
– Microsoft unveils Surface tablet by Zach Epstein
– Microsoft Surface tablet hands-on by Brad Reed
I have to admit that the Touch Cover felt somewhat alien to me at first when I was playing around with it, but that could be due to the fact that I didn’t have a lot of time to play around with it — Microsoft was really herding reporters quickly through the line. The Type Cover did feel quite natural as a keyboard should, however, so at the very least, there should be one strong option for people who prefer traditional keyboards.
The tablet’s 10.6-inch display screen looked gorgeous, although Microsoft was being weirdly evasive when asked what the exact screen resolution was. The tablet’s “VaporMg” casing is extremely solid, and the tablet feels very strong in your hands. Despite being 9.3 millimeters thick, the Windows RT version of the Surface is in no danger of bending under pressure.
In terms of software, Windows RT brings some cool new capabilities to the tablet form factor, including the ability to run two apps on the same screen simultaneously. One Microsoft rep, for instance, demonstrated how to have Outlook email on one half of the screen while having sports scores on the other half. And of course, the home screen on both versions of the Surface tablet features Windows 8’s Metro UI that is significantly more intuitive, colorful and user-friendly than past editions of Windows.
Thin/Zero Client and Virtual Desktop Futures
May 30, 2012 5:36 pm / 1 Comment on Thin/Zero Client and Virtual Desktop Futures
26 years of Wyse and Citrix collaboration have resulted in an advanced infrastructure solution that brings the Windows desktop into a virtualised cloud environment, accessible from any cloud computing client device, including thin client and zero client devices, or even ones offering only HTML5 browser functionality. The infrastructure is also gaining a universal device management capability. The most important hallmark of this infrastructure solution is complete security, meaning immunity from viruses and the like. In addition to Windows desktop applications, the next wave of web applications as well as SaaS applications (such as those provided by Salesforce.com) are made easily accessible and usable from any of those devices and access points. The hallmark here is the possibility of continuing work at the point where it was left off on another device and access point: true flexibility from the user’s point of view.
For more introductory information please watch these two videos:
Jeff McNaught Interview One [CitrixTV YouTube channel, May 24, 2012]
Zenith2 – The Product that Changes Everything [CitrixTV YouTube channel, May 24, 2012]
The detailed elaboration of the “Thin/Zero Client and Virtual Desktop Futures” topic will go through the following sections of the post:
- Wyse entry-level solution for education
- A glimpse into the Wyse portfolio and their large public / enterprise markets
- Essential technology and market information
A highly important preview from it:
XenDesktop and Metro Receiver [CitrixTV YouTube channel, May 9, 2012]
- Note: following that video, it is important to watch the SYN229: What’s new with Citrix Receiver for desktop users video next to it, in order to understand the Virtual Desktop Future assured by the upcoming Citrix Receiver universal client, as represented best by the following image:

delivered in the [18:53 – 23:05] timeframe of the video.
Finally to understand the whole picture from/through a very practical demonstration of the whole range of possibilities watch these videos:
– The Future is Now (17 minutes – part 1 of 2)
– The Future is Now (28 minutes – part 2 of 2)
– Citrix Receiver on the Wyse Xenith, connecting to a XenDesktop virtual desktop
- Wyse Product/Technology Details
- Dell Wyse (i.e. the Dell acquisition of Wyse)
– for introduction to that see: Dell Completes Acquisition of Cloud Client Computing Leader Wyse Technology [Dell press release, May 25, 2012]
Before going into those detailed sections here is a highly important introduction as well (in order to understand the future potential of this advanced infrastructure solution):
Wyse Technology’s President and CEO Tarkan Maner speaks with Edie Lush at Hub Davos [hubculture YouTube channel, Jan 26, 2012]
Notes:
– [00:40] Presumably the entry-level zero client (which has no OS – see much below), the Wyse E01, is shown as “working on only 2 watts” (the spec much below says up to 3 watts) and “costing less than $50, start at $35” (the current single-unit retail price of the E01 is $76, however, while the list price is $99 – see much below).
– That device is even presented as needing only the data center. Currently, however, entry-level zero client devices such as the E01 (and the latest E02) require Microsoft MultiPoint Server (see much below). So he is definitely pointing to an upcoming solution.
– [03:00] He mentions South Africa with “10 million devices this year” as an educational example. So that kind of upcoming solution could definitely be in the works already. The power consumption difference might also indicate such a new entry-level device.
Management team [Wyse webpage, April 2, 2012]:
President, CEO and Chief Customer Advocate, Tarkan Maner
Tarkan Maner is the President and CEO at Wyse Technology, the global leader in Cloud Client Computing. Cloud Client Computing is the ultimate end user computing solution for our time, replacing the outdated, unsecure, unreliable, un-green and expensive client/server-centric systems. Cloud Client Computing delivers the security, manageability, availability, reliability, scalability, flexibility, and user experience with the lowest energy usage and total cost of ownership. Cloud Client Computing simply connects all the dots: Cloud client software, hardware and services.
Wyse provides its customers and partners with the broadest and deepest portfolio of Cloud Clients, including Thin, Zero and Cloud PC clients, supported by the leading cloud-centric firmware, virtualization, management and mobility software in the industry. Wyse independently partners with the leading data center, networking and collaboration solution providers within its global partner ecosystem to help organizations and people reach the clouds – in a private, public, government or, even in a personal cloud. Wyse’s mission is to enable any user, anywhere, to connect to any content via any app in any work environment without constraints, conflicts or compromises.
Tarkan believes that Cloud Client Computing not only drives better economic and productivity results for organizations, but, also drives societal change throughout the world. Cloud Client Computing reduces the cost, eliminates the complexity and enables the reach of computing to the next six billion users via billions of devices pervasive in every aspect of our lives.
…
Tarkan in the news
- Forbes OpEd – Cloud Computing for Public Sector
- Top Five Cloud Myths, Trends, and Recommendations
- How to Succeed at Innovation and Differentiation
- Opinion: Seeking ‘game changers’ that will create jobs
- Tarkan at WEF 2011 in Davos – Future of Manufacturing
- Tarkan at WEF 2012 in Davos – Cloud Client Computing for a New World
- Voices from the New Generation: The Explorer
…
Wyse entry-level solution for education
Post-PC Era Expands as Wyse and Serbian Government Partner for Nation-wide Cloud Client Computing Deployment in Education [Wyse press release, Sept 28, 2011]
More than 30,000 Students Gain Access to Latest Learning Technology with Wyse and Microsoft Solutions in Schools across Serbia
LONDON, UK and SAN JOSE, Calif. – 09/28/2011 – Wyse Technology, the global leader in cloud client computing, today announced a major implementation of its zero client technology in the Digital School project to transform classroom teaching in Serbia. In one of the largest projects of its kind in Europe, all elementary schools in Serbia will be outfitted with a new IT infrastructure based on Wyse zero clients and Microsoft Windows MultiPoint Server 2010, enabling every student to have access to the latest computing software, educational applications and online resources.
Committed to modernizing the country’s educational system, among other reforms, the Ministry of Telecommunications and Information Society identified the need for a better information technology and communications infrastructure to support teaching and learning in classrooms.
Developed with Wyse’s technology partner ComTrade, the solution is based on Windows MultiPoint Server 2010 and enables multiple users to simultaneously share a single computer while each uses their own monitor, keyboard and mouse. This is an ideal solution for educational customers that want to extend IT access to more students, easily and affordably. The solution is designed for simple implementation and ease of use for teachers, provides the familiar Windows 7 desktop experience, and requires no advanced IT expertise.
The ministry selected Wyse E01 zero clients because they maximize the advantages of Windows MultiPoint Server. The zero clients simply plug into the host computer which automatically configures and enables a student to start work immediately. Unlike comparable devices for Windows MultiPoint Server, the Wyse E01 zero client supports USB peripherals such as, webcams, and USB flash drives, allowing a more flexible computer-based teaching and learning experience.
Jasna Matic, State Secretary for Digital Agenda and former Minister for Telecommunications and Information Society, said: “Enhancing ICT for education is a major goal of the Government, with this programme delivering on our promise to give every student access to their own computer at school. With cutting edge technology from Microsoft and Wyse, our schools have a solid foundation for delivering education to the highest standards.”
…
Deployment of the Microsoft and Wyse education solution started in December 2010 and will be completed this year.
For more information about Wyse E01 zero clients, please visit, http://www.wyse.com/products/hardware/zeroclients/E01/index.asp
Windows MultiPoint Server 2011 Overview [msmultipoint YouTube channel, May 25, 2011]
Windows MultiPoint Server 2011 is a low-cost computing solution that creates a 1:1 user to computer experience built on Windows Server. With MultiPoint Server 2011, one PC can provide up to 20 computing sessions at a fraction of the cost.
Wyse® E class™ – Affordable computing for education [Wyse brochure, Jan 23, 2012]
…
1. One Windows Multipoint server shares its operating system and applications with up to 20 users at a time.
2. Features Wyse E class zero clients, one per desktop and each one linked by a USB [E01] or Ethernet [E02] cable.
3. Low cost, fast and simple to set up delivery of Windows desktops.
Windows MultiPoint Server 2011 Quick configuration guide
- CPU: Intel i5/i7 (4–6 users); Intel i5/i7 (8–20 users)
- Memory: 4 GB (4–6 users); 8 GB (8–20 users)
- Hard drive: 250 GB (4–6 users); 500 GB (8–20 users)
- Graphics/1: on-board Intel HD Graphics 2000 or similar (both configurations)
- Graphics/2: PCI-Express card (ATI Radeon™ HD 4600 / 4770 / 5750 or nVidia GeForce 8x / 9x Series / GT220 / GT240; both configurations)
- Software: Microsoft Windows MultiPoint Server 2011
- Zero client: Wyse E01 [retail: $76+] and E02 [$99]
- Zero client licenses (Microsoft Academic VL): Microsoft MultiPoint Server License [$115]; Microsoft MultiPoint CAL License per device [$29]
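From the list prices above, a rough per-seat cost for a full 20-station lab can be estimated. This is back-of-envelope arithmetic, not a quoted figure; the host PC hardware, monitors and peripherals are excluded:

```python
# Rough per-seat cost for a 20-station Wyse/MultiPoint classroom, using the
# list prices quoted above (host PC hardware, monitors etc. excluded).
SEATS = 20
ZERO_CLIENT = 99        # Wyse E02 list price, per station
SERVER_LICENSE = 115    # Microsoft MultiPoint Server License, one per host
CAL_PER_DEVICE = 29     # Microsoft MultiPoint CAL, one per station

per_seat = ZERO_CLIENT + CAL_PER_DEVICE + SERVER_LICENSE / SEATS
print(f"per-seat cost: ${per_seat:.2f}")
print(f"lab total:     ${per_seat * SEATS:.2f}")
```

Amortized over 20 seats the single server license adds only a few dollars per station, which is why the zero client and CAL prices dominate the economics of such a lab.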
…
Technical specifications Wyse E01
[The E02 differs in having Ethernet networking, 2 USB 2.0 ports instead of the E01’s 4, 98 x 98 x 20 millimeter dimensions, 128 g weight, and a standing position]
- Server OS: Windows MultiPoint Server 2011
- I/O peripheral support: one VGA (DB-15); four USB 2.0 ports (1 on left side, 3 on right side); one Mic In / one Line Out; USB keyboard (not included); USB mouse (not included)
- Networking: one USB in to connect to the host computer (cable included); maximum distance between each Wyse E01 zero client and the host computer is 5 meters (16 feet 5 inches)
- Display: up to 1680 x 1050 @ 60 Hz / 32 bits or 1600 x 1200 @ 60 Hz / 32 bits
- Audio: output: 1/8-inch mini jack, full 16-bit stereo; input: 1/8-inch mini jack, 8-bit microphone
- Physical characteristics: height 21.5 mm (0.85 inches); width 132 mm (5.20 inches); depth 87 mm (3.43 inches)
- Shipping weight: 145 g (0.32 lbs)
- Power: worldwide auto-sensing 100-240 VAC, 50/60 Hz power supply; average power usage with the device connected to 1 keyboard, 1 mouse and 1 monitor: less than 3 watts
- Temperature range: vertical position: 50° to 104° F (10° to 40° C)
- Humidity: 20% to 80% condensing; 10% to 95% non-condensing
…
Announcement information:
- Wyse Extends Client Virtualization Leadership in Education Market with the Introduction of a New Zero Client for Schools [Wyse press release, Feb 24, 2010]
$99 Wyse E01 Zero Client and Windows MultiPoint Server 2010 Optimize IT and Financial Resources for Schools in Tough Economy
…
“We’re happy to be launching with strong support from Wyse, which has committed to developing innovative and effective solutions like the Wyse E01 Zero Client for the MultiPoint platform,” said Ira Snyder, general manager, Windows MultiPoint Server at Microsoft Corp. “MultiPoint Server can deliver a familiar Windows computing experience to educational institutions around the world, helping them get the best value out of technology investments while providing the very best education for their students.”
…
- The New $99 Wyse Zero Client Provides Simple and Cost-Effective Computing Access for Education and SMBs Worldwide [Wyse press release, Jan 11, 2012]
Wyse Expands E Class Zero Client Offering for Windows MultiPoint Server
Wyse Technology … today announced the introduction of the Wyse E02 zero client in support of Microsoft’s Shape the Future program
…
The Wyse E01 zero client and the Wyse E02 zero client work with Windows MultiPoint Server 2011 to enable multiple students or SMB users to share a single server. The E02 is easy for teachers to set up and use in the classroom, providing an excellent Windows 7 desktop experience for their students. While the Wyse E01 zero client provides students access to the shared server via USB cabling up to 5 meters, the E02 goes a step further to provide access via Ethernet, at a distance of up to 100 meters from the Windows MultiPoint Server.
“Providing students with affordable access to technology is one way Microsoft is helping to ultimately create greater opportunities and more enriched lives for youth around the world. The Wyse E02 zero client, combined with Windows MultiPoint Server, is an excellent example of how we are working to deliver on this mission,” said Microsoft’s Shape the Future Senior Director, Joice Fernandes.
…
Appropriate and sustainable technology solutions for education in Africa [in The eLearning Africa 2012 Report (p. 17), May 23, 2012]
Widening access to reliable information technology is key to how we can help our children develop educationally. This is especially true in the fast developing economies of Africa where the expectation for access to ICT in the school has increased as more citizens use information technology like mobile phones in their everyday lives.
However, in our view, the ambitious eLearning goals in Africa can only be achieved with classroom technology that is intrinsically sustainable. But, in the African context, what do I mean by sustainability? First of all this is not about ticking the box of some green IT policy set by a government. The reality of extending digital classrooms into urban or rural Africa is that IT provision must take account of the absence of reliable power supplies. Any interruptions can be managed with novel solutions around battery back-ups or solar energy to power a classroom in a remote setting.
Even when reliable power supplies are available, low power consumption is going to remain important in how schools manage their budgets. This makes thin or zero client computers very attractive as they typically only use between 3 and 15 watts of power.
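The power figures above translate directly into running costs. A back-of-the-envelope comparison, assuming a hypothetical 150 W desktop PC, an 8-hour school day over a 200-day year, and a $0.15/kWh tariff (all assumptions for illustration, not figures from the article):

```python
# Rough annual energy-cost comparison: a thin/zero client drawing
# 3-15 W versus a conventional desktop PC. The PC wattage, schedule,
# and tariff are assumptions, not numbers from the article.

HOURS_PER_YEAR = 8 * 200        # assumed: 8-hour school day, 200 days
TARIFF = 0.15                   # assumed electricity price, USD per kWh

def annual_cost(watts: float) -> float:
    """Energy cost in USD for one device running the school year."""
    kwh = watts * HOURS_PER_YEAR / 1000
    return kwh * TARIFF

thin_client = annual_cost(15)   # worst case of the 3-15 W range
desktop_pc = annual_cost(150)   # assumed typical desktop draw

print(f"thin client: ${thin_client:.2f}/yr, PC: ${desktop_pc:.2f}/yr")
# Even at the top of its range, the client uses a tenth of the energy.
```

Multiplied across a classroom lab of 20 to 30 seats, that order-of-magnitude gap is what makes the budget argument, not the per-device saving.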
Sustainability in African eLearning is about much more than energy efficiency. It also refers to how IT in schools needs to be easy to set up and manage because it is unrealistic to expect a school to always have access to IT management skills on the ground. As African educators plan their expansion of eLearning, they need to ensure the classroom technology is largely self-sufficient and simple to set up, manage and use in the classroom. The centralised management and robust plug-and-play functionality of classroom labs that use virtualisation technology answers this requirement, ensuring that investments in school classroom labs deliver the maximum educational benefit over a long period of time.
In investing in digital classrooms African educators are demonstrating incredible foresight in what new generations of Africans need to improve their lives. They need to guard against making ICT decisions that trap them in the past. While budgets are always going to be tight, African educators must be ambitious about ICT in education and take advantage of the latest 21st century thinking on virtualised and cloud computing.
Another important dimension of sustainability is the degree to which the ICT is future-proofed in how it can keep pace with future developments in applications and data. Educators are already using solutions like this to transform ICT in their schools and colleges. In South Africa more than 1.5 million students already have ICT access thanks to classroom labs that utilise Wyse cloud computing technology.
Sustainability in African eLearning is vitally important in making ICT widely accessible to students across the Continent. Indeed, African countries look set to trail-blaze other economies in their innovative use of cloud client computing on a massive scale.
David Angwin is Vice President, Field Marketing for Wyse Technology, and is based in the United Kingdom
Wyse Cloud Client Computing Highlights Sustainable E-Learning for Students at eLearning Africa 2012 [Wyse press release, May 23, 2012]
Showcases Latest Digital Classroom Solutions to Widen Availability of School Labs and One-to-One Computing for High Quality IT Enhanced Teaching and Learning in African Schools and Colleges
SAN JOSE, CA and COTONOU, Benin – 05/23/2012 – Wyse, the global leader in cloud client computing, today announced its participation in the eLearning Africa conference and exhibition. As the event’s platinum sponsor for the second year running, Wyse will discuss how advanced cloud client computing can help African educators meet their goals for widening access to technology-enhanced education, development and training. eLearning Africa runs from 23rd – 25th May 2012 in Cotonou, Benin, under the patronage of the Government of Benin.
Working across the continent with its local technology partners, Wyse has developed and deployed a range of solutions that are ideally suited to widening access to IT-enhanced education and training in Africa. The technologies involved are tailored to the continent’s requirements for classroom ICT that is exceptionally reliable, affordable and energy efficient while not compromising on access to the latest applications and data for teaching and learning.
Delegates to eLearning Africa will have the opportunity to see the latest in digital classroom solutions co-developed by Wyse and Microsoft. This includes an entry level shared computing solution for school IT labs that combines Wyse E01 and Wyse E02 zero clients with Microsoft Windows MultiPoint Server 2011; and the Wyse WSM cloud software solution, which offers a centrally-managed one-to-one computing environment for students that scales across classrooms, labs and schools. Both solutions address the requirement for classroom IT that is secure and easy to set up and run, while delivering a great desktop experience for the students.
Mark Jordan, vice president and general manager, EMEA Sales, Wyse Technology will be delivering a keynote in the opening plenary session on 23rd May 2012. He will address how cloud solutions can play a pivotal role in helping IT enhanced education transform the prospects of African students. He will be speaking alongside S.E. Max Ahouêkê, Ministère de la Communication et des Technologies de l’Information et de la Communication (MCTIC), Benin; and Prof Sugata Mitra, Professor of Educational Technology, Newcastle University, UK and Visiting Professor, MIT Media Lab, Cambridge, USA.
The event will be an ideal opportunity to get an update on how African customers are advancing their e-learning strategies with Wyse cloud client computing solutions. For example, in South Africa more than 1.5 million students already have ICT access thanks to classroom labs that utilize Wyse cloud computing technology. In Nigeria, a new network of examination centers relies on a Wyse cloud client computing infrastructure to enable examinations to be delivered, taken and scored entirely electronically, saving time and money while also improving reliability and service with accurate results delivered in hours rather than months.
Education is Wyse’s second largest market, with ten of the world’s top fifteen universities using Wyse solutions to reduce costs and improve learning. They and other educational institutions benefit from Wyse’s position as the only cloud vendor to offer desktop virtualization solutions for every budget and scale of implementation, ranging from ten to upwards of ten thousand units.
A glimpse into the Wyse portfolio
and their large public / enterprise markets
Health care with Citrix and Wyse Xenith next-generation zero-client devices at Seattle Children’s Hospital [WyseTechnology YouTube Channel, May 23, 2011]
Microsoft HIMSS 2011 – Interview with Andre Beuchat of Wyse Technology [WyseTechnology YouTube Channel, May 10, 2011]
Japan’s Largest Bank Turns to Wyse for VDI and Mobility [Wyse blog, April 10, 2012]
Today, Wyse announced that Bank of Tokyo-Mitsubishi is deploying 50,000 Wyse devices. The combination of Wyse’s desktop and mobile hardware, virtualization software and overall Wyse domain expertise in cloud and virtualization is the reason why the Bank of Tokyo-Mitsubishi selected Wyse for its VDI implementation. Bank of Tokyo executive Mizuhiko Tokunaga commented that “… the deciding points were the technological edge of their unique software, Wyse ThinOS, their specialization in VDI, and the sense of trust we felt toward Wyse as a company. Wyse has been a global market leader for a long time, and it shows.”
The Bank of Tokyo-Mitsubishi, the largest bank in Japan and eighth largest in the world, began what was considered the largest systems integration project in the world in 2008 when it started this ambitious project to strengthen security across all 773 branches in Japan and 73 abroad. For more information on this initiative and how Bank of Tokyo is using Wyse, visit: http://www.wyse.com/about/press/release/1917
Cloud Computing involves using information technology as a service over the network.
- Services with an API accessible over the Internet
- Using compute and storage resources as a service
- Built on the notion of efficiency above all
- Using your own datacenter servers, or renting someone else’s in granular increments, or a combination
We at Wyse believe cloud computing has the potential to change how we invent, develop, deploy, scale, update, maintain, and pay for applications and the infrastructure on which they run.
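The "granular increments" point in the list above is the economic core of the definition. A toy cost model (the rates below are hypothetical, purely to make the trade-off concrete) shows why metered rental changes how applications are paid for:

```python
# Sketch of "renting someone else's in granular increments": with
# on-premise servers you pay for peak capacity up front; with a
# metered cloud service you pay only for the increments actually used.
# Both prices are assumptions for illustration.

ONPREM_SERVER_COST = 5000.0      # assumed: purchase cost per server
CLOUD_RATE_PER_HOUR = 0.50       # assumed: rental price per server-hour

def onprem_cost(peak_servers: int) -> float:
    """Capacity must be bought for the peak, even if idle most days."""
    return peak_servers * ONPREM_SERVER_COST

def cloud_cost(server_hours: float) -> float:
    """Metered billing: cost follows actual usage."""
    return server_hours * CLOUD_RATE_PER_HOUR

# A bursty workload: 10 servers needed, but only 300 hours per year.
print(onprem_cost(10))           # 50000.0 up front
print(cloud_cost(10 * 300))      # 1500.0 for the year
```

For steady, always-on workloads the comparison narrows or reverses, which is why the bullet list also allows for "a combination".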
Essential technology and market information
XenDesktop and Metro Receiver [CitrixTV YouTube channel, May 9, 2012]
SYN229: What’s new with Citrix Receiver for desktop users [CitrixTV YouTube channel, May 10, 2012] — absolutely important to watch in order to understand how the virtual desktop future would be assured by the upcoming Citrix Receiver universal client experience across different end-user access points (PC, Mac, tablets, smartphones, thin clients and web browsers) for Windows, web and SaaS applications (at least go forward to the [18:53 – 23:05] timeframe in the video) !!!
Wyse, Marvell, and the Citrix System-on-Chip Initiative [Wyse blog, May 10, 2012]
Yesterday Marvell announced participation in the Citrix System-on-Chip (SoC) initiative with the Marvell® ARMADA® 510 SoC for seamless integration with Citrix HDX in a complete silicon solution. The SoC combines a high-performance, low-power SoC with a hardware graphics processing unit and video decoding acceleration hardware. The end result is excellent processing power for high-end apps like HD multimedia in a very efficient, cost-effective footprint.
Wyse already uses the Marvell ARM SoC in our industry-leading T class thin clients. Combining Marvell’s high performance SoC with software optimized for Citrix HDX enables Wyse to offer compact, efficient, and powerful thin clients like the Linux-based T50 thin client and the super-secure T10 thin client based on Wyse ThinOS. In addition, our newly announced Xenith 2 zero client for Citrix XenDesktop and HDX is also based on the ARM SoC, and sets a new price/performance standard for Citrix zero clients in its class.
Xenith 2 – The Product that Changes Everything [CitrixTV YouTube channel, May 24, 2012]
Wyse Zero [Engine] and Wyse ThinOS [Wyse webpage, Feb 24, 2012]
- Built for VDI: Optimized for Citrix XenApp, Citrix XenDesktop, Microsoft Terminal Server and VMware View virtual desktop environments
- Lightning fast: Super-fast start-up provides access to virtual desktops in under 20 seconds
- Super Secure: No attack surface provides immunity to viruses and malware
- Easy-to-manage: Hands-off, scalable device management with Wyse Device Manager; easy FTP-based configuration and automatic updates
- Smart card support: Seamless smart-card roaming ideal for workstation-based environments
- Rich user experience: Integrated Wyse TCX Suite for enhanced audio, video and multimedia
Overview
Wyse ThinOS
Wyse ThinOS is the most optimized, management-free solution for Citrix XenApp, Citrix XenDesktop, Microsoft Terminal Server and VMware View virtual desktop environments. With an unpublished API and no attack surface, Wyse ThinOS is immune to malware and viruses that make other operating systems vulnerable to attack. This super-fast, purpose-built thin computing OS boots up in seconds, updates itself automatically and delivers simple, scalable administration to eliminate time-consuming maintenance tasks related to configuration, management and updates. With full support for Wyse Virtual Desktop Accelerator (VDA), ThinOS neutralizes the effects of network latency and packet loss, even in remote-branch and field-based applications.
Related link
- What’s new in Wyse ThinOS with David Angwin, Wyse Technology Watch video »
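The "easy FTP-based configuration" mentioned above works by having each ThinOS device fetch a central wnos.ini file from an FTP server at boot. The fragment below is a sketch only; parameter names and values vary by ThinOS release, so consult the ThinOS INI reference guide for your firmware:

```ini
; Illustrative wnos.ini fragment (a sketch, not a reference).
; A ThinOS device pulls this file automatically at boot, typically
; from ftp://<server>/wyse/wnos/wnos.ini as named by DHCP options.
AutoLoad=1                      ; pull firmware updates automatically
SignOn=yes                      ; require user sign-on
ConnectionBroker=default host=broker.example.com   ; hypothetical host
SessionConfig=ICA               ; e.g. a Citrix XenApp/XenDesktop session
```

Because every device re-reads the same central file, pushing a configuration change to a whole lab is a single file edit rather than per-seat administration.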
Wyse Zero [Engine]
Already used in millions of thin clients, zero clients, and handheld smart devices, Wyse Zero [Engine] simplifies the development of cloud-connected smart devices, enabling seamless user access to cloud computing services and virtual desktops. Wyse Zero [Engine] addresses limitations with current embedded options, such as the typical security vulnerabilities of Windows and Linux-based operating systems, and slow initialization due to their large size. With a rich array of networking, management and protocol technology packaged in an engine less than 4MB in size, Wyse Zero reduces costs and simplifies management and updates. With no underlying OS to slow it down, it starts up instantly for a more satisfying user experience. And unlike Windows or Linux-based embedded products that require extensive protection, Wyse Zero [Engine] is original technology and therefore virtually immune to malware, viruses and hackers.
Wyse Stratus Overview [WyseTechnology YouTube channel, Feb 24, 2012]
Wyse Announces Private Beta of Cloud-Based Service to Secure and Simplify Corporate Access for Users Across All Devices [Wyse press release, May 8, 2012]
Project Stratus Directly Tackles Consumerization of IT Challenges with Intelligent, Integrated and Cross-Platform User and Device Management
05/08/2012 – Wyse Technology, the global leader in cloud client computing, today announced the Project Stratus private beta program. Project Stratus provides IT administrators with an intelligent and dynamic cloud-based console to securely manage and enable corporate access to any device, regardless of whether that device is owned by the company or by the individual. Initial support will focus on securing and provisioning corporate access to smartphones, tablets, thin clients, and zero clients with plans to quickly expand support to additional devices used in the workplace.
Project Stratus delivers a unified console that goes beyond standard device management solutions by providing a complete view of the IT infrastructure serving end-users. The console provides visibility not only into users and their devices, but also into their relationship with the IT ecosystem. The result for IT is valuable insight into usage models, trends, and the means to identify areas of investment to more securely and effectively provide corporate services to end users.
“The biggest challenge to IT in a BYOD world has to do with securing corporate access to all devices being used by employees. With Project Stratus, our goal is to eliminate the need to have a separate, silo’ed console for each device type and instead allow IT admins to set an access policy for a user that will apply regardless of what device they are using—providing for the first time a one-stop shop for device and access management,” said Hector Angulo, Product Manager at Wyse.
“For a company such as ours that relies on a distributed and mobile workforce, the means to simplify and secure our mobile devices is very appealing,” according to Adam Bari, Managing Director at IPM. “We are very much looking forward to deploying Project Stratus to better manage our mobile computing infrastructure.”
Wyse will be showcasing Project Stratus at Citrix Synergy™ 2012 in San Francisco, May 9th – 11th in Wyse Booth #206 at the Moscone Center. Companies interested in taking part in the private beta can sign up by going to http://www.wyse.com/stratus
Key features of Project Stratus include:
• Simplicity. Streamlined, discoverable interface with user-centric policy management to help automate user access regardless of what device they are using, including easy exception handling – natural and intuitive management for today’s dynamic IT world
• TCO Reduction. Cloud-hosted service eliminates costly on-premise servers and enables instant deployment and scaling — drastically reduces the total cost of operations and ownership
• Real-time Analytics. Dynamic and instantly personalized data feeds always present admins with the most relevant insight to help expedite the task at hand – powerful analytic engine exposes most important activities, events, and trends
• Actionable. Pro-active alerts notify admins about compliance violations and other potential issues with the option to take contextual actions in-place (e.g. warn user, block, ignore) or automate future mitigation (e.g. automatically approve roaming exception requests for all members of the ‘executive’ group)
• Time-Saving. User and device pages that provide instant visibility into any managed asset, including who is using the device, what it is interacting with, and any potential performance or security issues in order to expedite issue identification and resolution
• Unified Console. Visibility and management of all devices used in the enterprise, with support for smartphones, tablets, thin clients, and zero clients — one-stop shop for all devices, no more hassle of dealing with many consoles
• Security. Enterprise-ready, multi-tenant architecture with fully encrypted communication ensures only you have access to your data
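The user-centric policy idea running through the feature list above can be sketched in a few lines. Everything here (class names, fields, the two rules) is hypothetical illustration, not the actual Project Stratus data model: one policy attached to the user is evaluated identically for every device type, instead of one silo’ed console per device category.

```python
# Hypothetical sketch of user-centric access policy: the decision is
# a function of the user's single policy plus whatever device they
# present, so smartphones, tablets and thin clients all go through
# the same check. None of these names come from the Wyse product.

from dataclasses import dataclass

@dataclass
class Device:
    kind: str          # "smartphone", "tablet", "thin client", ...
    encrypted: bool    # does the device encrypt local storage?

@dataclass
class UserPolicy:
    allowed_kinds: set
    require_encryption: bool

def corporate_access(policy: UserPolicy, device: Device) -> bool:
    """One decision point: the user's policy, not per-device silos."""
    if device.kind not in policy.allowed_kinds:
        return False
    if policy.require_encryption and not device.encrypted:
        return False
    return True

exec_policy = UserPolicy({"smartphone", "tablet", "thin client"}, True)
print(corporate_access(exec_policy, Device("tablet", True)))    # True
print(corporate_access(exec_policy, Device("tablet", False)))   # False
```

The "easy exception handling" bullet then amounts to widening one user's policy object, rather than reconfiguring several device-specific consoles.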
HDX Ready Software-on-Chip with TI and NComputing [CitrixTV YouTube channel, Nov 8, 2011]
HDX Ready Thin Clients [Citrix microsite, May 9, 2012]
The HDX Ready designation is reserved for thin client devices that have been verified to work with all of the XenDesktop and XenApp HDX features. HDX refers to High Definition User eXperience – a term coined by Citrix to describe capabilities in XenDesktop that optimize the user experience when accessing hosted virtual desktops and applications. The HDX Ready category assists IT managers to easily identify thin client devices that deliver the best possible high definition user experience with XenDesktop and XenApp.
There is a trade-off between a thin client’s cost and its capabilities. Not all users require the functionality of all of HDX features of XenDesktop or XenApp. Devices that are not deemed HDX Ready may still be useful for certain user types and use cases, generally at a lower price point than HDX Ready devices. The Citrix Ready thin client designation exists for those devices that support connectivity to XenDesktop or XenApp but only a subset of HDX functionality. Information regarding HDX feature coverage by a particular thin client device is available on the Citrix Ready website
Citrix HDX SoC spurs innovation and cuts the cost of thin clients in half [The Citrix Blog, May 9, 2012]
Today Citrix celebrates with our partners the unveiling of exciting new client computing devices that leverage the HDX SoC initiative.
Thousands of Citrix customers are already using thin client devices to access virtual desktops and apps delivered by Citrix infrastructure. These customers who have successfully deployed thin clients are reducing or even eliminating their device management footprint, decreasing their dependency on lifecycle management, and reducing their power consumption by efficiently leveraging computing resources in the datacenter or server room.
There are also many customers who look at the cost of desktop virtualization and can easily justify supporting mobile workers and BYO programs. However, when it comes to replacing desktops in their offices, they may find it harder to justify purchasing a thin client when, after all the dust settles, the price of the endpoint might be close to the replacement cost of a PC.
Delivering cost reduction
Last October, at Synergy Barcelona 2011, Citrix announced the HDX System on Chip initiative in partnership with Texas Instruments and NComputing, to create new SoC reference designs based on ARM chipsets to accelerate HDX user experience technologies in silicon. By using optimized hardware-based acceleration rather than decoding and rendering virtual desktop traffic on general-purpose processors in software, these SoCs can deliver the user experience of thin client devices costing twice as much or more while reducing power consumption, heat, and footprint. However, don’t mistake hardware-acceleration for “all-hardware.” Devices built on the HDX SoC initiative still run a Citrix Receiver in an embedded OS that permits updates that provide the devices with new functionality over time, further extending the expected lifecycle.
Taking cues from the living room
This direction of optimized delivery of a high definition experience is no different from what many of us are seeing play out in our living rooms. Instead of storing massive collections of videos in cabinets or on home servers, cloud providers like Netflix, Amazon, Apple, Hulu, Pandora, and others store media for us, allowing us, in many cases, to stream real-time content to our homes. This media can be displayed from TVs using integrated “internet streaming,” from most any smartphone, tablet, or computer, or through the addition of a $50 appliance from companies like Roku that we plug into our TVs. It is this revolution in cloud entertainment services, and the drive for low-cost, low-power, long-battery-life devices overtaking the consumer electronics industry, that Citrix can now leverage to optimize endpoint devices for desktop virtualization.
To learn more about these exciting, market-changing, transformative new devices being unveiled by HP, Atrust, Centerm, NComputing, and ThinLinX, please check out the HDX SoC 2012 partner page here.
Dell Wyse: acquisition of Wyse Technology by Dell
(a summary of the many original materials compiled in the closing part of this post)
- Wyse – a leader in Desktop Virtualization
- Wyse – ranking number one worldwide in thin client unit share in the fourth quarter of 2011
- Differentiated IP and device management, thin client operating systems, and mobility software that is customized to offer the best user experience with Microsoft, Citrix and VMware virtual desktop infrastructures.
- Much of their software value is captured in the hardware itself. Their ThinOS and the IP around the ThinOS has allowed them to drive greater performance using less memory. So Wyse solutions require less memory and processing power than other comparable thin client solutions, making them more cost competitive and effective for customers.
- Wyse as an independent entity has really been gaining momentum to grow into a number one market share position. In fact, their growth accelerated to 45 percent in their last fiscal year
- Dell’s view on that:
– The momentum around alternative computing is a trend that they see many customers continuing to experiment with and in many cases, beginning to deploy, although the adoption rates are still relatively low for desktop virtualization.
– They don’t see the entire world going to thin clients. They still think there’s a healthy PC demand in the industry and there’s a balance of alternative computing that allows people to take advantage of securing their information, managing the assets in a very differentiated way. Even a common thin client deployment today is on a standard PC that’s been virtualized.
– This is an opportunity, particularly in verticals such as financial services, government and healthcare, to really take a leadership position. These are really specific use cases. For example, in regulated industries like healthcare and financial services, the value of centralizing your data for better access and control is a specific use case that thin client desktop virtualization lends itself to.
– They needed it because it is also a different workload to move forward their cloud computing strategy.
– Again, they don’t think a zero client or a thin client is an answer for all customers. They think in their mind that the bigger message here is they now have a range of devices: an incredibly strong portfolio of thin client and zero client devices from Wyse, the standard Dell set of PCs, which do virtualization, and now the ability to manage those in a very differentiated way with the key software assets that they’re bringing on board, which extend to tablets and mobile phones.
- Wyse portfolio includes a wide selection of industry leading thin and zero client devices designed to integrate easily into a virtualized or web based infrastructure
- It complements and extends the desktop virtualization capabilities that Dell has today.
- Also a big part of this transaction is the synergy that Dell would get from their datacenter solutions business, including servers, storage, networking services, and software. For every thin client hardware dollar that exists in the IT industry, there’s $5 of enterprise servers, storage, networking services that go along with that.
- This could also remove the barrier for some companies that did not have the right level of datacenter portfolio and datacenter ecosystem to exploit the thin client alternative of enterprise computing: i.e. deploying desktop virtualization centric cloud client portfolios and platforms.
- Wyse is a company that has 31 years of experience. They have the intellectual property, they have the software and 150 R&D engineers, of which 140 are in software. Wyse and one other competitor basically had almost 50 percent of the market. Wyse is a pretty close partner of Microsoft, and they do a lot of work with VMware and with Citrix as well. As these providers deliver desktop virtualization methodology and technology between the datacenter and end user computing platforms, Wyse adds to that value, and they partner heavily with them and obviously that’s going to continue.
- [Wyse:] And also, one other piece to add: some of the software we provide is differentiated in the marketplace and is the leader in this space from the cloud, on the infrastructure management side, with a product called Wyse Stratus. Many of you on the phone are using today Wyse PocketCloud, the market leading product for content management from the cloud on any mobile device and also from your web browser, connecting your apps and content (voice, data, video) from your choice of cloud, private or public.
- The software stack brings together the edge device and the management software that manages it, sitting in the cloud or in the datacenter. The ability to acquire those capabilities, that experience and the technology with it, rather than building that software from essentially ground zero, puts Dell in a leadership position. The differentiated technology they are getting, integrated with Brad Anderson’s [Dell president, Enterprise Solutions] and Steve Schuckenbrock’s [Dell president of Services] businesses, gives them a unique position to do this for their customers. All this allows them to move much more quickly in the marketplace than they could have done on their own.
- IDC: worldwide thin client demand will grow 15 percent per year to approximately $3 billion by 2015
- IDC: the overall end to end solutions market with thin clients is expected to exceed $15 billion by 2015
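Read together with the earlier "$5 of enterprise servers, storage, networking services for every thin client hardware dollar" point, the two projections are mutually consistent, as a quick compound-growth sketch shows (the 2011 base value is an assumption for illustration, not an IDC figure):

```python
# The two IDC projections fit together via simple arithmetic:
# compound growth for the thin client figure, and the 1:5
# hardware-to-ecosystem ratio for the end-to-end figure.
# The ~$1.7B 2011 base below is an assumed value, not an IDC number.

def project(base: float, annual_growth: float, years: int) -> float:
    """Compound a base-year market size forward by `years` years."""
    return base * (1 + annual_growth) ** years

thin_clients_2015 = project(1.7, 0.15, 4)   # assumed $1.7B base, +15%/yr
end_to_end_2015 = thin_clients_2015 * 5     # $5 of ecosystem per $1 of hardware

print(round(thin_clients_2015, 2))  # ~2.97, i.e. roughly $3B
print(round(end_to_end_2015, 2))    # ~14.87, i.e. roughly $15B
```

So the $15 billion end-to-end figure is essentially the $3 billion hardware figure with the attach ratio applied, which is exactly the synergy argument Dell makes above.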
Wyse Cloud Client Computing [WyseTechnology YouTube channel, April 1, 2012]
Citrix Announces New Innovations in Desktop Virtualization Lowering Cost and Accelerating the Transformation to Virtual Desktops [Citrix press release, May 9, 2012]
New XenDesktop, VDI-in-a-Box & AppDNA capabilities drive adoption
San Francisco, CA » 5/9/2012 » Today, at Citrix Synergy™, the conference where mobile workstyles and cloud services meet, Citrix announced a set of new innovations that help organizations transform their Windows desktops and apps into a cloud-like service that can be managed centrally and delivered to any device in any location. New releases of Citrix desktop virtualization products and new game-changing Citrix HDX Ready SoC-based endpoint devices from key partners are helping to ease the transition to virtual desktops and drive down acquisition costs. They also provide expanded capabilities targeting broad use cases, from the call center to high-end engineering and mobile workers, in enterprises, the public sector and SMBs, enabling organizations of all sizes to deliver anywhere, anytime access to desktops, applications and data.
With the tremendous explosion of new devices, operating systems and applications, organizations are struggling to keep up with the challenge of managing desktops and applications in this new highly mobile world. At the same time, trends such as consumerization and bring your own device (BYOD) programs are putting added strain on IT resources. Citrix is raising the bar once again delivering new innovations across its desktop virtualization products and working with partners to drive down the costs of virtual desktops.
Easier On-ramp to Desktop Virtualization
- New Remote PC Option in XenDesktop FlexCast – The new Remote PC option is part of the FlexCast® delivery technology in the Citrix XenDesktop® product line. Using the new Remote PC capability, XenDesktop customers will be able to quickly turn existing office PCs into distributed VDI hubs without setting up additional servers and storage in the datacenter. This innovative new solution makes it easy for IT to give end users fast, secure remote access to all the apps and data on their office PC from any device. Once IT is ready to move to a more full-service VDI implementation, these distributed Remote PC images can be easily moved into the datacenter to run in a traditional hosted VDI model for better consolidation, security and management efficiency. Remote PC functionality will be included in XenDesktop 5.6 Feature Pack 1, which will ship in June, 2012.
- New AppDNA Software Release – To ease the transition to Windows 7 and a virtual desktop infrastructure, the new release of Citrix AppDNA software brings a simplified overall installation, setup and user environment to accommodate a broader range of enterprises, the channel and global SIs. Citrix AppDNA also provides even more in-depth application details so enterprises can accurately assess, rationalize and act on applications before a project begins. The AppDNA 6.1 software will be available in Q2, 2012. (see announcement blog for more detail)
Reducing the Acquisition Costs of Virtual Desktops
- First Wave of Game-changing Endpoints Arrives – The first results of the Citrix HDX System-on-Chip initiative that was announced at Citrix Synergy Barcelona are being delivered to the market. The initiative was designed to enable an entirely new generation of devices that deliver high-definition virtual desktops and apps at game-changing price points and form factors. These devices reduce the cost of high-performance HDX Ready thin clients by more than half, further driving down the cost of desktop virtualization. New devices from Atrust, Centerm, HP, NComputing and ThinLinX are being announced today at Citrix Synergy San Francisco and are built for Citrix XenDesktop and Citrix VDI-in-a-Box. (See announcement blog for more detail)
- Personalized VDI for Less than the Cost of PCs – The Project Aruba technology preview delivers a cost-efficient yet complete VDI solution by extending the simple affordable Citrix VDI-in-a-Box™ with layering technology using personal vDisks to deliver highly personalized virtual desktops that retain the cost-efficiencies of pooled desktops. Project Aruba also provides a validated blueprint for service providers looking to deliver cost-effective VDI-based Desktops-as-a-Service.
Citrix has also made available a license migration path from VDI-in-a-Box to XenDesktop for customers that want to extend beyond VDI to leverage the full flexibility of XenDesktop while preserving their investment. The end-user experience is consistent across both products as both VDI-in-a-Box and XenDesktop use the same HDX stack and Citrix Receiver. (See announcement blog for more detail)
Delivering Expanded Functionality for Broad Use Cases
Citrix is delivering new innovations that create a very seamless experience for end-users, delivering a more complete solution than other alternatives on the market.
- Empowering Point-to-Point Unified Communications for Cisco and Microsoft – With the introduction of HDX Real Time technologies for voice and video collaboration, industry-leading unified communications (UC) solutions including Cisco VXI Unified Communications and Microsoft Lync 2010 can process voice and video locally and create a peer-to-peer connection for the ultimate user experience while taking the load off datacenter processing and bandwidth resources. XenDesktop delivers new levels of efficiency and quality of service for the most demanding use cases. HDX Real Time will be available with XenDesktop 5.6 Feature Pack 1 in June, 2012.
– Support for HDX Real-Time with select Cisco VXI clients was recently announced in April, 2012, representing the first optimized UC solution for desktop virtualization on the market. This solution represents one of the first deliverables from the recent collaboration agreement between Cisco and Citrix to optimize HDX for Cisco networks.
– The new Optimization Pack for Microsoft Lync 2010 will be included in XenDesktop 5.6 Feature Pack 1. This pack supports Microsoft Lync 2010 for point-to-point voice and video communications to Windows and Linux devices and will extend across all Citrix Receiver™-enabled devices over the coming months.
– Beyond traditional unified communications support, XenDesktop also optimizes voice and video collaboration for cloud-based solutions, including Citrix GoToMeeting®, by compressing voice and video traffic on the client before transmission over the network.
- Cutting Network Bandwidth for Demanding 3D Engineering Environments – Whether collaborating with design engineers across oceans using advanced CAD/CAM or GIS apps, or consulting medical imaging at a patient’s bedside with an iPad, the secure, high-performance delivery of GPU-accelerated 3D applications and desktops with XenDesktop has never been more powerful or efficient. Using new deep compression codec technology that reduces bandwidth requirements by 50 percent, XenDesktop with HDX 3D Pro technologies secures sensitive intellectual property and privacy-sensitive data while improving collaboration and performance, eliminating the need to synchronize and transfer massive data files. Meanwhile, users leverage state-of-the-art graphics processing hardware in the datacenter to access designs and images from any device, anywhere. HDX 3D Pro will be available with XenDesktop 5.6 Feature Pack 1 in June 2012. (See the announcement blog for more detail)
- New XenClient Enterprise and Acquisition of Virtual Computer – Citrix announced the acquisition of Virtual Computer, provider of enterprise-scale management solutions for client-side virtualization. Citrix will combine the newly-acquired Virtual Computer technology with its market-leading XenClient® hypervisor to create the new Citrix XenClient Enterprise edition. The new XenClient Enterprise, available in Q2, 2012, will combine all the power of the XenClient hypervisor with a rich set of management functionality designed to help enterprise customers manage large fleets of corporate laptops across a distributed enterprise. The combined solution will give corporate laptop users the power of virtual desktops “to go”, while making it far more secure and cost-effective for IT to manage thousands of corporate laptops across today’s increasingly mobile enterprise.
- Simplifying Printing with New HDX Universal Print Server – Now, Citrix desktop virtualization products tame the complexity of printing by completing a universal printing architecture with the Citrix HDX Universal Print Server. Combined with the previously available Universal Print Driver, administrators may now install a single driver in the virtual desktop image or application server to permit local or network printing from any device, including thin clients and tablets, leveraging HDX optimization technology to reduce bandwidth load over wide area networks and manage printing communications outside of the virtual desktop channel for enhanced quality of service. HDX Universal Print server will be available with XenDesktop 5.6 Feature Pack 1 in June, 2012. (See the announcement blog for more details)
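The 50 percent bandwidth-reduction figure cited above for HDX 3D Pro can be put in WAN-sizing terms with a quick sketch. The per-session bitrate and link size below are illustrative assumptions, not Citrix measurements:

```python
# Rough sketch (not Citrix's actual codec): estimating how a codec that cuts
# per-session bandwidth by 50% changes session density on a shared WAN link.
# The 100 Mbps link and 8 Mbps/session figures are assumptions for illustration.

def sessions_per_link(link_mbps, session_mbps, compression_ratio=0.5):
    """How many concurrent 3D sessions fit on a WAN link.

    compression_ratio is the fraction of baseline bandwidth still needed
    after deep compression (0.5 models the 50% reduction cited above).
    """
    effective_mbps = session_mbps * compression_ratio
    return int(link_mbps // effective_mbps)

baseline = sessions_per_link(100, 8, compression_ratio=1.0)    # no compression
compressed = sessions_per_link(100, 8, compression_ratio=0.5)  # 50% reduction

print(baseline, compressed)  # halving per-session bandwidth roughly doubles density
```

Under these assumed numbers, the same link goes from 12 to 25 concurrent sessions, which is the practical upside of the codec for distributed engineering teams.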
Quote
“Citrix is helping to drive down the costs of virtual desktops, and advancing technology around user experience and manageability to move desktop virtualization adoption forward at a rapid pace. Through product innovation and strong partner ecosystems we are addressing barriers on all fronts including acquisition costs, migration complexity and delivering complete solutions for all customer segments from large enterprises to SMBs.”
– John Fanelli, Vice President of Product Marketing, Enterprise Desktops and Applications at Citrix
Related Links
- Announcement: New XenDesktop Release Accelerates Migration to Windows 7 and Beyond
- Announcement: Dell and Citrix Deliver a Simple VDI Appliance for the Mass Market
Follow Us Online
- Citrix XenDesktop page
- Citrix AppDNA page
- Citrix VDI-in-a-Box page
- Twitter: @Citrix, @XenDesktop, @VDIinaBox, @AppDNA
- Citrix on Facebook
NOW, to understand the whole picture through a very practical demonstration of the whole range of possibilities, watch these videos:
The Future is Now (17 minutes – part 1 of 2) [citrixvideos YouTube channel, April 11, 2011]
The Future is Now (28 minutes – part 2 of 2) [citrixvideos YouTube channel, April 11, 2011]
Citrix Receiver on the Wyse Xenith, connecting to a XenDesktop virtual desktop [citrixvideos YouTube channel, April 10, 2011]
Wyse Product/Technology Details
Wyse Changes Everything with Announcement of Xenith 2 Zero Client for Citrix VDI-Based Deployments [[Dell] Wyse press release, May 9, 2012]
Leading Zero Client Improves Performance for VDI Installations Using Citrix Desktop Virtualization Solutions
SAN JOSE, CA – 05/09/2012 –
Wyse Technology, the global leader in cloud client computing, today announced the Wyse Xenith 2, based on the ultra-secure Wyse zero framework. This breakthrough zero client was revealed today at Citrix Synergy™ 2012, the premier event on cloud computing, virtualization and networking. Wyse, the leading shipper of fixed and mobile desktop zero clients in the world, will be demonstrating the Xenith 2 at Wyse Booth #206 from May 9-10, 2012.
Following on the success of the Wyse Xenith and Wyse Xenith Pro, the Wyse Xenith 2 is the ideal Citrix zero client solution for both enterprise and SMB organizations. The Wyse Xenith 2 zero client is purpose-built for Citrix XenDesktop®, blending the cost benefits of the ARM System-on-Chip (SoC) architecture with a non-Windows, Citrix Receiver-compatible client developed in cooperation with Citrix. Improving on the success of the Xenith with 30% faster performance and lower power consumption, the result is a highly secure, very affordable, true high-fidelity desktop experience. For users requiring a diverse variety of applications, including HD multimedia, the Wyse Xenith 2 sets a new standard in price and performance in a compact zero client, delivering an unprecedented combination of simplicity, performance and security for office-based workers.
The Wyse Xenith 2 requires no local configuration or management and can offer customers of all sizes a more secure client while helping reduce management and overall client cost. Full AES 128-bit encryption enables encryption of network certificates on the client, a truly ironclad level of security. Leveraging the Wyse zero framework, the Wyse Xenith 2 provides a secure, ‘instant on’ experience for end users—booting up and logging into a Citrix XenDesktop® in less than 10 seconds. With no exposed APIs and no attack surface, the Wyse Xenith 2 zero client is immune to malware and viruses, removing client security concerns.
“Wyse Xenith has been a game-changer for us,” according to Wes Wright, Chief Technology Officer at Seattle Children’s Hospital. “Not only are we saving $6 million in hardware replacement costs, more than $1 million in staff time, and $300,000 per year in energy savings, we also have devices that are faster, more secure and more reliable than anything we had before. With Xenith 2, Wyse is simply adding more appeal to an endpoint device family that makes Citrix XenDesktop a great end-to-end VDI solution.”
Like the Wyse Xenith and Wyse Xenith Pro, the Wyse Xenith 2 changes everything, including the economics of desktop computing. Wyse Xenith 2 eliminates the complications of management and security issues associated with traditional client devices, while ensuring an unparalleled high-definition user experience, further lowering the barriers for mainstream adoption of desktop virtualization.
“As customers look to the flexibility of desktop virtualization, Citrix is enabling these enterprises to transform their traditional Windows computing environments into a cloud-like service, delivering anywhere, anytime access to desktops, applications and data. Through collaborative relationships like the one with Wyse, we are further driving down the costs of virtual desktop deployments and accelerating adoption. The Xenith 2 achieves this goal by providing a secure, affordable solution that is optimized to deliver a high-definition virtual desktop experience through Citrix Receiver,” said Sumit Dhawan, group vice president and general manager, Receiver and Gateways at Citrix Systems.
“By tightly-integrating with Citrix, we’re delivering a zero client that is second to none in performance, security, manageability, and ease of use for this class of VDI endpoint,” according to Param Desai, VP, Product Management at Wyse Technology. “All of this plus it is more affordable than ever before.”
“Vendors like Wyse continue to push the envelope in zero client technology,” according to Bob O’Donnell, Program VP, Clients and Displays at IDC. “The ability to improve device performance while adding additional functionality and reducing cost bodes well for future zero client customers.”
Top Product Benefits
• Secure. Stateless zero client with zero attack surface for viruses and malware: no local disk and no APIs. Xenith 2 also offers single sign-on and integrated Imprivata support. Full AES 128-bit encryption enables encryption of network certificates on the device.
• Powerful. The Wyse Xenith 2 includes a Citrix Receiver client and achieves an unparalleled user experience, with great graphics performance and high-fidelity multimedia thanks to Wyse’s innovative performance optimizations for the ARM SoC, available only on the Xenith 2 and T10. Xenith 2 starts up in 6 seconds.
• Affordable. Sets a new level of price/performance.
• Easy to manage. Integrated out of the box with the XenDesktop management console, and also manageable by Wyse Stratus as part of comprehensive device management from the cloud. Xenith 2 auto-detects its server and configuration and is a completely stateless device, always using the latest zero engine delivered directly from a central configuration file server and the XenDesktop server.
• Compact. Requires very little space, or none at all: a VESA mount for behind-the-display mounting is included. Xenith 2 is 30 percent smaller than the original Xenith and uses only 7 watts in full operation.
• Zero-compromise user experience. Network-based QoS ensures quality (HDX multi-stream). The device offers true 720p 25+ fps HD for WMV and H.264 with hardware decoding engines. Dual-display support with rotation and L-shaped display capabilities [which is unique and essential for financial services environments, where an additional vertical screen is used for spreadsheet viewing]. New WAN support with local echo and bandwidth reporting gives remote and at-home users greater flexibility and performance.
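The 7-watt figure in the benefits above translates into concrete energy savings. A minimal sketch, assuming an illustrative 150 W conventional desktop, an 8-hour/250-day duty cycle, and US$0.12 per kWh (all assumed figures, not Wyse data):

```python
# Hedged illustration: annual electricity cost of a ~7 W zero client vs. an
# assumed 150 W conventional desktop PC. Duty cycle and electricity price
# are assumptions chosen for the comparison, not vendor-published numbers.

HOURS_PER_YEAR = 8 * 250   # 8-hour workday, 250 working days (assumed)
PRICE_PER_KWH = 0.12       # assumed electricity price in US$ per kWh

def annual_cost(watts, hours=HOURS_PER_YEAR, price=PRICE_PER_KWH):
    """Yearly energy cost in dollars for a device drawing `watts` continuously."""
    kwh = watts * hours / 1000.0
    return kwh * price

zero_client = annual_cost(7)    # Xenith 2 at about 7 W in full operation
desktop_pc = annual_cost(150)   # assumed typical desktop draw

print(round(zero_client, 2), round(desktop_pc, 2))
```

Under these assumptions the per-seat difference is modest in dollars, but across thousands of endpoints it compounds into the kind of six-figure annual energy savings the Seattle Children’s Hospital quote above describes.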
Pricing and Availability
The Wyse Xenith 2 will be available soon with an estimated customer price TBD. For more information, please visit:
http://www.wyse.com/products/cloud-clients/zero-clients/Xenith2

Overview
Establishing a new price/performance standard for Citrix zero clients, the new Wyse Xenith 2 provides an exceptional user experience at a highly affordable price for Citrix XenDesktop and XenApp environments. With zero attack surface, the ultra-secure Xenith 2 gives network-borne viruses and malware no target to attack. Xenith 2 boots up in just seconds and delivers exceptional performance for Citrix XenDesktop and XenApp users while offering usability and management features found in premium Wyse cloud client devices. Xenith 2 delivers outstanding performance based on its system-on-chip (SoC) design optimized with the Wyse Zero architecture, and a built-in media processor delivers smooth multimedia, bi-directional audio and Flash playback. Flexible mounting options let you position Xenith 2 vertically or horizontally on your desk, on the wall or behind your display. Using about 7 watts of power in full operation, the Xenith 2 creates very little heat for a greener, more comfortable working environment.



Specifications
| Operating System: | Wyse Zero™ Engine |
| Processor: | Marvell® ARMADA™ PXA 510 v7 1.0 GHz system-on-chip (SoC) |
| Memory: | 0MB Flash / 1GB RAM DDR3 |
| I/O peripheral support: | • One DVI-I port, DVI to VGA (DB-15) adapter included • Dual display support with optional DVI-I to DVI-D plus VGA-monitor splitter cable (sold separately) • Four USB 2.0 |
| Networking: | • 10/100/1000 Base-T Gigabit Ethernet • Optional internal wireless 802.11 b/g |
| Display: | • VESA monitor support with Display Data Control (DDC) for automatic setting of resolution and refresh rate • Dual monitor support with ‘L-shaped’ display rotation • Single: 1920×1200@60Hz; color depth: 32 bpp • Dual: up to 1920×1080@60Hz; color depth: 32 bpp |
| Audio: | • Output: 1/8-inch mini jack, full 16-bit stereo, 48KHz sample rate • Input: 1/8-inch mini jack, 8-bit microphone |
| Included: | • Enhanced USB keyboard with PS/2 mouse port and Windows keys • PS/2 mouse |
| Power: | • Worldwide auto-sensing 100-240 VAC, 50/60 Hz. • Energy Star V5.0 • Phase V external and EuP compliant power adapter |
| Power consumption: | Under 7.2 Watts (average) |
| Dimensions: | • Height: 1 inch (25mm) • Width: 6.9 inches (177mm) • Depth: 4.69 inches (119mm) Weight: 1 lb (450g) |
| Shipping Weight: | 1.003 lbs. (.455kg) |
| Mountings: | • Stand for horizontal use and VESA/wall mounting (included) • Optional vertical stand |
| Temperature Range: | • Operating • Horizontal position: 50° to 95° F (10° to 35° C) • Vertical position: Power button up: 50° to 104° F (10° to 40° C) • Storage: 14° to 140° F (-10° to 60° C) |
| Humidity: | • 20% to 80% condensing • 10% to 95% non-condensing |
| Security: | Built-in Kensington security slot (cable sold separately) |
| Safety Certifications: | • Ergonomics: German EKI-ITB 2000, ISO 9241-3/-8 • Safety: cULus 60950, TÜV-GS, EN 60950 • RF Interference: FCC Class B, CE, VCCI, C-Tick • Environmental: WEEE, RoHS Compliant |
| Warranty: | 3-year limited hardware warranty |
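As a back-of-envelope check on the display spec above, the raw (uncompressed) pixel bandwidth of the single-monitor maximum resolution shows why a remoting protocol like HDX must compress rather than ship raw frames:

```python
# Raw framebuffer bandwidth for the Xenith 2's single-display maximum
# (1920x1200 @ 60 Hz, 32 bpp, per the spec table). Pure arithmetic, no
# protocol overhead modeled.

def raw_bandwidth_gbps(width, height, bpp, refresh_hz):
    """Uncompressed pixel bandwidth in gigabits per second."""
    bits_per_second = width * height * bpp * refresh_hz
    return bits_per_second / 1e9

single = raw_bandwidth_gbps(1920, 1200, 32, 60)
print(round(single, 2))  # ~4.42 Gbps -- far beyond a typical office WAN link
```

Roughly 4.4 Gbps of raw pixels is orders of magnitude more than the links these devices sit on, which is the whole rationale for the HDX compression and hardware decode engines the Xenith 2 builds in.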
Jeff McNaught Interview One [CitrixTV YouTube channel, May 24, 2012]
Marvell Joins Citrix System-on-Chip Initiative to Bring Citrix HDX Technology for Thin Clients to Market [Marvell press release, May 9, 2012]
Santa Clara, California (May 9, 2012) – Marvell (Nasdaq: MRVL) today announced participation in the Citrix System-on-Chip (SoC) initiative to enable an entirely new generation of thin clients for high-definition virtual applications and desktops at a low cost. The Marvell® ARMADA® 510 SoC seamlessly integrates Citrix HDX capabilities into a complete silicon solution. The first of many ARMADA chips to be verified as part of the Citrix SoC initiative, the ARMADA 510 is a high-performance, highly integrated, low-power SoC comprised of an ARM v6/v7-compliant superscalar processor core, a hardware graphics processing unit, video decoding acceleration hardware and a broad range of peripherals, answering the need for fast processing and a rich multimedia user experience.
“The future of enterprise computing is in the convergence between mobile devices and digital content – it’s imperative that end users have access to the content they need from any device, whether it’s a thin client, tablet or smartphone. Citrix has been abreast of this monumental shift in the computing landscape for years – and now the Citrix SoC initiative makes it even easier for companies to deliver a new category of mobile-enterprise friendly devices to users quickly and affordably,” said Jack Kang, director of marketing for mobile at Marvell Semiconductor, Inc. “Working closely with Wyse, Marvell is proud to integrate the performance enhancements from Citrix SoC initiative onto Wyse’s performance rich Citrix HDX Ready T50 device based on Marvell’s ARMADA 510. Marvell is also working closely with Citrix to verify its full portfolio of highly scalable enterprise silicon solutions, from cloud servers to mobile and consumer end point devices, and we look forward to further collaborations with Citrix Ready partners to deliver new and exciting products throughout the enterprise.”
“Citrix XenDesktop delivers the capabilities to enable enterprise customers to begin or accelerate their migration to Windows 7 and beyond, while gaining the mobility, flexibility, and management benefits of desktop virtualization,” said Ankur Shah, principal product manager at Citrix Systems. “We welcome Marvell to the Citrix System-on-Chip initiative. Marvell’s broad portfolio of technology will enable a wide variety of devices to leverage the benefits of Citrix desktop virtualization technology.”
“Wyse is excited about Marvell’s partnership with Citrix on the Citrix SoC initiative,” said Kiran Rao, director of product management at Wyse Technology. “The end-to-end approach, incorporating Marvell’s high performance hardware with software optimized for HDX technology, enables Wyse to quickly bring innovative devices to market that provide a superior end user experience. Wyse’s compact, affordable Citrix HDX Ready T50 and T10 thin clients, as well as the new Xenith 2 zero client, powered by Marvell’s ARMADA 510 SoC will further expand access to cloud-based desktop virtualization using Citrix XenDesktop in the enterprise and beyond.”
Wyse and Microsoft discuss cloud PCs and OS licensing [WyseTechnology YouTube channel, May 19, 2011]
Wyse Z Class Thin Client [WyseTechnology YouTube channel, Jan 31, 2011]
Comparison of the current Z class products: Wyse Z90DE7, Wyse Z90D7, Wyse Z90S7, Wyse Z50D, Wyse Z50S, Wyse Z90DW
All with dual-core AMD G-T56N. The 4 Windows® Embedded Standard 7 based ones at 1.6 or 1.65 GHz while the 2 Wyse-enhanced SUSE Linux based ones at 1.5 and 1.6 GHz respectively. Memory is 2/4/8GB Flash + 2/4GB RAM, DDR3, depending on the model. Memory on 3 models is expandable, and on 3 Windows® Embedded Standard 7 based ones SSD storage is also supported. Power consumption is under 15 Watts (average) for all. Dimensions are 200 x 47 x 225 millimeters. Weight is 1.1kg.
Wyse Introduces World’s Fastest Thin Client Family [Wyse press release, Aug 29, 2011]
SAN JOSE, Calif. – 08/29/2011 – Today at VMworld® 2011, Wyse Technology, the global leader in cloud client computing, announced that its fastest thin clients ever, the [Windows® Embedded Standard 7 based] Wyse Z90D7 and Z90DW, are now shipping. In addition, Wyse today introduced two new Linux-based members of its Z class family – the Wyse Z50S and Wyse Z50D. The Wyse Z50 is the high performance thin client family based on Wyse Enhanced SUSE Linux Enterprise, the industry’s only enterprise-quality Linux operating system that combines the security, flexibility, and market-leading usability of SUSE Linux Enterprise from Novell with Wyse’s thin computing optimizations in management and user experience.
In connection with the availability of these breakthrough thin clients, Wyse also announced the results of independent testing, recently conducted by The Tolly Group, of the Wyse Z class versus the competition. Wyse made this announcement in connection with VMworld® 2011, the global conference for virtualization and cloud computing held in Las Vegas, August 29th through September 1st at The Venetian. As part of VMworld 2011, Wyse is demonstrating their award-winning virtualization, management, and cloud software and a wide range of thin, zero, mobile and cloud PC client hardware at Booth #1111.
At the heart of the Wyse Z class thin clients lies an entirely new engine, one where all the major system elements – CPU cores, vector engines, and a unified video decoder for HD decoding tasks – live on the same piece of silicon. This design concept eliminates one of the fundamental constraints that limit performance.
The Wyse Z class delivers a combination of performance, simplicity, and connectivity never before seen in a thin client. With available dual-core AMD G-series Fusion accelerated processing units, the Wyse Z class is the world’s best-performing thin client, able to support the most processing-intensive applications including 3D solids modeling, HD graphics simulation, and unified communications with ease. They also include the first SuperSpeed USB 3.0 connectivity in a thin client, enabling the newest peripherals and speeds up to 10 times faster than USB 2.0. With Wyse Z class thin clients, users have more display options than ever before including DisplayPort and DVI.
The Wyse Z class also includes advanced networking capabilities, with support for gigabit Ethernet, and available integrated A/B/G/N dual band Wi-Fi. They are compliant with the ENERGY STAR Version 5.0 Thin Client specification.
Independent testing by The Tolly Group recently confirmed the Z90D7’s substantial leadership position in thin client performance compared to rival products. In support of rich video-based Web applications, for example, the Z90D7 boasted a clear advantage in video playback quality while using just a fraction of its processing and memory capability. That equates to a clearly superior user experience on a much more energy-efficient platform. In addition, the Z90D7 scored up to five times higher in industry-standard performance ratings (CPU Mark, 3D Graphics Mark, and PassMark ratings) than the competition. Among secure, cost-effective, yet powerful thin clients, these independent tests confirmed that the Wyse Z class is the clear winner.
“Being able to combine power and performance in such an easily-managed device is something we are extremely proud of,” said Param Desai, Sr. Director, Product Management, with Wyse Technology. “With the availability of Wyse Z class we’ve more than doubled the performance capabilities of competing top-of-the-line thin clients with similar energy requirements.”
Built on the same exact advanced single and dual core processor hardware platform as the Wyse Z90 thin clients, the upcoming Linux-based Wyse Z50 promises more of the same industry leading power and capability on an enterprise-class Linux operating system.
“We are very familiar with the performance of Wyse products having deployed several Z90 devices throughout our campus,” according to Ryan Foster, Network Engineer at Montgomery County Community College in Southeast Pennsylvania. “We were particularly impressed with the improvements to our desktop security, and by the capabilities of these devices handling multimedia files such as audio, video and Flash.”
Supporting Quotes
“The Wyse Z Class and VMware View™ combine to take advantage of PCoIP® in ways that will enhance the end-user experience,” said Vittorio Viarengo, vice president, End-User Computing, VMware. “Better security, easier management and significant energy savings all combine in a high-performance thin client that will benefit both IT and end users.”
“Wyse has made innovative use of the AMD G-Series Accelerated Processing Unit which combines a multi-core CPU, a discrete-class DirectX® 11 capable GPU and HD video decoding in one tiny piece of silicon,” said Buddy Broeker, director of embedded solutions at Advanced Micro Devices. “The Wyse Z class takes full advantage of the processor’s unprecedented level of graphics integration that delivers a unique combination of performance and efficiency.”
Availability
For more information on Wyse Z90, including independent report results, please visit: http://www.wyse.com/products/hardware/thinclients/Z90
The Wyse Z50 will be available later this year.
Wyse PocketCloud Family Overview [WyseTechnology YouTube channel, Feb 21, 2012]
Wyse PocketCloud Personal Cloud [WyseTechnology YouTube channel, Sept 21, 2011]
More videos about the PocketCloud:
- Wyse PocketCloud wins 2011 Appy Award for Productivity category [WyseTechnology YouTube channel, March 3, 2011]
- Wyse PocketCloud 2.0 Features for iOS Devices [WyseTechnology YouTube channel, Oct 13, 2010]
- Wyse PocketCloud demo on an iPad [WyseTechnology YouTube channel, Dec 14, 2010]
- Wyse PocketCloud Features for Android Devices [WyseTechnology YouTube channel, Dec 8, 2010]
- Introducing Wyse PocketCloud [WyseTechnology YouTube channel, Sept 15, 2009]
- Wyse PocketCloud [WyseTechnology YouTube channel, Sept 15, 2009]
Dell Wyse
Focus on Dell [May 24, 2012]
Dell Completes Acquisition of Cloud Client Computing Leader Wyse Technology [Dell press release, May 25, 2012]
- With Wyse, Dell assumes a leadership position in Thin Clients[1]
- Dell’s new Desktop Virtualization capabilities, combined with Dell’s leadership position in Server, Storage and Networking solutions, successfully position the company as a true end-to-end IT vendor
Dell today announced it has completed its acquisition of Wyse Technology, the global leader in cloud client computing. The combination of Wyse’s capabilities with Dell’s existing desktop virtualization offerings positions the company as the leader in desktop virtualization, enabling it to offer true end-to-end IT solutions for customers and partners.
Dell has made significant strategic investments over the past three years to expand its enterprise technology and services capabilities. The Dell Wyse portfolio, together with current Dell desktop virtualization offerings, leading data center products such as servers and storage, and Dell’s services division, provides customers and partners with a single vendor that can match the full range of their cloud computing and desktop virtualization needs.
The Dell Wyse solution portfolio includes industry-leading thin, zero and cloud client computing solutions with advanced management, desktop virtualization and cloud software supporting desktops, laptops and next generation mobile devices. Dell Wyse has more than 180 patents, both issued and pending, covering its solutions, software and differentiated intellectual property. Dell’s existing offerings include Desktop Virtualization Solution Simplified and Desktop Virtualization Solution Enterprise.
Dell recognizes it is critical for its desktop virtualization solutions strategy to embrace simple device management, enhance security, scale, and boost user productivity, while providing the flexibility to support anytime, anywhere access on any device.
Dell plans to preserve Wyse’s channel offerings and all existing Wyse channel partners will be eligible for our PartnerDirect Program. Dell will combine the best of both companies’ channel deal registration programs, extend this new deal registration program to all partners, and introduce a program in which partners can grow and nurture a customer relationship.
Quotes
“We’re excited to officially welcome Wyse to Dell and help extend its industry-leading efforts to a broader range of customers and partners,” said Jeff Clarke, Dell vice chairman and president, Global Operations and End User Computing Solutions. “We believe the Dell Wyse capabilities, combined with our previous desktop virtualization offerings and the strength of the Dell enterprise portfolio, provides the most comprehensive and competitive DVS solution available today.”
“Wyse and Dell share the vision and passion in helping our customers and partners create a frictionless user experience via the cloud,” said Tarkan Maner, Vice President and General Manager Dell Wyse, Cloud Client Computing. “Combining our relentless IP innovation and tight operational skills, and most importantly our laser focus on customer and partner advocacy, Dell cloud client computing will develop and deliver the most advanced solutions globally, from the data center to the end user. We are and will be completely focused on the best user experience for any user, for any content, using any app, on any device, anytime, anywhere; without any conflict, compromise and constraint.”
“As a current customer who has deployed Wyse cloud client computing solutions with Dell PowerEdge servers and Dell EqualLogic storage, Western Wayne School District is excited about the combination of Dell and Wyse,” said Brian Seaman, Network Administrator at Western Wayne School District in Pennsylvania. “Like most school districts, Western Wayne operates in a budget constrained environment and our move to desktop virtualization technologies supported with strong enterprise infrastructure has enabled us to do more with less in service of our students and community. In working with Dell and Wyse to scope and deploy our computing environment, Western Wayne now has the right technology to help us achieve our vision of educating our students of today to become the productive citizens of tomorrow.”
“End point computing models continue to evolve and are accelerating tremendous innovation and efficiencies across enterprise desktop and personal computing,” said Bob O’Donnell, vice president, Clients and Displays, IDC. “One area of strong customer growth is in the desktop virtualization space and we expect to see adoption rates continue to grow over the next several years. As use models continue to mature, so do the vendors who offer solutions in this product space. Dell’s acquisition of Wyse results in an industry-leading solutions and services provider with a formidable end-to-end technology stack from the end point to the datacenter to the cloud.”
Dell to Acquire Wyse Technology Conference Call

Dave Johnson, Senior Vice President, Dell Corporate Strategy: We at Dell continue to execute on our strategy to develop and expand our solutions capability built on Dell’s intellectual property. These solutions are open, with a focus on enhancing customer productivity, delivering results faster and eliminating unnecessary complexity. We’re making great progress in delivering solid results on this strategy. Today’s announcement is an important next step in our end user computing strategy. It enhances our portfolio in the critical area of client computing and further supports our efforts to help our customers innovate end to end IT solutions from the edge to the core of the cloud. The acquisition of Wyse Technology complements and expands Dell’s existing desktop virtualization capabilities, allowing us to offer industry leading and differentiated solutions to a fast-growing segment of the end user computing space. In addition, it also provides synergies with our enterprise solutions business. Our ability to now offer an industry leading cloud client computing solution will provide opportunities for Dell to further accelerate the growth of our servers, storage and network portfolios. IDC estimates that worldwide thin client demand will grow 15 percent per year to approximately 3 billion by 2015, and that the end to end datacenter infrastructure stack for these solutions is expected to exceed 15 billion by 2015. And with Dell’s portfolio, we’ll be able to participate in this broader opportunity. Wyse Technology is a leader in the high growth and strategic area of cloud client computing, ranking number one worldwide in thin client unit share in the fourth quarter of 2011. Wyse delivered approximately $375 million in annual revenue over the trailing 12 months.
Wyse has approximately 500 employees with 150 employees in research and development, most of which are software engineers. In addition, it has approximately 250 sales specialists that are solely focused on selling Wyse cloud client computing end to end solutions. They have more than 3000 channel partners that sell Wyse technology on a global basis. This transaction expected to be accretive to Dell’s non-GAAP earnings in the second half of fiscal year 2013. Dell’s reputation as a trusted adviser to our customer, our distribution and sales capabilities combined with Wyse’s innovative solutions in cloud computing will help address customers’ needs and is a great strategic fit, both operationally and culturally for Dell. Finally, Dell has a strong track record of integrating acquisitions of this size. Based on experience with similar acquisitions, we expect this transaction to be accretive to earnings on a non-GAAP basis in the second half of this year. We’re really excited about welcoming Wyse to Dell and even more excited about the opportunities for our customers. Jeff Clarke, Vice Chairman, Global Operations and End User Computing Solutions: We see a growing opportunity in cloud client computing. This includes thin and zero client hardware, client infrastructure management software, virtualization end user optimization software, datacenter networking and implementation and managed services. It compliments and extends the desktop virtualization capabilities that Dell has today. These solutions offer customers an alternative compute model and helps enterprises enhance security, streamline desktop management and boost user productivity. Examples of the benefits that a cloud client computing solution can provide include, We have discussed our strategy and end user computer was first to strengthen our core business by implementing sustainable supply chain improvements and the results of which were evident in FY ’12. 
Our next goal was to deliver solutions that include compelling devices plus the tools to secure and manage that hardware, software and data. You’ve seen the results of that with some of our recent product announcements, as well as the strong growth of our transactional services business in FY ’12. And finally, we indicated our intentions to expand our reach into new and fast-growing areas of end user computing. The acquisition of Wyse Technology and its portfolio of industry leading capabilities is the next step in our end user computing strategy. Wyse is a global leader in client – excuse me – in cloud client computing. Its portfolio includes a wide selection of industry leading thin and zero client devices designed to integrate easily into a virtualized or web based infrastructure. It also includes differentiated IP in device management, thin client operating systems, and mobility software customized to offer the best user experience with Microsoft, Citrix and VMware virtual desktop infrastructures. Wyse solutions require less memory and processing power than other comparable thin client solutions, making them more cost competitive and effective for customers. To date, Dell has relied on shared IP solutions to serve its thin client customers. With this transaction, we are moving to a more profitable, industry leading and complete end to end solution with Dell owned IP and the associated R&D capabilities. Wyse Technology’s portfolio complements and extends Dell’s vision of providing innovative and complete end to end solutions to our customers. In addition, the combination of Wyse Technology with Dell’s brand and customer reach presents a dramatic increase in Wyse’s addressable demand. I’d like to leave you with the following takeaways: … Tarkan Maner, CEO of Wyse Technology: The entire team at Wyse is excited about joining the Dell team and becoming an integral part of enabling Dell’s end user computing vision.
This agreement is great news for our customers and channel members worldwide. We’ve been focused on delivering innovative solutions for our customers and channel members for the past 30 years now. To be exact, 31 years now. Dell and Wyse share a focus on delivering innovative IP, world class service support, and optimized overall value to our customers and channel members. Customers and channel members rely on Dell to provide comprehensive end to end IT solutions. Clearly, Dell’s distribution, reach and brand are well recognized across the industry, and it has industry leading capabilities across servers, storage, networking, services and end user computing solutions. Wyse has historically been recognized as a leader in cloud client computing, where our skills and capabilities in security, manageability, availability, reliability, lower total cost of ownership both in terms of CAPEX and OPEX, and scalability have been key differentiators in delivering the best value to our customers and channel members. Through the combination with Dell, we obviously see tremendous opportunities to grow our core desktop virtualization business, as well as to expand into new and fast growing market segments in mobility and cloud computing. These include infrastructure and content management as a service solutions from the cloud for large enterprises, for small and medium businesses, as well as consumers. We have extended our solutions into the unified communications space lately as well, providing voice, data, and video (what we call triple play) type of content delivery from the cloud for any user, for any content, for any app on any device, anywhere, anytime. And we would like to say, without compromise, without constraint or conflict. Our strong alliance ecosystem will be able to benefit from the extensive solutions portfolio they can now provide to their customers in teaming up with Dell.
The Dell PartnerDirect program currently has 100,000 channel members and a proven track record of effectively onboarding and training channel members of acquired companies. This is exciting for us. Wyse has a history of innovation across all of our product lines and has recently introduced many new solutions for our customers and channel members, with more than 180 patents; to be exact, 182 patents in cloud client computing. We believe that taking the next step at Dell is a very natural progression for our business and offers our customers and channel members some great advantages that are not available to us today at our scale and size. It is exciting to think about the potential of integrating Wyse’s technology and R&D capabilities with Dell’s reach, existing solutions, capabilities and reputation. We believe our customers and channel members worldwide will benefit in a big way from this entire combination. … Q: … just some more detail on Wyse’s hardware/software mix and margin structure, and what growth assumptions did you guys make to justify the price and over what time period, and did you make any assumptions about cross-selling Dell branded enterprise solutions when coming up with the price? Today, the majority of the revenue is from the thin client and zero client business, with a growing percentage of that revenue now starting to come from some other areas, including some of the things that Tarkan spoke about. … If we look at and project out a few years, clearly a big part of this transaction is the synergy that we would get from our datacenter solutions business, including servers, storage, networking, services, and software. We also would expect, you know, within the services space, maintenance and some ongoing hosting opportunities over time, and there are also opportunities even in software and peripherals (S&P) if you think about things like monitors and other items that you would sell in conjunction with a thin client solution.
… … Wyse as an independent entity has really been gaining momentum to grow into a number one market share position. In fact, their growth accelerated in their last fiscal year to 45 percent, far outstripping the mid-teens industry average growth, both historically and projected into the future for this segment. And that’s driven by the breadth of their portfolio and the differentiation that they bring to their customers. … the thin client portion of the entire stack is really a small piece. Our expectation and our experience has been that as we engage with our customers on helping them determine how to solve for this workload set of requirements – and it really is a workload that you’re talking about – you’re engaging at a much more comprehensive enterprise level about a solution. And if you move to a thin client solution, clearly the network, compute and storage move, whether that’s into a private cloud or a public cloud; it’s part of the entire solution. Wyse as an independent entity didn’t have, of course, access to the broad portfolio that we do. … So, we believe the combination of our services and enterprise capabilities with the added capabilities of Wyse in the client space is a great combination and will be extremely synergistic for us. … I think a key element is that much of their software value is captured in the hardware itself. So, for example, they build on top of the protocols in our industry and invent features ahead of others, whether that’s multi-monitor support, the integration of voice, data and video, and/or USB redirect. Their ability to put those features into the platform ahead of the industry has allowed Wyse to extract value for that from its customers. Also, as we mentioned in our remarks, their thin OS and the IP around the thin OS have allowed them to drive greater performance using less memory, and they extract value for that in the industry.
And then the bigger picture Dave hit on: for every thin client hardware dollar that exists in our industry, there’s $5 of enterprise servers, storage, networking and services that go along with that. So, our ability to really move into that $18 billion marketplace with an end to end set of solutions from Dell is certainly why we view the asset as a key piece. Q: Obviously, this is a capability that Dell probably could have developed internally. Does the fact that you decided to do this acquisition now suggest that you’re – Dell is seeing an inflection in the number of customers that are looking for these types of solutions, and maybe if you could just give a little more detail on that and what you’re hearing from customers at this point on thin client? … what we view is that the momentum around alternative computing is a trend that we see many customers continuing to experiment with and in many cases, beginning to deploy. The adoption rates are still relatively low for desktop virtualization, but there clearly are a lot of customers out kicking the tires, very similar to maybe a decade ago around server virtualization. Not that I’m comparing the two, but more of just the adoption rate. And we think this is an opportunity particularly in the verticals around financial services, government, and healthcare to really take a leadership position. Wyse Technology does have a leadership position in the thin client itself. We have a very strong presence in the enterprise and each of those verticals, and us building – and Dell now being able to build end to end vertical solutions for this set of customers where it makes sense is key. And again, I would emphasize we don’t see the entire world going to thin clients. We still think there’s healthy PC demand in the industry and there’s a balance of alternative computing that allows people to take advantage of securing their information, managing the assets in a very differentiated way.
And as Dave said, which I think is key in our thinking here, this is a different workload. We look at this workload from the device out on the edge to what we do in the datacenter, providing a set of services and value offerings to our customers. … This is really about specific use cases. For example, in regulated industries like healthcare and financial services, the value of centralizing your data for better access and control is a specific use case that this thin client desktop virtualization lends itself to. It also lends itself to environments and industries where, again, there’s a desire to simplify the endpoint and manage the application much more centrally. That is often the case in education and increasingly in some of the emerging geographies. So, we see this as an opportunity, again, to provide specific solutions to specific customer problems and a much more industry-centric approach to our business. Q: … do you have any specifics around what percentage of your VDI customers for Dell are incorporating a full PC versus a thin client? And then any thoughts as to whether there’s anything on the horizon that would, you know, increase the ratio of thin client penetration versus full PCs in virtualized installations? We don’t see any real dramatic change. The IDC forecast continues to project into the future a sort of steady 15 percent growth rate. So, there’s no apparent broad inflection point. And as we articulated a moment ago, these are mostly fairly specific situations where the value proposition applies. And so, today, the total opportunity, you know, counting the entire stack, is about $3 billion. And so that’s still a relatively de minimis piece of the overall PC industry. Q: But, just to be clear on that point, you do have customers who are virtualizing their desktop and still purchasing regular Dell PCs rather than thin clients? … A common deployment today is on a standard PC that’s been virtualized.
Yeah, I mean, we’ve seen that business grow in demand through last year and expect it to grow in demand this year. … And again, I don’t think a zero client or a thin client is an answer for all customers. I think in our mind the bigger message here is we now have a range of devices: an incredibly strong portfolio of thin client devices and zero client devices from Wyse, the standard Dell set of PCs, which do virtualization, and now the ability to manage those in a very differentiated way with the key software assets that we’re bringing on board, which extend to tablets and mobile phones. And the fact that in some cases these usage models are moving to the cloud, and the ability to do cloud client computing, I think is key, and a key element of this acquisition. Q: … You mentioned earlier some of the verticals that have been early adopters for this type of technology. Can you talk about what you think some of the remaining barriers to broader adoption may be and how, perhaps, Dell is still solving that, and what this acquisition does to help you there? … from a vertical perspective … we see growth both in the public sector and the private sector, obviously, both in large enterprise and midmarket. And from a bigger perspective, we see from time to time that some companies do not have the right level of datacenter portfolio and datacenter ecosystem. Sometimes we see certain customers in certain – in vertical industries or geographies complain about the fact they don’t have the right networking systems in the backend. … these open up an opportunity, obviously. So, those two are mostly the biggest barriers for deploying desktop virtualization centric cloud client portfolios and platforms. … I think the key elements – one of the opportunities we have is to change the value proposition, to make the total cost of ownership around manageability, securing the data and the devices much more efficient and attractive for our customers.
I think the differentiated technology that we’re getting, with the integration of Brad Anderson’s [Dell president, Enterprise Solutions] and Steve Schuckenbrock’s [Dell president of Services] businesses, allows us a unique position to do this for our customers. Q: … because you had mentioned seeing specific vertical opportunities, do you have any details on the split today of [Wyse] revenue by verticals or by geography? The geographic mix is roughly 40 percent U.S., 40 percent EMEA and 20 percent APJ. … from a vertical perspective, I would say 50 percent public sector, 50 percent private sector. When I say public sector, we mean, obviously, you know, state and local governments, healthcare, education, and federal government type of deployments, and also private sector, you get the point. In terms of customer size segmentation, I would say about 50 percent large enterprise, 50 percent midmarket/small business is our business at a very high level. Q: … if you expect to accelerate the growth rate actually from 45 percent, given synergies from Dell, and then, if you do or whatnot, is the revenue incremental or do you expect any substitutional revenue as well? Like, do you expect that maybe Dell client sales will be hurt by Wyse and then it wouldn’t be completely additive, we’d have to subtract a little from the client side? … our projection is that we will maybe conservatively grow with the industry relative to thin client. But, of course, as you’re pointing out, they didn’t have the ability to integrate the comprehensive solution with networking, storage, compute, as well as wrap all the services around it. So, much of the revenue acceleration is driven by those synergies that you’re pointing out, and we expect that to be significant in terms of the growth rates that we’ll be able to achieve through the entire offering that we will provide.
Q: … could you go back and speak to build versus buy, because it seems to me that Dell would have had a fairly easy time replicating the thin clients from Wyse. … Getting to your point about internal versus external, a comment on this: this is one of the industries, when you look at it, where Wyse and one other competitor basically had almost 50 percent of the market, and then it’s a tremendous drop off to the rest of the players, none greater than 10 percent. And so, the combination of Dell with Wyse will put us in a very dramatic number one – not dramatic, but clearly a number one market position. And so, there’s certain value, as you know, in being a significant player in that kind of an industry situation. … because one of the other elements of the question is build versus buy, could we have done this organically? And our view is, I think, very straightforward. This [Wyse] is a company that has 31 years of experience. They have the intellectual property, they have the software and, as Dave mentioned earlier, 150 R&D engineers, of which 140 are in software. We think the stickiness and the solution in the stack that I showed on one of the earlier slides is the software stack that brings together the edge device and the management software that manages it, which sits in the cloud or sits in the datacenter. And the ability, rather than building that software from essentially ground zero, to acquire those capabilities and that experience and the technology with it, puts us in, I think, a leadership position, and in a position, as we integrate this with Steve [Schuckenbrock’s] and Brad [Anderson]’s organizations and build out workloads and solutions, to move quite quickly in the marketplace, much quicker than we could have done it on our own. Q: … specifically, I noticed that one of your newer products, the T10, is on an ARM based platform, so what type of ARM engineers are you bringing to Dell?
… I’m just curious about ARM technology that’s being – will this further Dell’s ARM, I guess, initiatives? Well, the way that I’d like to answer that question is simply that we’re going to build client devices – desktops, notebooks, tablets, smart phones, thin clients, zero clients – at the appropriate hardware architecture. That will be a combination of x86 and ARM. Dell itself has a pretty strong capability around ARM processor architecture. And as we mentioned, there are only a dozen or so hardware engineers inside Wyse Technology that work on the hardware. So, us getting hardware competence or assets around the design of ARM from Wyse, that’s not the nature of this acquisition; it’s the 140 software engineers that were key. The hardware architects on the Dell side that are working on ARM implementation across the plethora of devices that I mentioned earlier would still be the core ARM architects and the knowledge base for our ARM implementations. The real question may lie in this: will we continue to support thin clients based on ARM architecture and this thin OS? Absolutely. We believe that’s part of the value proposition that Wyse has had in the marketplace today. It’s allowed them to move quite quickly in bringing new products to the marketplace, providing a performance advantage or a lower cost option, because they’ve done a great job in designing for cost and providing comparable features in the marketplace that others do in a more costly way. And on top of that, they innovate the platform, as I mentioned earlier, around the management stack. And then the promise of the software engineers being able to take things like Stratus and PocketCloud, build those around these platforms, and integrate Dell’s services around that with the rest of our Dell client assets, we think is an opportunity for us to differentiate with this acquisition. Q: … how this sort of positions yourself with Citrix and the VMware’s of the world, i.e.
you know, there’s not going to be any attempts to (inaudible) features and functionality you get with some of those software partners. … we have strong relationships with the key players in thin client computing and virtualization. Not only are we going to continue those partnerships, we’re going to grow those and foster even deeper relationships. … as you all know, we [Wyse] are pretty close partners with Microsoft, and we do a lot of work with VMware and with Citrix, as these providers, you know, provide desktop virtualization methodology and technology between the datacenter and end user computing platforms. So, we add to that value and partner heavily with them, and obviously that’s going to continue. And the opportunity now, obviously as Jeff said earlier, is that we’re bringing the datacenter, the network and the end user platform all in an integrated way to our customers for more value. So, we’re going to have more opportunities to partner with Microsoft, with VMware, with Citrix and others in that space. And also, one other piece to add: some of the software we provide is differentiated in the marketplace and is the leader in this space, also from the cloud – on the infrastructure management side, with a product called Wyse Stratus. And many of you on the phone are today using Wyse PocketCloud, the market leading product for content management from the cloud on any mobile device and also from your web browser, connecting your apps and content – voice, data, video – from your choice of cloud, private or public. So, these are all opportunities for us to do more with Microsoft, with VMware and Citrix as we move forward. And that’s a big differentiator.
Standards-based adaptive layouts in Windows 8 (and IE10)
March 24, 2012 9:01 pm / 5 Comments on Standards-based adaptive layouts in Windows 8 (and IE10)
Windows 8 Consumer Preview: Product Demo [on WindowsVideos YouTube channel, Feb 28, 2012]
With Windows 8 (and IE10) Microsoft is carrying out a future-proof web platform strategy as well. Below I’ve collected the standards-based adaptive layout technologies (as the most critical ones from a scaling point of view) implemented by the company for the current Windows 8 Consumer Preview released on Feb 29, 2012.
Windows 8 Consumer Preview: Making great Metro style apps [on WindowsVideos YouTube channel, Feb 29, 2012]
For this post watch at least the #2 Snap and Scale beautifully part between [2:42] and [3:20]!
The corresponding W3C specifications are indicated along, namely:
- CSS GRID LAYOUT: actively developed but still Exploring – upcoming: Working Draft
- CSS FLEXIBLE BOX LAYOUT: actively developed but already in Revising – upcoming: Working Draft
- CSS MULTI-COLUMN LAYOUT: in Testing but the CSS3 test suite is still in development – upcoming: Candidate Recommendation
- CSS EXCLUSIONS AND SHAPES: actively developed but still Exploring – upcoming: Working Draft
- CSS REGIONS: actively developed but still Exploring – upcoming: Working Draft
- CSS TEXT LEVEL 3 (with Hyphenation inside): actively developed but already in Revising – upcoming: Working Draft
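As a rough illustration of the first two modules, IE10 exposed the draft grid and flexible box syntax behind -ms- prefixes. The snippet below is a minimal sketch using those draft-era prefixed property names (which later changed in the final specifications); the selectors and track sizes are hypothetical:

```css
/* Grid layout, IE10 draft-era prefixed syntax */
.app {
  display: -ms-grid;
  -ms-grid-columns: 1fr 2fr;  /* fractional tracks split the available width 1:2 */
  -ms-grid-rows: 120px 1fr;   /* fixed header row, flexible content row */
}
.header {
  -ms-grid-row: 1;
  -ms-grid-column: 1;
  -ms-grid-column-span: 2;    /* header stretches across both columns */
}

/* Flexible box layout, IE10 draft-era prefixed syntax */
.toolbar {
  display: -ms-flexbox;
  -ms-flex-pack: justify;     /* distribute free space along the main axis */
}
.toolbar > button {
  -ms-flex: 1 0 auto;         /* each button may grow, never shrinks */
}
```

Because both modules size tracks and items relative to the available space, the same markup can reflow when a Metro style app is snapped to a narrow strip or scaled to a larger screen.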
Notes:
- For text layout CSS regions may be a better option than multi-column layout in situations where a more varied page layout is called for, or where there is a possibility that the inline content of an element could overflow the element.
- All of the layout constructs available in HTML are available to XAML developers as well. In this way developers in the C++ and managed (C# etc.) worlds have exactly the same capabilities, particularly with respect to the adaptive layout technologies described here from a web standards point of view.
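To make the multi-column versus regions contrast in the notes above concrete, here is a minimal sketch using the IE10-era syntax (multi-column shipped unprefixed, regions behind an -ms- prefix; the class and flow names are hypothetical, and IE10's actual regions implementation additionally constrained what could serve as the content source):

```css
/* Multi-column: the browser balances the element's own content
   into as many 20em columns as fit the available width */
article {
  column-width: 20em;
  column-gap: 2em;
}

/* Regions: content is detached from its source element and poured,
   in document order, into a chain of arbitrarily sized and
   positioned regions elsewhere in the page */
.story    { -ms-flow-into: story-thread; }
.region-1,
.region-2 { -ms-flow-from: story-thread; }
```

With regions, overflow past the last region in the chain can simply be clipped or given another region, which is what makes them attractive for varied, magazine-style layouts.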
Scaling to different screens [Building Windows 8 blog, March 22, 2012]
James Whittaker’s Quality Software Crusade from Academia to Microsoft, then Google and now back to Microsoft
March 14, 2012 3:37 pm / 4 Comments on James Whittaker’s Quality Software Crusade from Academia to Microsoft, then Google and now back to Microsoft
Updates: Why I joined Microsoft [published: March 21, 2012; written: March 13, 2012]
– James Whittaker @docjamesw 7:06 AM – 20 Mar 12 via web · Details
A web futurist is someone who hates the web as it is now and envisions a better future for it.
– Why I hate search [MSDN Blogs > JW on Tech , March 15, 2012]
…
We start from scratch each time. We search for things we’ve already found.
The problem with Internet search is that being stupid about it is profitable. The more ugly blue links you serve up, the more time users have to click on ads. Serve up bad results and the user must search again, and this doubles the number of sponsored links you get paid for. Why be part of the solution when being part of the problem pays so damn well? It’s 2012 and we are still typing search queries into a text box. Now you know why: a ‘find engine’ swims in the shallow end of the profit pool. Is it any surprise that technology such as Siri came from a company that doesn’t specialize in search? (Where do you place an ad in a Siri use case?)
There’s no more reason to expect search breakthroughs from Google than there is to expect electric car batteries to be made by Exxon.
We can do better. We’ve been searching for over a decade. We know every place possible where the online equivalent of car keys are found. We know where our online pet is, always. We know so many things about the world that no longer need to be served up as search “results.” (Results indeed! If users ever wake up and divorce their search engine, the “results” page is likely to be exhibit A in the separation hearing.)
Search, my friends, is broken. Finding things has become secondary to monetizing the search process. Fixing this situation is not in the best interest of the incumbents. Which, actually, is all well and good because the fix will need a more web-wide effort anyway. The companies that own the data sources, the companies that ingest, store and conflate that data, the myriad small development shops that do interesting things with the data, the cleverness of the people who curate the data and the power of crowdsourced know-how need to come together and make search … better? No, not better, irrelevant.
Search is dead. The web doesn’t need it and neither do we.
– Now you see it, in 20 years you won’t [MSDN Blogs > JW on Tech , April 12, 2012]
Google’s Marissa Mayer gave a Flintstonian glimpse at what search might look like in 20 years including “predicting what restaurant you might like in a new city” and “connecting you with strangers based on common interests.” Things that take entire seconds today will take … entire seconds in 2032. Thankfully, for Mayer at least, violating Moore’s Law carries no actual criminal or civil penalties.
In a nutshell, what Mayer (and I assume Google) is proposing is that in twenty years Google and the web will still be standing between knowledge and its consumption. Google has 40 billion reasons to be patient regarding the future.
…
You want a prediction of the future? The trend of disappearing search will continue. The web will melt into the background and humans will progressively be removed from their labor intensive and frustrating present by automation. In five years the web is likely to be completely invisible. You will simply express your intent and the knowledge you seek will be yours. Users will be seamlessly routed to apps capable of fulfilling their intent. Apps won’t need to be installed by a user; they will be able to find opportunities to be useful all by themselves, matching their capabilities with a user’s intent. You need driving directions? Travel reservations? Takeout? Tickets to a show? Groceries? Tell your phone, it will spare you the ugly links. It will spare you the landing page. It will spare you the ads. It will simply give you what you asked for. This is already happening today, expect it to accelerate.
…
End of Updates
James Whittaker @docjamesw 8:03 PM – 29 Feb 12 via web · Details
I got my team today. Hmm…what shall I do with 300 developers? You won’t have to wait long to find out.
About JW on Tech [MSDN Blogs > JW on Tech > About James Whittaker, March 13, 2012]
James Whittaker is a technology executive with a career that spans academia, start-ups and top tech companies. He is known for being a creative and passionate leader and for his technical contributions in testing, security and developer tools. He’s published dozens of peer reviewed papers and five books, and has won best speaker awards at a number of international conferences. During his time at Google he led teams working on Chrome, Google Maps and Google+. He is currently at Microsoft reinventing the web [in a Partner Development Manager role as per LinkedIn, and as a web futurist according to his twitter account].
Want to read more? James wrote How to Break Software, How to Break Software Security (with Hugh Thompson), and How to Break Web Software (with Mike Andrews). While at Microsoft, James transformed many of his testing ideas into tools and techniques for developers and testers, and wrote the book Exploratory Software Testing [Sept 4, 2009]. His current book was written when he was a test engineering director at Google and is called How Google Tests Software (with Jason Arbon and Jeff Carollo) [Whittaker’s twitter: getting close to end of printing, otherwise April 8, 2012].
How Google Tests Software [Google Testing Blog, May 26, 2011]:
Part 1 – Part 2 – Part 3 – Interlude – Part 4 – Part 5 – Part 6 – Q&A – Part 7
Large-scale Exploratory Testing: Let’s Take a Tour [SQEVideo, published on Oct 2, 2011]
[James Whittaker’s Google+ post, Feb 13, 2012]
Signing off of Google+
This will be my last post on Google+. Anyone interested in my post-Google career can follow me on Twitter (@docjamesw).
[James Whittaker’s Google+ post, Feb 3, 2012]
There comes a time when all good things must end and my time at Google is one of them. This is not one of those “Google let me down” rants, nor is it a “I love this company, keep up the good work” farewell … just a realization that even as my perf scores and profile within the company have risen my ability to lead has diminished. It’s time to stop being part of a team changing the world and time to go lead one. Unfortunately, the place to do that is elsewhere. Today is my last day.
Keep in touch with me on Twitter @docjamesw. Or not.
James Whittaker interview [TCLGroupLimited, Dec 3, 2011]
James Whittaker’s testing blog posts while with Google [Google Testing Blog, June 8, 2009 – Nov 15, 2011]
James Whittaker joins Google [Google Testing Blog, June 2, 2009]
By Patrick Copeland
I’m excited to announce that James Whittaker has joined us as our newest Test Director at Google.
James comes to us most recently from Microsoft. He has spent his career focusing on testing, building high quality products, and designing tools and process at industrial scale. In the not so distant past, he was a professor of computer science at Florida Tech, where he taught an entire software testing curriculum and issued computer science degrees with a minor in testing (something we need more schools to do). Following that, he started a consulting practice that spanned 33 countries. Apparently, fashion is not high on his list, as he has collected soccer jerseys from many of these countries and wears those during major tournaments. At Microsoft he wrote a popular blog, and in the near future you can expect him to start contributing here.
He has trained thousands of testers worldwide. He’s also written a set of books in the How to Break Software series. They have won awards and achieved best seller status. His most recent book, on exploratory testing, is coming out this summer. It is not a stretch to say that he is one of the most recognizable names in the industry and has had a deep impact on the field of testing. If you have a chance, strike up a conversation with James about the future of testing. His vision for what we’ll be doing and how our profession will change is interesting, compelling and not just a little bit scary.
Join me in welcoming James to Google!
James Whittaker’s testing blog posts while with Microsoft 1st time [posted between July 8, 2008 and May 21, 2009]
tour of the month: the exit-stage-right tour [MSDN Blogs > JW on Test, May 21, 2009]
All tours must eventually come to an end, and thus it is with my tour at Microsoft. I have resigned my position and am leaving the company. It was a great ride.
But the tours will continue. My book Exploratory Software Testing: Tips, Tricks, Tours and Techniques to Guide Manual Testers is in press and will appear through Addison-Wesley sometime this summer. I am truly thankful for the many wonderful testers at Microsoft who contributed wisdom, thoughts and even case studies to the effort. Special thanks go to Nicole Haugen, Geoff Staneff, David Gorena Elizondo, Shawn Brown and Bola Agbonile. Microsoft is full of great testers and even here, these guys manage to stand out.
I imagine that I will not be long in setting up a new blog as I have very much enjoyed this experience and being the only tester in Developer Division’s top ten bloggers was quite an honor. For that I thank you.
In case you are interested in my landing place, I can imagine that one or two of the more popular testing blogs around town will be talking about it.
Wish me luck …
before we begin [MSDN Blogs > JW on Test, July 8, 2008]
For those of you familiar with my writing I plan to update some of my more dated work (history of testing, testing’s ten commandments, and so forth) and preview some of the information that I will be publishing in paper and book form in the future. Specifically, I now (finally) have enough notes to revise my tutorial on manual exploratory testing: How to Break Software and will be embarking on that effort soon. This blog is where I’ll solicit feedback and report on my progress.
For now, here’s an update on what’s happening, testing-wise, for me at Microsoft:
- I am the Architect for Visual Studio Team System – Test Edition. That’s right, Microsoft is upping the ante in the test tools business and I find myself at the center of it. What can you expect? We’ll be shipping more than just modern replacements for tired old testing tools. We’ll be shipping tools to help testers to test: automated assistance for the manual tester; bug reporting that brings developers and testers together instead of driving them apart; and tools that make testers a far more central player in the software development process. I can’t wait!
- I am the Chair of the Quality and Testing Experts Community at Microsoft. This is an internal community of the most senior testing and quality thought leaders in the company. We kicked off the community with record-breaking attendance (the most of any of Microsoft’s technical network communities) at our inaugural event this past spring where some of our longest-tenured testers shared a retrospective of the history of testing at Microsoft followed by my own predictions for the future of the discipline. It was a lively discussion and underscored the passion for testing that exists at this company. In this quarter’s meeting we’re doing communal deep dives into the testing-related work that is coming out of Microsoft Research. MSR, the division responsible for Virtual Earth and the Worldwide Telescope also builds test tools! I can’t wait to ship some of this stuff!
- I am representing my division (DevDiv) on a joint project with Windows called aQuality Quest. Our quest is concerned with quality, specifically, what we need to do to ensure that our next generation of platforms and services are so reliable that users take quality for granted. Sounds like I took the blue pill, doesn’t it? Well, you won’t find us dancing around acting like our software is perfect. Anyone who has ever heard me speak (either before or after I joined Microsoft) has seen me break our apps with abandon. In this Quest, we’ll leave no stone unturned to get to the bottom of why our systems fail and what processes or technology can serve to correct the situation.
New hire into our group – James Whittaker [MSDN Blogs > Michael Howard’s Web Log, May 5, 2006]
I’m pleased to announce, actually I’m *thrilled* to announce, that James Whittaker has joined our group [SDL – Security Development Lifecycle]. James is a well-known author and speaker on software testing and security. He most recently worked as a professor of computer science at Florida Tech where he ran a huge software security research team. James created the “How to Break…” book series with Addison Wesley. He wrote How to Break Software [May 19, 2002], How to Break Software Security [May 19, 2003] and How to Break Web Software [Feb 12, 2006].
He’s also one of the folks behind the Holodeck testing tool.
He’s a cool guy, sharp as a tack, with a very dry sense of humor, so we should get along just fine! He’ll be a peer of mine, reporting to Steve Lipner [Trustworthy Computing Initiative chief], and is initially focused on our internal security and privacy training.
As I’m sure most of you will agree, hiring good security people takes time, and hiring talent like James is rare indeed.
Welcome, James!
[Michael Howard is now the chief security officer for Microsoft, a so-called Principal Cybersecurity Architect working with customers and partners. Before that he was a long-time member of the Security Development Lifecycle team, in fact a co-founder of it in 2001; the SDL is closely related to Microsoft’s now 12-year-old Trustworthy Computing Initiative.]
GTAC 2008 Keynote Address: The Future of Testing by James Whittaker of Microsoft [GoogleTechTalks, published on Apr 7, 2009]
An Interview with James Whittaker [Dr.Dobb’s Journal, Sept 26, 2006]
Michael Hunter interviews James Whittaker, noted testing guru and author, to shed some light on his testing philosophy.
James Whittaker is, I dare say, one of the celebrities of the testing world. He was long a professor of computer science at the Florida Institute of Technology, where he became well-known for his efforts to find ways to make testing a teachable skill. He and his research group there created innovative testing technologies and tools, including the popular runtime fault injection tool Holodeck, and became highly skilled at breaking software security. James founded Security Innovation to productize his work, but recently he has left both that company and teaching to join Microsoft as a Security Architect, where he is working to integrate testing into the Security Development Lifecycle (SDL).
James wrote How To Break Software – one of my favorite books on testing, co-wrote How To Break Software Security (also very good) with Hugh Thompson, and co-wrote How To Break Web Software (haven’t read it yet) with Mike Andrews. James’ talks at Microsoft are always standing room only; this interview will give you a taste of why.
DDJ: What was your first introduction to testing? What did that leave you thinking about the act and/or concept of testing?
JW: I was in graduate school in a software engineering group studying high assurance software engineering methodologies (cleanroom to be specific) and the bloody dev group met at 7:30 on Saturday mornings! I missed the first three meetings (dude, in grad school the nerd act doesn’t happen that early on a weekend) so the professor put me in charge of the independent test team (which I discovered was just me). So that left me with the idea that testers get more sleep than devs but that we need it because we are woefully outnumbered.
And that perception remains, sans the sleep part.
DDJ: What has most surprised you as you have learned about testing/in your experiences with testing?
JW: The sheer number of people *passionate* about testing, particularly at Microsoft. It gives me a great deal of confidence in the future knowing that such skill and talent is being applied to the hardest problem the discipline has to offer, which is quality.
DDJ: What is the most interesting bug you have seen?
JW: The most interesting bug is always the latest bug. Just today everyone in our group was surprised at an Inbox with thousands of recall status messages. Someone sent a mail from an alias of 1275 members, then recalled it. The recall then sent success/failure notices to EVERYONE on the alias. That’s 1275 x 1275 (about 1.6 million) emails! How’s that for exploiting a design flaw!
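The quadratic blowup in that anecdote is easy to check with a few lines. This is just a sketch of the arithmetic, not of Exchange’s actual recall mechanics:

```python
def recall_notices(alias_size: int) -> int:
    """Each of the alias's members receives the recall, and each recall
    attempt reports its success/failure back to every member of the
    alias, so the notice count grows quadratically with alias size."""
    return alias_size * alias_size

# The 1275-member alias from the anecdote:
print(recall_notices(1275))  # 1625625, i.e. about 1.6 million emails
```

Doubling the alias quadruples the mail storm, which is why a design flaw like this scales so viciously.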
DDJ: How would you describe your testing philosophy?
JW: Eyes open, brain on, test! Or the longer explanation covered in How to Break Software. Thanks for the chance to plug one of my books!
DDJ: What do you see as the biggest challenge for testers/the test discipline for the next five years?
JW: There are a number of trends that testers are going to have to grapple with. The first is that software is getting better. The result of this is that bugs are going to become harder and harder to find and the weaker testers will be relegated to Darwinian insignificance. Keeping sharp, building skills and maintaining a cutting edge testing knowledge has never been more important.
The second is that software process is finally taking over. For years processes haven’t much affected the way software is built (which doesn’t say much for legacy processes). But here at Microsoft the SDL is revolutionizing the way software is constructed. Testers have to figure out their role in this process. We have to be there, working, at project initiation and play a key role in every single phase of the lifecycle. Testing is not a task for the latter stages of the ship cycle. Testers who realize this and customize their work accordingly will rise in prominence within their product group and be able to influence the growth of the SDL rather than be steamrolled by it.
[See my Table Of Contents post for more details about this interview series.]
“if Microsoft is so good at testing, why does your software suck?” [MSDN Blogs > JW on Test, Aug 11, 2008]
What a question! I only wish I could convey the way that question is normally asked. The tone of voice is either partially apologetic (because many people remember that I was a major ask-er of that same question long before I became an ask-ee) or it’s condescending to the point that I find myself smiling as I fantasize about the ask-er’s computer blue-screening right before that crucial save. (Ok, so I took an extra hit of the kool-aid today. It was lime and I like lime.)
After 27 months on the inside I have a few insights. The first few are, I readily concede, downright defensive. But as I’ve come to experience firsthand, true nonetheless. The last one though is really at the heart of the matter: that, talent notwithstanding, testers at Microsoft do have some work to do.
I’m not going down the obvious path: that testing isn’t responsible for quality and to direct the question to a developer/designer/architect instead. (I hate the phrase ‘you can’t test quality in,’ it’s a deflection of blame and as a tester, I take quality directly as my responsibility.)
But I am getting ahead of myself. I’ll take up that baton at the end of this post. Let’s begin with the defensive points:
- Microsoft builds applications that are among the world’s most complex. No one is going to argue that Windows, SQL Server, Exchange and so forth aren’t complex, and the fact that they are in such widespread use means that our biggest competitors are often our own prior versions. We end up doing what we call “brown field” development (as opposed to ‘green field’ or version 1 development) in that we are building on top of existing functionality. That means that testers have to deal with existing features, formats, and protocols along with all the new functionality and integration scenarios, which makes it very difficult to build a big-picture test plan that is actually do-able. Testing real end-to-end scenarios must share the stage with integration and compatibility tests. Legacy sucks and functionality is only part of it…as testers, we all know what is really making that field brown! Be careful where you step. Dealing with yesterday’s bugs keeps part of our attention away from today’s bugs.
(Aside: Have you heard that old CS creationist joke: “why did it take god only seven days to create the universe?” The answer: “No installed base.” There’s nothing to screw up, no existing users to piss off or prior functionality and crappy design decisions to tiptoe around. God got lucky, us…not so much.)
- Our user-to-tester ratio sucks, leaving us hopelessly outnumbered. How many testers does it take to run the same number of test cases that the user base of, say, Microsoft Word can run in the first hour after it is released? The answer: far more than we have or could hire even if we could find enough qualified applicants. There are enough users to virtually ensure that every feature gets used in every way imaginable within the first hour (day, week, fortnight, month, pick any timescale you want and it’s still scary) after release. This is a lot of stress to put our testers under. It’s one thing to know you are testing software that is important. It’s quite another to know that your failure to do so well will be mercilessly exposed soon after release. Testing our software is hard, only the brave need apply.
- On a related point, our installed base makes us a target. Our bugs affect so many people that they are newsworthy. There are a lot of people watching for us to fail. If David Beckham wears plaid with stripes to fetch his morning paper, it’s scandalous; if I wore my underpants on the outside of my jeans for a week few people would even notice (in their defense though, my fashion sense is obtuse enough that they could be readily forgiven for overlooking it). Becks is a successful man, but when it comes to the ‘bad with the good’ I’m betting he’s liking the good a whole lot more. You’re in good company David.
But none of that matters. We’ll take our installed base and our market position any day. No trades offered. But still, we are always ready to improve. I think testers should step up and do a better job of testing quality in. That’s my fourth point.
- Our testers don’t play a strong enough role in the design of our apps. We have this “problem” at Microsoft that we have a whole lot of wicked smart people. We have these creatures called Technical Fellows and Distinguished Engineers who have really big brains and use them to dream really big dreams. Then they take these big dreams of theirs and convince General Managers and VPs (in addition to being smart they are also articulate and passionate) that they should build this thing they dreamt about. Then another group of wicked smart people called Program Managers start designing the hell out of these dreams and Developers start developing the hell out of them and a few dozen geniuses later this thing has a life of its own and then someone asks ‘how are we going to test this thing’ and of course it’s A LITTLE LATE TO BE ASKING THAT QUESTION NOW ISN’T IT?
Smart people who dream big inspire me. Smart people who don’t understand testing and dream big scare the hell out of me. We need to do a better job of getting the word out. There’s another group of wicked smart people at Microsoft and we’re getting involved a wee bit late in the process. We’ve got things to say and contributions to make, not to mention posteriors to save. There’s a part of our job we aren’t doing as well as we should: pushing testing forward into the design and development process and educating the rest of the company on what quality means and how it is attained.
We can test quality in; we just have to start testing a lot sooner. That means that everyone from TF/DE through the entire pipeline needs to have test as part of their job. We have to show them how to do that. We have to educate these smart people about what quality means and take what we know about testing and apply it not only to just binaries/assemblies, but to designs, user stories, specs and every other artifact we generate. How can it be the case that what we know about quality doesn’t apply to these early stage artifacts? It does apply. We need to lead the way in applying it.
I think that ask-ers of the good-tester/crappy-software question would be surprised to learn exactly how we are doing this right now. Fortunately, you’ll get a chance because Tara Roth, one of the Directors of Test for Office is speaking at STAR West in November. Office has led the way in pushing testing forward and she’s enjoyed a spot as a leader of that effort. I think you’ll enjoy hearing what she has to say.
Test Talk with James Whittaker [Oct 3, 2011]
James Whittaker has been in software testing for as long as he can remember; during his studies he wrote his dissertation on model-based testing. He made his name at Microsoft and recently ‘joined the enemy’ by moving to Google. His How to Break Software books are bestsellers, and the presentations in which he hacks websites live in front of an audience are fantastic. His latest book, on exploratory software testing, was released last year. I had the opportunity to ask him the questions below.
1. Can you introduce yourself and explain how you became a tester already during your time at the University of Tennessee-Knoxville, judging by the title of your dissertation?
My name is James Whittaker and I am a Director of Engineering at Google. I own Test for a bunch of Google products including the Chrome browser, the Chrome operating system, and Google Toolbar, as well as some Search and Geo products and a bunch of back-end data center infrastructure applications. I also own Development of engineering tools, including both developer and testing tools.

I got into test when I was a grad student. Mostly it was by default, as my software engineering research team met on Saturday mornings and I had better things to do early Saturday than spend it with a bunch of coding nerds. My professor gave me two choices: get fired or be a tester. Neither he nor I knew what a favor he did me at the time. I really hit the ground running and did a lot of innovative work in model-based testing. In fact, I got my PhD two years before any of those developers. Testing was a great career move for me even back then.
2. How hard was it to change from a Microsoft employee towards a Google employee, and is testing very different at these two companies?
Not hard at all. Microsoft was great preparation for Google. Culturally they are polar opposites, with Microsoft being more top down whereas Google is more engineering driven. Microsoft’s high ratio of testers to developers is a case in point: there, the decision about how many people are on a project is made by execs and managers. At Google it is made by engineers, and no engineer at Google believes a 1:1 ratio is necessary or even healthy. Fewer testers on a project means more involvement by developers in QA. I had a meeting yesterday with the development director for Chrome OS and the entire subject was what they could do to make our job easier. The director was genuinely concerned about whether his developers were engaging deeply enough on testing issues. A culture like that makes the 1:1 ratio irrelevant … everyone on the project is a tester.

3. I read somewhere that you are busy at Google with forging a future in which software just works. Is that possible, a world without software bugs?
Not in my lifetime. However, we are getting closer. Even a few years ago I had to pull the battery on my smartphone every week or so. I’ve not even turned my current one off for 3 months and it works fine. Quality assurance for software is much like health care for humans. Humans will always get sick, but with good prevention, good hygiene and regular maintenance our bodies do ok. We need to make testing like this: continuous and ongoing. One of the things that annoys me is the whole “push quality upstream” movement. Some people seem to believe that we can rig it so we just write perfect code. That’s like taking all your vitamins when you are a baby and then expecting a long healthy life. Obviously upfront debugging is good, but quality is an ongoing endeavor. It starts at the beginning and is a constant activity throughout the life of a product.

4. Patrick Copeland said that your vision on the future of testing is interesting, compelling and not just a little bit scary. Can you briefly tell us your vision, so we don’t get scared?
I am happy that it scares people and honored that it scares smart people like Patrick, who thinks deeply on these matters. Too many people are dogmatic about testing. Some say “avoid rigor and do only exploratory testing,” and they say it with a fervor that reminds me of religious fundamentalists who see only black and white. Others say the same of automation with the same amount of self-righteousness. One thing I do know is that when you think your world view is the only view, there is a problem. People like this have stopped thinking about alternatives. They’ve stopped being open minded. They’ve definitely stopped being right.

I am also not going to stand in the middle and start every answer with ‘it depends.’ It turns out that there are some absolutes. There are some testing problems that can be driven to extinction with automation. There are some problems where exploratory testing is exactly the opposite of a good idea. I think it is smart to be problem-oriented and not solution-oriented. The latter is the proverbial hammer solution where every problem looks like a nail because you sell hammers for a living. I’ve laid out my full vision for software testing in my latest book, but let me just say here that the part people find scary is that my vision requires far fewer testers than the world currently employs.
5. Your last book is about Exploratory Testing. Can you explain how taking the supermodel tour will improve our testing skills?
All the tours focus a tester’s attention. The idea is to test on purpose. Exploratory testing does not have to lack rigor, and it does not have to mean endless wandering in the hope that you find a bug. It also should be about finding important bugs. I find myself endlessly annoyed by speakers who show bugs that no one would care about. Any exploratory method can find easy bugs; what about the hard ones?

Many of the tours focus on a general class of bug. The Supermodel Tour, as a specific case, focuses on presentation-layer bugs. It asks you to first identify important properties of the UI and then choose paths that force those properties to change and then be displayed on the UI. We called it the Supermodel Tour to get the idea across that we are looking only skin-deep for bugs (only at the UI level). The tour gives both general guidance, in terms of focusing on displayable properties, and specific guidance about what part of the application should be visited during an exploratory session (the functions that allow you to change and then display those values). So you see that it requires some pre-work and planning but then allows for exploration once that planning is done. For example, in Maps we run the Supermodel Tour on our classification of landmarks. We make a list of all the landmarks (national parks, places of interest and so forth) in advance and explore the UI to find each location. We (actually Brendan Dhein) found a bug where Arlington National Cemetery was classified as a restaurant! It’s a subtle bug if you are just exploring, but if you are running the Supermodel Tour it jumps out at you. The idea is that a good tester can become a great tester with the right focus and by testing on purpose.
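The pre-work the tour requires can be sketched as a small checklist-driven loop. This is a minimal illustration of the idea, not Google’s actual Maps tooling; every name here is hypothetical:

```python
# Supermodel Tour sketch: enumerate the displayable properties you care
# about in advance, then explore the UI and flag presentation-layer
# mismatches. All names and data are illustrative.
EXPECTED_CATEGORY = {
    "Arlington National Cemetery": "cemetery",
    "Yellowstone": "national park",
    "Statue of Liberty": "place of interest",
}

def supermodel_tour(lookup_category):
    """Visit each landmark and report misclassifications.

    `lookup_category` stands in for whatever UI query the tester
    performs to see the displayed value.
    """
    bugs = []
    for landmark, expected in EXPECTED_CATEGORY.items():
        shown = lookup_category(landmark)
        if shown != expected:
            bugs.append(f"{landmark}: shown as {shown!r}, expected {expected!r}")
    return bugs

# Simulated UI containing the bug from the anecdote:
ui = {
    "Arlington National Cemetery": "restaurant",
    "Yellowstone": "national park",
    "Statue of Liberty": "place of interest",
}
print(supermodel_tour(ui.get))  # flags only the Arlington misclassification
```

The planning lives in the checklist; the exploration lives in how the tester drives the UI to each entry, which is exactly the mix of rigor and freedom the tour is meant to capture.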
6. What will your next book be about?
I’m writing a book called How Google Tests Chrome, which details our testing process start to finish on Chrome OS. It’s a totally open-kimono assessment of everything that we are doing. Right, wrong, false starts, great ideas, cool innovation, new tools and every test artifact we generate, from plans to test cases to open source automation. I am psyched about this as I don’t believe anyone has ever fully documented and published a complete project before, particularly one of the complexity of an operating system.

I plan to include the browser too, but as I have only written the first chapter on the test plan I do not want to overcommit!
7. How is testing managed at Google? From one place or per country or per application or … ?
It’s divided by product lines, or what we call “Focus Areas.” In my case I own the Client Focus Area. However, since I am in a remote site I also have authority over all the work that goes on in Kirkland and Seattle, Washington. I’m a busy boy but I like it that way. A single product would bore me.

It’s funny that on the dev side each product has a Director in charge of it, whereas I am the Director over many products. I have Test Managers over each product who have to interact with a Director on the dev side. So if you match us up one to one, you might have a Test Manager matching wits on a daily basis with a Development Director three levels above them in rank. You talk about character building, this is the place for that. Google test managers are a breed apart. Cream of the crop.
8. Are you still collecting soccer jerseys? And if so, is there one you really want to add to your collection?
Yes, and I cannot wait to wear them during the World Cup. By tradition I never wear anything else during that tournament (sorry for the visual). When people invite me to speak I often get them as gifts and I have dozens. I am hoping I get a Swiss one this trip (hint, hint), and I lost my Australian one (please don’t ask) and am looking for a replacement. But I have a lot of club jerseys too and will relish the chance to wear different colors. Send me a jersey and I’ll send you some signed books!

9. I hear you will be giving a keynote at the Swiss Testing Day. Can you give us a sneak preview of what it will be about?
“Testing On Purpose” is the title. I am talking in far more depth about how we are testing Chrome at Google. I hope to see you there.
All that testing is getting in the way of quality by James Whittaker, Part One [TCLGroupLimited, Dec 8, 2011]
All that testing is getting in the way of quality by James Whittaker, Part Two [TCLGroupLimited, Dec 8, 2011]
A Brave New World of Testing? An Interview with Google’s James Whittaker by Forrest Shull [IEEE Software, March/April 2012 pp. 4-7]
… In their introduction, the guest editors have compiled a list of questions related to what our future, cloud-intensive world is going to look like—many of which I’ve heard myself from colleagues in government and commercial positions. The one that I hear most often is this: How should organizations leverage the power of this approach to improve testing and quality assurance of software? To get an answer, I turned to James Whittaker, an engineering director at Google, which has been at the forefront of leveraging the cloud. James is a noted expert and author on software testing, whose team has been managing Google’s cloud computing testing. Some excerpts of our conversation:
What is it like right now, looking across cloud computing testing at Google? It sounds like a pretty major undertaking.
…
In one of your previous interviews, I came across a statement of yours that has become one of my favorite thought-provoking quotes. You said, “Anyone who says that testing is getting harder is doing it wrong.” Could you expand on this a bit?
…
In the cloud, all the machines automatically work together; there’s monitoring software available, and one test case will run anywhere. There’s not even a test lab. There’s just a section of the datacenter that works for you from a testing point of view. You put a test case there and it runs. And all of the different scheduling software that any datacenter uses to schedule tasks can be used to schedule tests. So, a lot of the stuff that we used to have to write and customize for our test labs, we just don’t need anymore.
…
The other thing the cloud has done is brought us closer to our users. Think of Google Maps: it’s really impossible to hire a group of testers to exhaustively test it. It’s literally a piece of software of planetary proportions. If there’s a bug in my address on Google Maps, I’m likely to be the only one who will find it. But the cloud also enables us to reach out to users who are early adopters to get better and richer bug feedback than we were ever able to do back in the client-server days, when once software got to the field it was very difficult to update and instrument. Now, it’s easy to update a datacenter, it’s easy to instrument a datacenter. If a customer finds a bug, it’s easy for them to tell us about it, and it’s easy for us to fix it and push that fix to all our users, by just refreshing a browser.
So the cloud really does change things. It’s a different model of development; it’s a different model of testing; it’s a different model of usage.
Regarding testers and the skill sets that they’ve traditionally been applying on the job, does the same skill set still apply? Or are people being asked to develop new skills to take advantage of all these cloud features?
…
So, if I can paraphrase what you’ve been saying, the cloud is changing the whole underlying economics of software development and software testing. It’s easier and quicker for a company to try something, push it out to users, hear from the users what the problems are, and fix them, than it is to follow the traditional path of getting the requirements right up front, then getting the architecture right and nailed down, then getting the coding done well ….
Absolutely. By the time you do all that stuff, you’re too late. Your competitor’s beaten you to the market. On the cloud, you can really release and iterate—that’s much more the development model of modern times.
But you have to be careful: Google’s not pushing software out to its users saying, “Hey, is this any good? We’re not sure!” There are a lot of intermediate steps. We have an internal process we call dogfooding, as in, if you’re trying to sell dog food, you should eat your own product first to make sure it’s okay. All our software is used internally first by Googlers before we push it out to the world. If you look at something like Google+, which we released last year, we used that internally among Googlers for many months before we released it. In that process of dogfooding Google+, we found far more bugs and far richer bugs than the test team associated with Google+ did.
The points you’re making, about having representative users from the beginning who are able to use the product and help mature it, represents a much bigger paradigm shift than I had originally realized.
To me, that is just one of the most crucial things that companies absolutely have to get good at. In the past, if you found a bug in, say, your browser, you didn’t know how to report it. You’d have to find some bug-reporting page on the vendor’s site, and it would ask you what operating system you were using and what version of the browser you were using, and what other plug-ins you had installed…. But the machine knows all that stuff! So the idea is that once you crash, or once a user finds a bug, you just grab that machine state and send it back to the vendor so that they can understand the state the user was in exactly.
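The “grab that machine state” idea can be sketched in a few lines: capture the environment programmatically and attach it to the report instead of asking the user to type it in. This is an illustrative sketch, not any vendor’s crash-reporting API; the field names are assumptions:

```python
import json
import platform
import sys

def bug_report(description: str) -> str:
    """Build a bug report that carries the reporter's machine state
    automatically, so the vendor gets exact facts rather than a
    user's recollection. Field names are illustrative only."""
    state = {
        "description": description,
        "os": platform.system(),
        "os_version": platform.release(),
        "runtime": sys.version.split()[0],
    }
    return json.dumps(state)

# The user supplies one sentence; everything else is captured for them.
report = json.loads(bug_report("Crash when saving a large document"))
print(sorted(report.keys()))
```

A real crash handler would add far more (loaded plug-ins, stack traces, recent logs), but the principle is the same: the machine already knows its own configuration, so the report should collect it, not the user.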
This seems like a very concrete model to use for functional testing. But does the same paradigm work if I’m worried about things like reliability, performance, or throughput?
Or better yet, security, privacy, and so on. I agree with you completely. I think the idea of paying top dollar for engineers to do functional testing really is an artifact of the 1990s and 2000s, and shouldn’t be something that companies invest in heavily in the future. But things like security, privacy, and performance are very technical in nature. You don’t do security testing without understanding a lot about protocols, machine states, or how the Web works; a lot of a priori knowledge is required. You can’t replace that. So when I give advice to functional testers who say that I’m predicting the end of their job, specialization is one of the things I recommend. Specialization is crucially important.
…
How does the simplistic testing model that we all learned in school—where you go first through unit testing, then integration testing, then system testing—adapt to the new paradigm?
We do integration testing, but we call it something different. People always say that Google just likes to change the names of things, but we did this one on purpose. We don’t have to integrate from environment to environment, but we do have to integrate across developers. So developer A writes one module, developer B writes another module; to us, integration testing hits both developer A’s and developer B’s modules together. And there is testing that you simply do not have to run on the cloud: any sort of configuration test, and any sort of load testing, just isn’t necessary in this new modern environment. Load is taken care of for you; if it slows down, new cells in the datacenter are spun off automatically.
When you hire new testers for your teams at Google, is there something in particular that you’re looking for? You mentioned specialization as being important, but is there anything else that makes a good cloud tester versus just a good tester?
…
For folks who are trying to move legacy systems onto the cloud, does their development and testing process look a lot different from what they’d use when trying to do something more greenfield?
…
Where are things going in the future? Will abstractions allow developers and testers to worry about even fewer issues over time, or will there be new things that we do need to worry about as more and more people go on the cloud?
There are definitely some new things that we’ll need to worry about. First and foremost, connecting to customers is going to be really important. As much as we have the server side of it down (instead of having a massively complex server, we just have this cloud that takes care of itself), there’s still a lot of variation on the device/user side. If you look at the number of Android devices that are out there, and the number of operating systems and apps that people have configured onto them, that is still a hard testing problem.
The cloud actually makes that easier, too. Crowdsourcing companies are now connecting certain specific people with specific devices to people who are writing apps on those devices. So the idea of leveraging the crowd through the cloud is definitely something that hasn’t been done before, and is a new phenomenon that we’re watching really carefully here.
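At its core, the crowd-through-the-cloud idea is a routing problem: pair each device/OS configuration an app needs with testers who actually own that hardware. A minimal sketch, with all names and configurations hypothetical:

```python
def match_testers(required_configs, testers):
    """Map each required (device, os) pair to the testers who own it."""
    matches = {}
    for config in required_configs:
        matches[config] = [
            name for name, devices in testers.items() if config in devices
        ]
    return matches

required = [("Nexus S", "Android 2.3"), ("Galaxy S II", "Android 4.0")]
testers = {
    "alice": {("Nexus S", "Android 2.3")},
    "bob": {("Galaxy S II", "Android 4.0"), ("Nexus S", "Android 2.3")},
}
print(match_testers(required, testers))
```

A real crowdsourcing service would layer scheduling, payment, and result collection on top, but the matching step is what makes the fragmented device landscape tractable.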
One thing is for sure, we’re never going to settle on a single platform. Humankind doesn’t seem to be capable of doing that, and I don’t think it would be a good thing to eliminate competition among platforms. The Linux/Windows competition has always been healthy, and the same thing is happening in the mobile space now. So we’re always going to have to develop for multiple platforms, and those platform owners are going to want to innovate as quickly as they can and they’re not always going to be checking with you or each other on those innovations, so the developers are just going to have to be on their toes.
Learn More
My conversation with James touched on many more issues than I could note here. If you’re interested in hearing more of the conversation we had, which ranged over additional issues such as cloud testing tools and handling privacy and robustness, then check out our half-hour audio interview at http://doi.ieeecomputersociety.org/10.1109/MS.2012.23.
More than anything else, my conversation with James made me aware again of the significant changes to the way we do business that accompany the cloud, and the new skills that are becoming important. Perhaps the best summary was James’ comments that “People really need to take the cloud seriously and rethink testing from the ground up. There are a lot of sacred cows in testing that just go away with the transition to the cloud. Keeping an open mind and taking advantage of the efficiencies of the cloud are going to be really important.” I certainly hope the remainder of this special issue on cloud computing will help give you useful food for thought in doing so.
Why I left Google [MSDN Blogs > JW on Tech, March 13, 2012]
Ok, I relent. Everyone wants to know why I left and answering individually isn’t scaling so here it is, laid out in its long form. Read a little (I get to the punch line in the 3rd paragraph) or read it all. But a warning in advance: there is no drama here, no tell-all, no former colleagues bashed and nothing more than you couldn’t already surmise from what’s happening in the press these days surrounding Google and its attitudes toward user privacy and software developers. This is simply a more personal telling.
It wasn’t an easy decision to leave Google. During my time there I became fairly passionate about the company. I keynoted four Google Developer Day events, two Google Test Automation Conferences and was a prolific contributor to the Google testing blog. Recruiters often asked me to help sell high priority candidates on the company. No one had to ask me twice to promote Google and no one was more surprised than me when I could no longer do so. In fact, my last three months working for Google was a whirlwind of desperation, trying in vain to get my passion back.
The Google I was passionate about was a technology company that empowered its employees to innovate. The Google I left was an advertising company with a single corporate-mandated focus.
Technically I suppose Google has always been an advertising company, but for the better part of the last three years, it didn’t feel like one. Google was an ad company only in the sense that a good TV show is an ad company: having great content attracts advertisers.
Under Eric Schmidt ads were always in the background. Google was run like an innovation factory, empowering employees to be entrepreneurial through founder’s awards, peer bonuses and 20% time. Our advertising revenue gave us the headroom to think, innovate and create. Forums like App Engine, Google Labs and open source served as staging grounds for our inventions. The fact that all this was paid for by a cash machine stuffed full of advertising loot was lost on most of us. Maybe the engineers who actually worked on ads felt it, but the rest of us were convinced that Google was a technology company first and foremost; a company that hired smart people and placed a big bet on their ability to innovate.
From this innovation machine came strategically important products like Gmail and Chrome, products that were the result of entrepreneurship at the lowest levels of the company. Of course, such runaway innovative spirit creates some duds, and Google has had their share of those, but Google has always known how to fail fast and learn from it.
In such an environment you don’t have to be part of some executive’s inner circle to succeed. You don’t have to get lucky and land on a sexy project to have a great career. Anyone with ideas or the skills to contribute could get involved. I had any number of opportunities to leave Google during this period, but it was hard to imagine a better place to work.
But that was then, as the saying goes, and this is now.
It turns out that there was one place where the Google innovation machine faltered and that one place mattered a lot: competing with Facebook. Informal efforts produced a couple of antisocial dogs in Wave and Buzz. Orkut never caught on outside Brazil. Like the proverbial hare confident enough in its lead to risk a brief nap, Google awoke from its social dreaming to find its front runner status in ads threatened.
Google could still put ads in front of more people than Facebook, but Facebook knows so much more about those people. Advertisers and publishers cherish this kind of personal information, so much so that they are willing to put the Facebook brand before their own. Exhibit A: http://www.facebook.com/nike, a company with the power and clout of Nike putting their own brand after Facebook’s? No company has ever done that for Google and Google took it personally.
Larry Page himself assumed command to right this wrong. Social became state-owned, a corporate mandate called Google+. It was an ominous name invoking the feeling that Google alone wasn’t enough. Search had to be social. Android had to be social. YouTube, once joyous in their independence, had to be … well, you get the point. Even worse was that innovation had to be social. Ideas that failed to put Google+ at the center of the universe were a distraction.
Suddenly, 20% meant half-assed. Google Labs was shut down. App Engine fees were raised. APIs that had been free for years were deprecated or provided for a fee. As the trappings of entrepreneurship were dismantled, derisive talk of the “old Google” and its feeble attempts at competing with Facebook surfaced to justify a “new Google” that promised “more wood behind fewer arrows.”
The days of old Google hiring smart people and empowering them to invent the future was gone. The new Google knew beyond doubt what the future should look like. Employees had gotten it wrong and corporate intervention would set it right again.
Officially, Google declared that “sharing is broken on the web” and nothing but the full force of our collective minds around Google+ could fix it. You have to admire a company willing to sacrifice sacred cows and rally its talent behind a threat to its business. Had Google been right, the effort would have been heroic and clearly many of us wanted to be part of that outcome. I bought into it. I worked on Google+ as a development director and shipped a bunch of code. But the world never changed; sharing never changed. It’s arguable that we made Facebook better, but all I had to show for it was higher review scores.
As it turned out, sharing was not broken. Sharing was working fine and dandy, Google just wasn’t part of it. People were sharing all around us and seemed quite happy. A user exodus from Facebook never materialized. I couldn’t even get my own teenage daughter to look at Google+ twice, “social isn’t a product,” she told me after I gave her a demo, “social is people and the people are on Facebook.” Google was the rich kid who, after having discovered he wasn’t invited to the party, built his own party in retaliation. The fact that no one came to Google’s party became the elephant in the room.
Google+ and me, we were simply never meant to be. Truth is I’ve never been much on advertising. I don’t click on ads. When Gmail displays ads based on things I type into my email message it creeps me out. I don’t want my search results to contain the rants of Google+ posters (or Facebook’s or Twitter’s for that matter). When I search for “London pub walks” I want better than the sponsored suggestion to “Buy a London pub walk at Wal-Mart.”
The old Google made a fortune on ads because they had good content. It was like TV used to be: make the best show and you get the most ad revenue from commercials. The new Google seems more focused on the commercials themselves.
Perhaps Google is right. Perhaps the future lies in learning as much about people’s personal lives as possible. Perhaps Google is a better judge of when I should call my mom and that my life would be better if I shopped that Nordstrom sale. Perhaps if they nag me enough about all that open time on my calendar I’ll work out more often. Perhaps if they offer an ad for a divorce lawyer because I am writing an email about my 14 year old son breaking up with his girlfriend I’ll appreciate that ad enough to end my own marriage. Or perhaps I’ll figure all this stuff out on my own.
The old Google was a great place to work. The new one?
Nokia under transition (as reported by the company)
March 11, 2012 12:52 am / 6 Comments on Nokia under transition (as reported by the company)
Note and updates: stock price is up 3.17% as per above (those numbers are in US$)
– see more: Nokia trying the first Lumia month in China with China Telecom exclusive [March 28, 2012]
– Nokia seeks to retake China market share [Reuters, March 28, 2012]: “Shares in Nokia rose 3 percent to 4.116 euros, helped also after Sweden’s Swedbank lifted its rating to “buy” from “neutral”.”
– Are Nokia’s Largest Shareholders Betting on a Turnaround With New Releases in China? [Wall St. Cheat Sheet, March 28, 2012]
279 institutional firms indicated owning shares of Nokia Corporation (NYSE:NOK) in both Q3 2011 and Q4 2011. These firms reported owning a total of 348.305 million shares on 09/30/2011 and 382.757 million shares [out of 3.74B, i.e. ~10%] on 12/31/2011. The shares closed at $5.66 on 09/30/2011 and $4.82 on 12/31/2011, for an aggregate market value of $1.971 billion and $1.845 billion, respectively.
– Nokia: The Recovery Begins; One Analyst Turns Bullish [Forbes, March 30, 2012]
… Town Hall Investment Research analyst Jamie Townsend this morning upped his rating on Nokia to Buy from Avoid.
His view: for Nokia, the turnaround has begun. And for that he credits the company’s still unfolding new relationship with Microsoft, and its decision to adopt Windows Phone 7 as the operating system for its high-end smartphones.
“Our renewed enthusiasm is primarily driven by Nokia’s smartphone business and our belief that long term the company is now poised to slowly reestablish itself as a meaningful player in smartphone markets around the world,” Townsend writes in a research note. “While we believe that Q1 and Q2 2012 will continue to show the struggle between the death of Symbian and the rise of WP7, we also believe the pieces are now in place for a gradual reversal in the market share losses experienced in the last three years. Specifically, we are expecting positive unit surprises in the U.S. and Western Europe over the next two quarters, albeit coming off a very low base and expectations. While only a wild card right now, we also believe that some sort of partnership between Microsoft, Nokia and RIM is now a real possibility.”
…
“We believe that there are two issues for RIM that relate to NOK,” he writes. “First, we believe that RIM is now where NOK was approximately a year ago. There was no longer any doubt as to the declining state of the smartphone business but also no clear path to recovery. As we know from Nokia’s last year, the recovery required bold action and a long lead time to the actual point of product improvement. We believe investors should wait until the recovery is clear, which in our view is not yet the case with RIM, but is now on the near horizon for NOK.”
“Second, RIM management on the quarterly conference call made it abundantly clear that the company is seeking a new partnership that will allow it to enhance its consumer appeal but allow it to focus its attention on its core historical strength with the enterprise,” he adds. “We believe that this strategy carries a number of risks, but also believe that Nokia/Microsoft represents the most likely candidate for such a partnership. We have no data points to support that this will happen or that Nokia/Microsoft would want it to, but believe it to be a real possibility over the next six months. Should it occur we believe it would be perceived as a meaningful positive for NOK shares.”
NOK this morning is up 7 cents, or 1.2%, to $5.49.
End of updates
According to the below excerpts from the Nokia 2011 fiscal year report [March 8, 2012]
Current strategic business units, their responsibilities and accountabilities:
[F-9] As of April 1, 2011, the Group’s operational structure featured two new operating and reportable segments: Smart Devices and Mobile Phones, which combined with Devices & Services Other and unallocated items form Devices & Services business.
As of October 1, 2011, the Group formed a Location & Commerce business which combines NAVTEQ and Nokia’s social location services operations from Devices & Services. Location & Commerce business is an operating and reportable segment. From the third quarter 2008 until the end of the third quarter 2011, NAVTEQ was a separate reportable segment of Nokia. As a consequence, Nokia currently has four operating and reportable segments: Smart Devices and Mobile Phones within Devices & Services, Location & Commerce and Nokia Siemens Networks.
Prior year segment specific results for 2009 and 2010 have been regrouped and recast for comparability purposes according to the new operational structure.
[F-26] Nokia’s reportable segments represent the strategic business units that offer different products and services. The chief operating decision maker receives monthly financial information for these business units. Key financial performance measures of the reportable segments include primarily net sales and contribution/operating profit. Segment contribution for Smart Devices and Mobile Phones consists of net sales as well as its own, directly assigned costs and allocated costs but exclude major restructuring projects/programs and certain other items that are not directly related to the segments. Operating Profit is presented for Location & Commerce and Nokia Siemens Networks. Nokia evaluates the performance of its segments and allocates resources to them based on operating profit/contribution.
Smart Devices focuses on smartphones and smart devices and has profit-and-loss responsibility and end-to-end accountability for the full consumer experience, including product development, product management and product marketing. ([52] Nokia’s portfolio of smartphones covers price points ranging from around EUR 100 to more than EUR 500, excluding taxes and subsidies. During 2011, we shipped approximately 77.3 million smartphones.)
Mobile Phones focuses on mass market feature phones and related services and applications and has profit-and-loss responsibility and end-to-end accountability for the full consumer experience, including development, management and marketing of feature phone products, services and applications. ([54] Nokia’s portfolio of feature phones covers a wide range of price points from the Nokia 100, our most affordable device which costs about EUR 20, excluding taxes and subsidies, through to devices with more premium features costing upwards of EUR 100, excluding taxes and subsidies. During 2011, we shipped approximately 339.8 million feature phones.)
Devices & Services Other includes net sales of Vertu, spare parts and related cost of sales and operating expenses, as well as intellectual property related royalty income. Operating expenses of Devices & Services Other also include common research and development. Other income and expenses include major restructuring projects/programs related to the Devices & Services business as well as other unallocated items.
Location & Commerce develops a range of location-based products and services for consumers, as well as platform services and local commerce services for the Group’s feature phones and smartphones ([96] in support of our strategic goals) as well as ([96] a portfolio of products for the broader Internet ecosystem, including products for our direct competitors) for other device manufacturers, application developers, Internet service providers, merchants, and advertisers. Location & Commerce also continues to serve NAVTEQ’s existing customers both in terms of provision of content and as a business-to-business provider of map data ([56]providing comprehensive digital map information and related location-based content and services for mobile navigation devices, automotive navigation systems, Internet-based mapping applications and government and business solutions). Location & Commerce has profit and loss responsibility and end-to-end accountability for the full consumer experience.
Nokia Siemens Networks provides a portfolio of mobile, fixed and converged network technology, as well as professional services including managed services, consultancy and systems integration, deployment and maintenance to operators and service providers.
[F-71] Nokia Siemens Networks B.V., the ultimate parent of the Nokia Siemens Network group, is owned approximately 50% by each of Nokia and Siemens and consolidated by Nokia. Nokia effectively controls Nokia Siemens Networks as it has the ability to appoint key officers and the majority of the members of its Board of Directors, and accordingly, Nokia consolidated Nokia Siemens Networks.
Business and segment information:
| | 2009 | 2010 | 2011 |
| --- | ---: | ---: | ---: |
| Devices & Services | | | |
| Net sales (EUR M) | 27,853 | 29,134 | 23,943 |
| Operating profit (EUR M) | 3,564 | 3,540 | 884 |
| Gross margin | 33.1% | 29.9% | 27.7% |
| Operating margin | 12.8% | 12.2% | 3.7% |
| Volume (M units) | 431.8 | 452.9 | 417.1 |
| ASP (EUR) | 64 | 64 | 57 |
| Smart Devices | | | |
| Net sales (EUR M) | 12,649 | 14,874 | 10,820 |
| Gross margin | 37.2% | 30.8% | 23.7% |
| Contribution margin | 11.4% | 9.3% | -3.8% |
| Volume (M units) | 67.8 | 103.6 | 77.3 |
| ASP (EUR) | 187 | 144 | 140 |
| Mobile Phones | | | |
| Net sales (EUR M) | 14,644 | 13,696 | 11,930 |
| Gross margin | 28.5% | 28.0% | 26.1% |
| Contribution margin | 15.3% | 17.0% | 12.4% |
| Volume (M units) | 364.0 | 349.2 | 339.8 |
| ASP (EUR) | 40 | 39 | 35 |
| Location & Commerce | | | |
| Net sales (EUR M) | 756 | 869 | 1,091 |
| Operating profit (EUR M) | -594 | -663 | -1,526 |
| Gross margin | 82.7% | 80.6% | 80.4% |
| Operating margin | -78.6% | -76.3% | -139.9% |
| Nokia Siemens Networks | | | |
| Net sales (EUR M) | 12,574 | 12,661 | 14,041 |
| Operating profit (EUR M) | -1,639 | -686 | -300 |
| Gross margin | 27.1% | 26.8% | 27.1% |
| Operating margin | -13.0% | -5.4% | -2.1% |
| Nokia Group | | | |
| Net sales (EUR M) | 40,984 | 42,446 | 38,659 |
| Operating profit (EUR M) | 1,197 | 2,070 | -1,073 |
| Gross margin | 32.4% | 30.2% | 29.3% |
| Operating margin | 2.9% | 4.9% | -2.8% |
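The operating margin rows above are simply operating profit divided by net sales. A quick check of two of the figures from the table:

```python
def operating_margin(operating_profit_m: float, net_sales_m: float) -> float:
    """Operating margin in percent, from EUR-million figures."""
    return round(100 * operating_profit_m / net_sales_m, 1)

# 2011 Devices & Services: EUR 884M profit on EUR 23,943M net sales
print(operating_margin(884, 23943))    # -> 3.7
# 2011 Nokia Group: EUR -1,073M on EUR 38,659M net sales
print(operating_margin(-1073, 38659))  # -> -2.8
```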
The overall market situation and the related Nokia strategies and actions:
Devices & Services:
[87] In 2011, the global mobile device market benefited from continued strength in key growth markets, such as the Middle East and Africa, Greater China and Latin America and, according to our estimate, industry mobile device volumes increased by 11% during the year. Smartphones continued to capture the major part of the volume and value growth, as well as the public focus, in the mobile device market. We estimate that our mobile device volume market share was 26% in 2011, compared to an estimated 32% in 2010, with the decline primarily driven by market share losses in the smartphones segment.
In February 2011, we announced our new strategy for our Devices & Services business, which has three core elements.
First, in smartphones, we announced our partnership with Microsoft, discussed below, to bring together our respective complementary assets and expertise to build a new global mobile ecosystem for smartphones. Under the partnership, formalized in April 2011, we are adopting and licensing Windows Phone from Microsoft as our primary smartphone platform. We launched our first Nokia products with Windows Phone under the Lumia brand in October 2011.
Second, in feature phones, our strategy continues to be to leverage our innovation and strength in growth markets to connect the next billion people to the Internet and information. Through our investments in developing assets designed to bring a modern mobile experience – software, services and applications – we believe we have the opportunity to connect the “next billion” aspirational consumers around the world to the Internet and information, especially in key emerging markets.
Third, we believe we must also invest to take advantage of future technology disruptions and trends. Through ongoing research and development, we plan to explore and lead next-generation opportunities in devices, platforms and user experiences to support our industry position and longer-term financial performance.
The competitive landscape for that is the following:
[60] The mobile device market continues to undergo significant changes, most notably due to the broad convergence of the mobile telecommunications, computing, consumer electronics and Internet industries. With the traditional feature phone market continuing to mature, a major part of volume and value growth in the industry has been in smartphones offering access to the Internet. Additionally, other large handheld Internet-centric computing devices, such as tablets and e-readers, have emerged, trading off pocketability and some portability for larger screen sizes, but in many cases offering both cellular and non-cellular connectivity in the same way conventional mobile devices do. Due to their larger size, such devices are not replacing conventional mobile devices, but are generally purchased as a second device. Nevertheless, larger-screened Internet-enabled devices have captured a significant share of consumer spend across the broader market for mobile products and digital content and in different ways. For example, some competitors seek to offer hardware at a low price to the consumer with the aim of capturing value primarily through the sale of content.
The increasing demand for wireless access to the Internet has had a significant impact on the competitive landscape of the market for mobile products and digital content. Companies with roots in the mobile devices, computing, Internet and other industries are increasingly competing directly with one another, making for an intensely competitive market across all mobile products and services. At the same time, and particularly in the smartphone and tablets segments, success for hardware manufacturers is increasingly shaped by their ability to build, catalyze or be part of a competitive ecosystem, where different industry participants, such as hardware manufacturers, software providers, developers, publishers, entertainment providers, advertisers and e-commerce specialists are forming increasingly large communities of mutually beneficial partnerships in order to bring their offerings to the market. A vibrant ecosystem creates value for consumers, giving them access to a rich and broad range of user experiences. As a result, the competitive landscape is increasingly characterized in terms of a “war of ecosystems” rather than a battle between individual hardware manufacturers or products.
At the heart of the major ecosystems is the operating system and the development platform upon which devices are based and services built. In smartphones, our competitors are pursuing a wide range of strategies. Many device manufacturers are utilizing freely available operating systems, the development of which is not paid for from device sales revenue or software license fees. The availability of Google’s Android platform has made entry into and expansion in the smartphone market easier for a number of hardware manufacturers which have chosen to join Android’s ecosystem, especially at the mid-to-low range of the smartphone market. For example, some competitors’ offerings based on Android are available for purchase by consumers for below EUR 100, excluding taxes and subsidies, and thus address a portion of the market which has been traditionally dominated by feature phone offerings, including those offered by Nokia. Accordingly, lower-priced smartphones are increasingly reducing the addressable market and lowering the price points for feature phones.
In general, we believe product differentiation with Android is more challenging, leading to increased commoditization of these devices and the resulting downward pressure on pricing. In addition, there is uncertainty in relation to the intellectual property rights in the Android ecosystem, which we believe increases the risk of direct and indirect litigation for participants in that ecosystem. Google, HTC, LG, Motorola, Samsung and Sony Ericsson are among competitors which have deployed the Android operating system on their smartphones. Samsung is among our strongest competitors, competing with us across a broad range of price points.
Other companies favor proprietary operating systems, including Apple, whose popular high-end iPhone models use the iOS operating system, and Research in Motion (RIM), which deploys Blackberry OS on its mobile devices. Both Apple and RIM have developed their own application stores, through which users of their products can access applications.
Apple, which has already gained a strong position in the market for high-end smartphones and tablets, has also used the strength of its ecosystem to further expand its offering of digital content through other interfaces such as television sets. Similarly, Google has sought to extend the Android ecosystem with its Google TV Internet-based television service.
Nokia currently offers smartphones based on the Symbian, MeeGo and Windows Phone operating systems, and we are transitioning to using Windows Phone as our primary smartphone platform. Users of Symbian-based Nokia products can access digital content and third-party applications through Nokia Store, while users of our Windows Phone devices can access the Microsoft-run Marketplace for digital content and third-party applications. The Windows Phone operating system is also being deployed on smartphones by others, including HTC and Samsung.
The significant momentum and market share gains of the global ecosystems around the Apple and Android platforms have increased the competitive barriers to additional entrants looking to build a competing global smartphone ecosystem, such as Nokia with the Windows Phone platform. At the same time, other ecosystems are being built which are attracting developers and consumers, and which may result in potential fragmentation among ecosystem participants and the inability of new ecosystems to gain sufficient competitive scale.
We also face intense competition in feature phones where a different type of ecosystem from that of smartphones is emerging involving very low-cost components and manufacturing processes, with speed to market and attractive pricing being critical success factors. In particular, the availability of complete mobile solutions chipsets from low-cost reference design chipset manufacturers has lowered the barriers of market entry and enabled the very rapid and low-cost production of feature phones by numerous manufacturers in China and India, which are gaining significant market share in emerging markets, as well as bringing some locally relevant innovations to market. Such manufacturers have also demonstrated that they have significantly lower gross margin expectations than we do.
We also face competition from vendors of unlicensed and counterfeit products with manufacturing facilities primarily centered around certain locations in Asia and other emerging markets which produce inexpensive devices with sometimes low quality and limited after-sales services that take advantage of commercially-available free software and other free or low-cost components, software and content. In addition, we compete with non-branded feature phone manufacturers, including mobile network operators, which offer mobile devices under their own brand, as well as providers of specific hardware and software layers within products and services at the level of those layers rather than solely at the level of complete products and services and their combinations. In the future, we may face competition from established Internet companies seeking to offer smartphones under their own brand.
Our competitors use a wide range of other strategies and tactics. Certain competitors choose to accept significantly lower profit margins than we are targeting. Certain competitors have chosen to focus on building products and services based on commercially available components and content, in some cases available at very low or no cost. Certain competitors have also benefited from favorable currency exchange rates. Further, certain competitors may benefit from support from the governments of their home countries and other measures which may have protectionist objectives.
Transition:
[88] Year 2011 was a year of transition for Nokia. Prior to the announcement of our partnership with Microsoft in February 2011 and the adoption of Windows Phone as our primary smartphone platform, the Symbian and MeeGo operating systems were our primary smartphone platforms. Following our announcement of the Microsoft partnership, we expected to sell approximately 150 million more Symbian devices in the years to come and to ship one MeeGo device. However, the demand for our Symbian devices began to deteriorate. The consequent decline in our Smart Devices net sales and profitability was a result of both a decline in our Symbian smartphone volume market share and pressure on pricing as competitors aggressively capitalized on our platform and product transition. Towards the end of 2011, the competitiveness of our Symbian devices continued to deteriorate as changing market conditions created increased pressure on Symbian, which further adversely affected our Smart Devices net sales, profitability, market share and brand perception. In certain markets, there has been an acceleration of the trend towards lower-priced smartphones with specifications that are different from Symbian’s traditional strengths, which has contributed to a faster decline in our Symbian volumes than we anticipated. We expect this trend to continue in 2012.
To endeavor to maximize the value of our Symbian asset going forward, we expect to continue to ship Symbian devices to specific regions and distribution channels, as well as to continue to provide software support to our Symbian customers, through 2016. The software support for our Symbian customers was outsourced to Accenture commencing from September 2011. As a result of the changing market conditions, combined with our increased focus on Nokia products with Windows Phone, we believe we will sell fewer Symbian devices than previously anticipated.
Towards the end of 2011, we launched the Nokia Lumia 800 and Nokia Lumia 710, our first smartphones based on the Windows Phone platform. During 2011, we also launched the Nokia N9, which was the outcome of efforts in our MeeGo program. Since the start of 2012, we have continued to bring the Lumia experience to several more geographies, including the United States, where we have launched the Nokia Lumia 900, the first LTE device designed specifically for the North American market, which is available exclusively through AT&T. In late February 2012, we announced our intention to bring the Lumia 900 to markets outside the United States and introduced the Lumia 610, our lowest cost Lumia smartphone to date.
During the first half of 2011, our mobile device market share decline was further negatively affected by weakness in our feature phone portfolio, primarily due to the lack of a dual SIM offering. During the second half of 2011, however, the competitiveness of our feature phones improved when we introduced several dual SIM devices, as well as the new Nokia Asha range of feature phones, which offers a more smartphone-like user experience. These new additions helped us recapture some market share in the feature phone segment.
Year 2012 is expected to continue to be a year of transition, during which our Devices & Services business will be subject to risks and uncertainties, as our Smart Devices business unit continues to transition from Symbian products to Nokia products with Windows Phone and our Mobile Phones business unit continues to bring more smartphone-like features and design to our feature phone portfolio. Those risks and uncertainties include, among others, continued deterioration in demand for our Symbian devices; the timing, ramp-up and demand for our new products, including our Lumia devices; further pressure on margins as competitors endeavor to capitalize on our platform and product transition; and uncertainty in the macroeconomic environment. Mainly due to these factors, we believe that it is not appropriate to provide annual financial targets for 2012.
Longer-term, we target:
• Devices & Services net sales to grow faster than the market, and
• Devices & Services operating margin to be 10% or more, excluding special items and purchase price accounting related items.
Partnership with Microsoft:
[F-26] In February 2011, Nokia announced a partnership with Microsoft to bring together the respective complementary assets and expertise of both parties to build a new global mobile ecosystem for smartphones. The partnership, under which Nokia is adopting and licensing Windows Phone from Microsoft as its primary smartphone platform, was formalized in April 2011.
The Group is paying Microsoft a software royalty fee to license the Windows Phone smartphone platform, which the Group records as royalty expense in its Smart Devices cost of goods sold. Nokia has a competitive software royalty structure, which includes annual minimum software royalty commitments and reflects the large volumes that the Group expects to ship, as well as a variety of other considerations related to engineering work to which both companies are committed. The Group expects that the adoption of Windows Phone will enable it to reduce significantly its operating expenses.
In recognition of the contributions that the Group is providing, the Group will receive quarterly platform support payments from Microsoft. ([90] In the fourth quarter of 2011, we received the first quarterly payment of USD 250 million (approximately EUR 180 million).) The received platform support payments are recognized over time as a benefit to our Smart Devices cost of goods sold. The total amount of the platform support payments is expected to slightly exceed the total amount of the minimum software royalty commitments.
The Microsoft partnership also recognizes the value of intellectual property and puts in place mechanisms for exchanging intellectual property rights.
[89] We are contributing our expertise on hardware, design and language support to the Microsoft partnership, and plan to bring Nokia products with Windows Phone to a broad range of price points, market segments and geographies. We and Microsoft are closely collaborating on joint marketing initiatives and on a shared development roadmap for the future evolution of mobile products. The goal for both partners is that by bringing together our complementary assets in search, maps, location-based services, e-commerce, social networking, entertainment, unified communications and advertising, we can jointly create an entirely new consumer proposition. We are also collaborating on our developer ecosystem activities to accelerate developer support for the Windows Phone platform on our mobile products. Although Microsoft will continue to license Windows Phone to other mobile manufacturers, the Microsoft partnership allows us to customize the Windows Phone platform with a view to differentiating Nokia smartphones from those of our competitors that also use the Windows Phone platform.
Specific initiatives include the following:
- Contribution of our mapping, navigation, and certain location-based services to the Windows Phone ecosystem. We aim to build innovation on top of the Windows Phone platform in areas such as imaging, while contributing our expertise on hardware design and language support, to help drive the development of the Windows Phone platform. Microsoft will provide Bing search services across our mobile device portfolio and will contribute its strength in productivity tools, advertising, gaming, social media and a variety of other services. We believe that the combination of navigation with advertising and search services will enable better monetization of our navigation assets and create new forms of advertising revenue.
- Joint developer outreach and application sourcing to support the creation of new local and global applications, including making Windows Phone developer registration free for all Nokia developers.
- Planning towards opening a new Nokia-branded global application store that leverages the Windows Marketplace infrastructure. Developers would be able to publish and distribute applications to the hundreds of millions of consumers who use Windows Phone, Symbian and Series 40 devices.
- Contribution of our expertise in operator billing to ensure participants in the Windows Phone ecosystem can take advantage of our billing relationships with 112 operators in 36 markets.
Strategy for the trend: Continued Convergence of the Mobile Communications, Computing, Consumer Electronics and Internet Industries
[90] Value in the mobile handset industry continues to be increasingly driven by the convergence of the mobile communications, computing, consumer electronics and Internet industries. As consumer demand for and interest in smartphones and tablets with access to a range of content have accelerated, new opportunities to create and capture value through innovative new service offerings and user experiences have arisen, with greater emphasis and importance placed on software and ecosystem-driven innovation rather than standalone devices. These opportunities seek to capitalize on various elements of ecosystems such as search services, maps, location-based services, e-commerce, social networking, entertainment, communications and advertising. Capturing these opportunities requires capabilities to manage the increased complexity and to provide an integrated user experience where all these various elements interact seamlessly, either in one device or across multiple devices and electronic products. We expect these new opportunities to continue to emerge in 2012.
We believe that we are well-positioned with our new strategy and partnership with Microsoft, including our collective goal to build a new global mobile ecosystem for smartphones, to capture a number of these opportunities.
In Mobile Phones, we plan to leverage our innovation and strength in growth markets to connect the next billion people to the Internet and information. We also plan to drive third party innovation through working with our partners to engage in building strong, local ecosystems for our feature phones.
Strategy for the trend: Increasing Importance of Competing on an Ecosystem to Ecosystem Basis
[91] The increasing importance of ecosystems is, to a large degree, driven by the convergence trends mentioned above and the implications for the competencies and business model adjustments required for longer-term success. In the market for smartphones, we have seen significant momentum and emphasis on the creation and evolution of new ecosystems around major software platforms, including Apple’s iOS platform and Google’s Android platform, bringing together devices, software, applications and services. A notable recent development has been the increased affordability of devices based on the Android smartphone platform, which has enabled them to compete with a portion of the market that has traditionally been dominated by feature phone offerings. As Android is available free of charge and a significant part of the source code is available as open source software, entry and expansion in the smartphone market has become easier for a number of hardware manufacturers that have chosen to join Android’s ecosystem. Additionally, the success of an ecosystem and its ability to continue to grow may also depend on the support it lends to different kinds of devices. With multiple products available to suit different needs, such as mobile devices, tablets, computers and televisions, there is demand for greater seamless interaction between these devices. A number of vendors across different ecosystems are pursuing multi-screen strategies to capitalize on these opportunities.
Our partnership with Microsoft brings together complementary assets and competencies with the aim of creating a competitive smartphone ecosystem. We believe that together with Microsoft we will succeed in attracting the necessary elements for the creation of a successful ecosystem and that by extending the price points, market segments and geographies of our Windows Phone smartphones, we will be able to significantly strengthen the scale and attractiveness of that ecosystem to developers, operators and partners.
Strategy for the trend: Increased Pervasiveness of Smartphones and Smartphone-like Experiences Across the Price Spectrum
[91] During the past year, we saw the increasing availability of more affordable smartphones, particularly Android-based smartphones, as well as connected devices and related services able to reach lower price points, contributing to a decline in the average selling prices of smartphones in our industry.
This trend affects us in two ways.
First, it puts pressure on the price of our smartphones and potentially our profitability, as we need to price our smartphones competitively. We currently partially address this with our Symbian device offering in specific regions and distribution channels, and we plan to introduce and bring to markets new and more affordable Nokia products with Windows Phone in 2012, such as the Nokia Lumia 610 announced in February 2012.
Second, lower-priced smartphones put pressure on our higher-end feature phone offering from our Mobile Phones unit. We are addressing this with our planned introductions in 2012 of smarter, competitively priced feature phones with more modern user experiences, including software, services and application experiences. In support of our Mobile Phones business, we also plan to drive third party innovation through working with our partners to engage in building strong, local ecosystems.
Strategy for the trend: Increasing Challenges of Achieving Sustained Differentiation and Impact on Overall Industry Gross Margin Trends
[91] Although we expect the mobile device industry to continue to deliver attractive revenue growth prospects, we are less optimistic about the gross margin trends going forward. The creation and momentum of new ecosystems, especially from established Internet players with disruptive business models, has enabled handset vendors that do not have substantial software expertise or investment in software development to develop an increasingly broad and affordable range of smartphones and other connected devices that feature a certain user interface, application development and mobile service ecosystems. At the same time, this has significantly reduced the amount of differentiation in the user experience in the eyes of consumers. Our ability to achieve sustained differentiation with our mobile products is a key driver of consumer retention, net sales growth and margins. We believe that as it becomes increasingly difficult for many of our competitors to achieve sustained differentiation, overall industry gross margin trends may be depressed going forward.
Through our partnership with Microsoft and development of the Windows Phone ecosystem, we will focus more of our investments in areas where we believe we can differentiate and less on areas where we cannot, leveraging the assets and competencies of our ecosystem partners. Areas where we believe we can achieve sustained product differentiation and leadership include distinctive design with compelling hardware, leading camera and other sensor experiences and leading location-based products and services. Other ways for us to differentiate our products include using our localization capabilities, global reach, strong brand and marketing. We believe that our first Lumia devices reflect a number of these new and differentiated experiences on Windows Phone. We expect to continue to introduce new and more differentiated products from our Lumia product family in multiple markets throughout 2012.
In the Mobile Phones business, we believe our competitive advantages – including our scale, brand, quality, manufacturing and logistics, strategic sourcing and partnering, distribution, research and development and software platforms and intellectual property – continue to be important to our competitive position. Additionally, we plan to extend our Mobile Phones offerings and capabilities during 2012 in order to bring a modern mobile experience – software, services and applications – to aspirational consumers in key growth markets as part of our strategy to bring the Internet and information to the next billion people. At the same time, we plan to drive third party innovation through working with our partners to engage in building strong, local ecosystems.
Finally, we believe that we must invest in new projects to drive differentiation and take advantage of future technology disruptions and trends. Through ongoing research and development, we plan to explore and lead next-generation opportunities in devices, platforms and user experiences to support our industry position as well as our ability to further differentiate over the longer-term. For example, new web technologies such as those commonly referred to as HTML5 may lead to less operating system-centric ecosystems. It is important to be able to drive such industry developments, which we believe will define the future of our industry.
Strategy for the trend: Emergence of New Business Models
[92] We believe that the traditional industry monetization model – capturing the value of the overall experience through the sale of a mobile device – will continue to dominate in the near to medium term. However, we are also seeing the emergence of new indirect monetization models where the value is captured through indirect sources of revenue such as advertising revenue through applications rather than the actual sale of a device. These indirect monetization models could become more prominent in our industry in the longer-term. Accordingly, we believe that developing a range of indirect monetization opportunities, such as advertising-based business models, will be part of successful ecosystems over the coming years. Obtaining and analyzing a complex array of customer feedback, information on consumer usage patterns and other personal and consumer data over the largest possible user-base is essential in gaining greater consumer understanding. We believe this understanding is a key element in developing new monetization opportunities and generating new sources of revenue, as well as in facilitating future innovations, including the delivery of new and more relevant user experiences ahead of the competition.
The exploration of new revenue streams is a key element of our partnership with Microsoft. We are jointly developing new services with Microsoft to drive innovation and new sources of revenue from our ecosystem. We believe that our ability to understand the specific needs of different geographic markets and consumer segments and to localize services and applications appropriately will be a key competitive differentiator. To support this, in the coming years we plan to invest in local advertising platforms to further enhance and enrich our localized offerings. Supported by our scale, we believe that we have the opportunity to deliver more compelling and relevant local services and to build new monetization models for Nokia and the Windows Phone ecosystem.
Strategy for the trends in: Supply Chain, Distribution and Operator Relationships
[93] The industry in which we operate is one of the fastest growing and most innovative, with a broad range of industry participants contributing product and technological innovations. In particular, the role of component suppliers has grown in importance. At the same time, much of the value creation for consumers has shifted from hardware to software. Nevertheless, we believe that there continues to be substantial room to innovate in hardware. From that perspective and in order to deliver market-leading innovations and sustainable differentiation through hardware, it is critical to have good relationships with high quality suppliers. With good supplier relationships, allied with the strength of our world-class manufacturing and logistics system, we believe we are well-positioned to deliver high-quality hardware as well as to respond quickly to customer and consumer demand.
Amid rapid change in the industry, we have also seen new sourcing models emerge. Especially in smartphones, our competitors have shifted from traditional multi-sourcing strategies, in which multiple suppliers are used for each component, to more focused sourcing strategies in which they integrate key strategic suppliers more closely into their operations and use advance cash payments to secure supply several quarters in advance, in order to obtain more unique and differentiated components as well as more predictability in their sourcing. This means that we also need to look for new and more innovative ways of sourcing key components, particularly in our Smart Devices business.
Our own manufacturing network continues to be a valuable asset, especially in our high-volume Mobile Phones business. We realized, however, that we need to adjust our manufacturing to meet the lower overall demand for our products and increase our speed to market for our mobile products. In 2011 and in February 2012, we announced our plans to adjust our manufacturing capacity and renew our manufacturing strategy to focus product assembly primarily in Asia to better reflect how our global networks of customers, partners and suppliers have evolved. The changes included the closure of our manufacturing facility in Cluj, Romania at the end of 2011. We also announced planned changes at our facilities in Komárom, Hungary, Reynosa, Mexico and Salo, Finland. These three facilities are planned to focus on smartphone product and sales package customization, serving customers mainly in Europe and the Americas, while our smartphone assembly operations will be transferred to our facilities in Asia – Beijing, China and Masan, South Korea – where the majority of our component suppliers are based. With these adjustments to our manufacturing network, we are aiming to continue to generate meaningful benefits relative to our competitors.
As in any global consumer business, distribution continues to be an important asset in the mobile device industry. We believe the breadth of our global distribution network is one of our key competitive advantages. We have the industry’s largest distribution network with more than 850,000 points of sale globally. Compared to our competitors, we have a substantially larger distribution and care network, particularly in China, India and the Middle East and Africa.
During 2011, the importance of operator-driven distribution increased. Whereas in the past operators dominated distribution only in the large western markets in Europe and the United States, they have recently been growing their share of distribution in large growth markets such as China, a traditionally strong market for us. We have been historically more successful where our mobile products are sold to consumers in open distribution through non-operator parties. It is therefore increasingly important to not only have a large number of points of sale globally, but also to have good relationships with key operators in each region.
Strategically, we want to be the preferred ecosystem partner for operators. By creating a new global mobile ecosystem with Microsoft and focusing on driving operator data plan adoption in lower price points with our feature phone offering, we believe we will be able to create a greater balance for operators and provide attractive opportunities to share the economic benefits from services and applications sales compared to other competing ecosystems, thereby improving our long-standing relationships with operators around the world.
Strategy for the trends related to: Speed of Innovation, Product Development and Execution
[94] As the mobile communications industry continues to undergo significant changes, we believe that speed of innovation and product development are important drivers of competitive strength. For example, a number of our competitors have been able to successfully leverage their software expertise to continuously bring innovations to market at a pace faster than typical hardware cycles. This has placed increasing pressure on all industry participants to continue to shorten product creation cycles and to execute in a timely, effective and consistent manner.
In February 2011, we announced our new strategy, including changes to our operational structure, company leadership, decision-making, ways of working and competencies designed to accelerate our speed of execution in an intensely competitive environment. The changes to our ways of working fall into six categories:
- globally accountable business units;
- a revised services mission;
- local empowerment;
- simplified decision-making;
- a performance-based culture with consistent behavior; and
- a new leadership structure with new leadership principles.
We believe under the new operational structure and with these new ways of working we can deliver noticeable improvements to our speed of innovation, product development and execution of both our Smart Devices and Mobile Phones business units.
Strategy for the trends related to: More Active Licensing Strategies of Patents and Intellectual Property
[94] Success in our industry requires significant research and development investments, with intellectual property rights filed to protect those investments and related inventions. In recent years, we have seen new entrants in the industry as new ecosystems have lowered the barriers to entry. In 2011, we saw intensified and more active licensing and enforcement strategies of patents and intellectual property emerge through a series of legal disputes between several industry participants as patent holders sought to protect their intellectual property against infringements by new entrants. It is not only traditional industry participants that have sought to safeguard their intellectual property; non-manufacturing patent licensing entities owning relevant technology patents have also actively been enforcing their patents against new entrants. These companies’ sole business model is to buy patents from the innovators and to maximize the value from those patents. As a result, the industry’s focus on patents and intellectual property has increased significantly and patent portfolios have become increasingly valuable for industry participants. Increased activity has also created lucrative opportunities to monetize patents by selling them to others. We expect this trend to continue in 2012. We believe we are well-positioned to both protect our existing business as well as generate incremental value to our shareholders through our industry-leading patent portfolio.
We are a world leader in the development of mobile devices and mobile communications technologies, which is also demonstrated by our strong patent position. During the last two decades, we have invested more than EUR 45 billion in research and development and built one of the mobile device industry’s strongest and broadest intellectual property right portfolios, with over 10 000 patent families. In 2011, we continued to work hard to enforce our patents against unlawful infringement and realize the value of our intellectual property. Our 2011 initiatives included, among other things, the signing of a patent license agreement with Apple, which we expect will have a positive financial impact on our future business, as well as capitalizing on strong market conditions by divesting several hundred patent families in a series of transactions to non-manufacturing patent licensing entities. Despite such divestments, we have maintained the strength and size of our patent portfolio on a stable level of approximately 10 000 patent families.
Strategy for the trends related to: Uncertain Global Macroeconomic Environment
We are currently experiencing a time of great global macroeconomic uncertainty. This uncertainty can cause unprecedented and dramatic shifts in consumer behavior, which can have significant effects on the mobile device industry. These effects could include, for example, consumers reducing the amount they are willing to spend on mobile products, which would negatively affect industry average selling prices, or consumers postponing purchases of new products, which would negatively affect device replacement cycles. These types of shifts in consumer behavior could potentially have a material adverse effect on our net sales and profitability in 2012.
While negative to the industry overall, we believe that the impact of any dramatic shifts in consumer behavior could be mitigated to a certain extent by our global distribution network, geographically well diversified supply chain, relatively fragmented customer space and the breadth of our offering, which covers a wide range of price points. Furthermore, during our ongoing transition to Windows Phone as our primary smartphone platform, our financial position has continued to be relatively strong. We continuously monitor the strength of our financial position and assess its adequacy in different net sales and profitability scenarios.
Additionally, we have identified and implemented certain precautionary measures designed to limit the possible immediate direct negative consequences resulting from the potential deterioration of the economic situation within the eurozone.
Restructuring in line with this strategy:
[F-64] In April 2011, Nokia announced plans to reduce its global workforce by about 4 000 employees by the end of 2012, as well as plans to consolidate the company’s research and product development sites so that each site has a clear role and mission. In September 2011, Nokia announced plans to take further actions to align its workforce and operations, which includes reductions in Sales and Marketing and Corporate functions in line with Nokia’s earlier announcement in April 2011. The measures also include the closure of Nokia’s manufacturing facility in Cluj, Romania, which – together with adjustments to supply chain operations – has affected approximately 2 200 employees. As a result, Devices & Services recognized a restructuring provision of EUR 456 million in total.
In 2010, Devices & Services recognized restructuring provisions of EUR 85 million mainly related to changes in Symbian Smartphones and Services organizations as well as certain corporate functions that were expected to result in a reduction of up to 1 800 employees globally.
[96] The factors and trends discussed above influence our net sales and gross profit potential. In addition, operational efficiency and cost control are important factors affecting our profitability and competitiveness. We continuously assess our cost structure and prioritize our investments. Our objective remains to maintain our strong capital structure, focus on profitability and cash flow and invest appropriately to innovate and grow in key strategic areas.
We expect that the adoption of Windows Phone as our primary smartphone platform will enable us to reduce significantly our operating expenses. For example, the Microsoft partnership allows us to eliminate certain research and development investments, particularly in operating systems and services, which we expect will result in lower overall research and development expenditures over the longer-term in our Devices & Services business.
We announced in 2011 that we are targeting to reduce our Devices & Services operating expenses by more than EUR 1 billion for the full year 2013, compared to the Devices & Services operating expenses of EUR 5.35 billion for the full year 2010, excluding special items and purchase price accounting related items.
We have announced a number of planned changes to our operations during 2011 and 2012 in connection with the implementation of our new strategy in our Devices & Services business and the creation of our new Location & Commerce business. The planned changes include substantial personnel reductions, site and facility closures and reconfiguration of certain facilities.
Initially, we announced that we are focusing our restructuring work primarily on the research and development teams to ensure that we correctly allocate resources for the new strategy at appropriate cost levels. In addition, we agreed to outsource our Symbian software development and support activities to Accenture, which resulted in the transfer of approximately 2 300 employees to Accenture.
We later announced that we are accelerating structural change in other parts of the organization in order to ensure that we are responsive to the changing dynamics in our industry. This phase includes the alignment of our markets organization and other supporting functions. For sales, this includes a move to simplify our model based around four regions, twenty areas and additional local offices that serve individual countries or territories.
We also announced plans to adjust our manufacturing capacity and renew our manufacturing strategy to reflect how our global networks of customers, partners and suppliers have evolved, including the closure of our facility in Cluj, Romania, the review of our manufacturing operations in Komárom, Hungary, Reynosa, Mexico and Salo, Finland and the transfer of smartphone assembly operations to Beijing, China and Masan, South Korea.
With respect to combining NAVTEQ and our Devices & Services social location services operations to form our Location & Commerce business, we announced a plan to capture potential synergies and opportunities to increase effectiveness through automation. The planned changes in the Location & Commerce business are estimated to affect approximately 1 300 employees.
Since we outlined our new strategy, we have announced total planned employee reductions of approximately 11 500 employees, as well as the transfer of approximately 2 300 employees to Accenture as noted above.
The planned measures support the execution of our strategy and are expected to bring efficiencies and speed to the organization. In line with our values, we are offering employees affected by the planned reductions a comprehensive support program. We remain committed to supporting employees and the local communities through this difficult change.
As of December 31, 2011, we had recognized cumulative net charges in Devices & Services of EUR 797 million related to restructuring activities in 2011, which included restructuring charges and associated impairments. While the total extent of the restructuring activities is still to be determined, we currently anticipate cumulative charges in Devices & Services of around EUR 900 million before the end of 2012. We also believe total cash outflows related to our Devices & Services restructuring activities will be below the level of the cumulative charges related to these restructuring activities.
In the past, our cost structure has benefited from the cost of components eroding more rapidly than the price of our mobile products. Recently, however, component cost erosion has been generally slowing, a trend that adversely affected our profitability in 2010 and 2011, and may do so in the future.
The currency volatility of the Japanese yen and United States dollar against the euro continued to put pressure on our costs in 2011. During 2011, we were able to manage the currency volatility driven cost pressure with an appropriate level of hedging and by managing our sourcing towards more favorable currencies. Our currency exposure profiles have not changed significantly and continued currency volatility of the Japanese yen and US dollar against the euro may negatively affect us in the future.
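The hedging mechanism described above can be illustrated with a simplified sketch (all rates, amounts and hedge ratios below are hypothetical, not Nokia's actual figures): a forward contract locks in the exchange rate for part of a component bill denominated in yen, so only the unhedged portion is exposed to the currency move.

```python
def hedged_cost_eur(cost_jpy, spot_eur_jpy, forward_eur_jpy, hedge_ratio):
    """Euro cost of a JPY-denominated component bill when a fraction
    `hedge_ratio` was locked in at `forward_eur_jpy` and the remainder
    is settled at the current `spot_eur_jpy` (both quoted as JPY per EUR)."""
    hedged_part = cost_jpy * hedge_ratio / forward_eur_jpy
    unhedged_part = cost_jpy * (1 - hedge_ratio) / spot_eur_jpy
    return hedged_part + unhedged_part

# Hypothetical numbers: a 10 billion JPY component bill, a forward locked
# at 110 JPY/EUR, and the yen then strengthening to 100 JPY/EUR.
unprotected = hedged_cost_eur(10e9, 100.0, 110.0, 0.0)   # fully exposed
protected = hedged_cost_eur(10e9, 100.0, 110.0, 0.8)     # 80% hedged
print(round(unprotected / 1e6, 1), round(protected / 1e6, 1))  # → 100.0 92.7
```

The 80% hedge dampens, but does not eliminate, the cost increase caused by the stronger yen, which is the sense in which an "appropriate level of hedging" manages rather than removes the exposure.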
Location & Commerce:
[97] Our Location & Commerce business aims to positively differentiate its digital map data and location-based offerings from those of our competitors and create competitive business models for our customers.
In the fourth quarter 2011, we conducted our annual impairment testing to assess if events or changes in circumstances indicated that the carrying amount of our goodwill may not be recoverable. As a result, we recorded a charge to operating profit of EUR 1.1 billion for the impairment of goodwill in our Location & Commerce business. The impairment charge was the result of an evaluation of the projected financial performance of our Location & Commerce business. This took into consideration the market dynamics in digital map data and related location-based content markets, including our estimate of the market moving long-term from fee-based towards advertising-based models especially in some more mature markets. It also reflected recently announced results and related competitive factors in the local search and advertising market resulting in lower estimated growth prospects from our location-based assets integrated with different advertising platforms. After consideration of all relevant factors, we reduced the net sales projections for Location & Commerce which, in turn, reduced projected profitability and cash flows.
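The mechanics of a goodwill impairment test like the one described can be sketched as follows (a simplified illustration of the IAS 36 logic with hypothetical figures, not Nokia's actual model): the recoverable amount is estimated from discounted projected cash flows, and a charge is recognized only to the extent the carrying amount exceeds it.

```python
def recoverable_amount(cash_flows, discount_rate):
    """Value in use: discounted sum of projected annual cash flows."""
    return sum(cf / (1 + discount_rate) ** (t + 1)
               for t, cf in enumerate(cash_flows))

def impairment_charge(carrying_amount, cash_flows, discount_rate):
    """Charge = excess of carrying amount over recoverable amount (never negative)."""
    return max(0.0, carrying_amount - recoverable_amount(cash_flows, discount_rate))

# Hypothetical: a EUR 5.0bn carrying amount, tested after lowered net sales
# projections reduced the projected cash flows (figures in EUR millions).
charge = impairment_charge(5_000.0, [400.0, 420.0, 440.0, 460.0, 480.0], 0.10)
```

This is why reduced net sales projections flow directly into an impairment: lower projected cash flows shrink the recoverable amount, and the gap to the carrying amount becomes the charge.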
Location & Commerce’s resources are primarily focused on the development of:
(i) content, which involves the mapping of the physical world and places such as roads and points of interest, as well as the collection of activity data generated and authorized for use by our users;
(ii) the platform, which adds functionality on top of the content and includes the development tools for us and others to create on top of it; and
(iii) applications built on the content and platform.
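The three-layer split above can be sketched as a minimal data model (class names, fields and methods are illustrative, not an actual Nokia or NAVTEQ API):

```python
from dataclasses import dataclass, field

@dataclass
class Content:
    """Layer (i): the mapped physical world plus authorized activity data."""
    roads: list = field(default_factory=list)
    points_of_interest: list = field(default_factory=list)

@dataclass
class Platform:
    """Layer (ii): functionality and developer tooling on top of the content."""
    content: Content

    def search_poi(self, name):
        return [p for p in self.content.points_of_interest
                if name.lower() in p.lower()]

@dataclass
class Application:
    """Layer (iii): an end-user application built on the platform."""
    platform: Platform

    def find(self, query):
        return self.platform.search_poi(query)

content = Content(points_of_interest=["Berlin Hauptbahnhof", "Boston Common"])
app = Application(Platform(content))
print(app.find("boston"))  # → ['Boston Common']
```

The point of the layering is that both Nokia and third parties build at layer (iii) against a stable layer (ii), while the expensive asset, layer (i), is maintained once.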
Our Devices & Services business is a key customer of Location & Commerce. Devices & Services purchases map and application licenses from Location & Commerce for its Nokia Maps service sold in combination with GPS enabled smartphones.
Competition:
[61] With respect to digital map data and related location-based content, several global and local companies, as well as governmental and quasi-governmental agencies, are making more map data with improving coverage and content, and high quality, available free of charge or at lower prices. For example, our Location & Commerce business competes with Google which uses an advertising-based model allowing consumers to use its map data and related services in their products free of charge. Google has continued to leverage Google Maps as a differentiator for Android, bringing certain new features and functionality to that platform. Apple has also sought to strengthen its location assets and capabilities through targeted acquisitions and organic growth.
Location & Commerce also competes with companies such as TomTom, which licenses its map data and where competition is focused on the quality of the map data and pricing, and Open Street Map, which is a community-generated open source map available to users free of charge. Aerial, satellite and other location-based imagery is also becoming increasingly available and competitors are offering location-based products and services with the map data to both business customers and consumers in order to differentiate their offerings.
Strategy for the trend: Location-Based Products and Services Proliferation
[97] A substantial majority of Location & Commerce net sales in 2011 came from the licensing of digital map data and related location-based content and services for use in mobile devices, in-vehicle navigation systems, Internet applications, geographical information system applications and other location-based products and services. Location & Commerce’s success depends upon the rate at which consumers and businesses use location-based products and services. In recent years, there has been a strong increase in the availability of such products and services, particularly in mobile devices and online application stores for such devices. Furthermore, as the use of the Internet through mobile devices has been growing rapidly, the anchor of the Internet is moving from the desktop to mobile devices. This shift is making location-based content a key element of most Internet experiences. We expect this trend to continue, but we also expect that the level of quality required for these products and services and the ability to charge license fees for the use of map data incorporated into such products and services may vary significantly. By combining our NAVTEQ business with our Devices & Services social location services operations, we believe our Location & Commerce business will be better positioned to capture emerging business opportunities with a broader offering which is no longer limited to digital map data.

Strategy for the trend: Increasing Importance of Creating an Ecosystem around Location-Based Services Offering
[97] Creating a winning ecosystem around our Location & Commerce services offering will be critical for the success of this business. The longer-term success of the Location & Commerce business will be determined by our ability to attract strategic partners and developers to support our ecosystem. Location & Commerce is aiming to support its ecosystem by enabling strategic partners and independent developers to foster innovation on top of its location platform. We believe that making it possible for other vendors to innovate on top of Location & Commerce’s high quality location-based assets will further strengthen the overall experience and make our offering stronger and more attractive.
Strategy for the trend: Emergence of the Intelligent Sensor Network
[98] Mobile Internet devices are increasingly being enabled with a rich set of sensors such as a GPS, a camera and an accelerometer which enable interaction with the real world. This interaction also enables the collection of large volumes of rich data which, when combined with analytics, enable the development of increasingly sophisticated, contextually-aware devices and services. We believe the combination of NAVTEQ with our Devices & Services social location services operations will enable Location & Commerce to participate in this industry development and seize new opportunities to deliver new experiences that bridge the virtual with the real world.
Strategy for the trend: Price Pressure for Navigable Map Data Increasing
[98] Location & Commerce’s net sales are also affected by the highly competitive pricing environment. Google is offering turn-by-turn navigation in many countries to its business customers and consumers on certain mobile handsets at no charge to the consumer. While we expect these offerings will increase the adoption of location-based services in the mobile handset industry, we also expect they may lead to additional price pressure from Location & Commerce’s business customers, including handset manufacturers, navigation application developers, wireless carriers and personal navigation device (“PND”) manufacturers, which are seeking ways to offer lower-cost or free turn-by-turn navigation to consumers. Turn-by-turn navigation solutions that are free to consumers on mobile devices may also put pressure on automotive OEMs and automotive navigation system manufacturers to have lower cost navigation alternatives. This price pressure is expected to result in an increased focus on advertising revenue as a way to supplement or replace license fees for map data.
In response to the pricing pressure, Location & Commerce focuses on offering a digital map database with superior quality, detail and coverage; providing value-added services to its customers such as distribution and technical services; enhancing and extending its product offering by adding additional content to its map database, such as 3D landmarks; and providing business customers with alternative business models that are less onerous to the business customer than those provided by competitors. Location & Commerce’s future results will also depend on Location & Commerce’s ability to adapt its business models to generate increasing amounts of advertising revenues from its map and other location-based content.
We believe that Location & Commerce’s PND customers will continue to face competitive pressure from smartphones and other mobile devices that now offer navigation, but that PNDs continue to offer a viable option for consumers based on the functionality, user interface, quality and overall ease of use.
Strategy for the trend: Quality and Richness of Location-Based Content and Services Will Continue to Increase
[98] Location & Commerce’s profitability is also driven by expenses related to the development and expansion of its database. Location & Commerce’s development costs are comprised primarily of the purchase and licensing of source maps, employee compensation and third-party fees related to the construction, maintenance and delivery of its database.
In order to remain competitive and notwithstanding the price pressure discussed above, Location & Commerce will need to continue to expand the geographic scope of its map data, maintain the quality of its existing map data and add an increasing amount of new location-based content and services, as well as using innovative ways like crowd sourcing to collect data. The trends for such location-based content and services include real-time updates to location information, more dynamic information, such as traffic, weather, events and parking availability, and imagery consistent with the real world. We expect that these requirements will cause Location & Commerce’s map development expenses to continue to grow, although a number of productivity initiatives are underway designed to improve the efficiency of our database collection processing and delivery. In addition, we will need to continue making investments in this fast paced and innovative location-based content and services industry, for instance through research and development, licensing arrangements, acquiring businesses and technologies, recruiting specialized expertise and partnering with third parties.
Restructuring in line with the above:
[F-64] In September 2011, Nokia announced a plan to concentrate the development efforts of the Location & Commerce business in Berlin, Germany and Boston and Chicago in the U.S., and other supporting sites and plans to close its operations in Bonn, Germany and Malvern, U.S. As a result, Location & Commerce recognized a restructuring provision of EUR 25 million.
Nokia Siemens Networks:
[99] Nokia Siemens Networks has a broad portfolio of products and services designed to address the evolving needs of network operators from GSM to LTE wireless standards, a base of over 600 customers in over 150 countries serving over 2.5 billion subscribers and one of the largest services organizations in the telecommunications infrastructure industry. The company’s global customer base includes network operators such as Bharti Airtel, China Mobile, Deutsche Telekom, France Telecom, Softbank, Telefonica O2, Verizon and Vodafone.
Geographical diversity provides Nokia Siemens Networks with opportunities in both emerging markets, which may experience rapid growth, and developed markets where it believes its technologically advanced products and services portfolio provides a competitive advantage, while the geographic diversity of its customer base reduces exposure to fluctuating economic conditions in individual markets.
Nokia Siemens Networks’ net sales depend on various developments in the global telecommunications infrastructure and related services market, such as network operator investments, the pricing environment and product mix. In developed markets, operator investments are primarily driven by capacity and coverage upgrades, which, in turn, are driven by greater usage of the networks primarily through the rapid growth in data usage. Those operators are targeting investments in technology and services that allow them to provide end users with fast and faultless network performance in the most efficient manner possible, allowing them to optimize their investment. Such developments are facilitated by the evolution of network technologies that promote greater efficiency and flexibility.
In addition, those operators are increasingly investing in software and services that provide them with the means to better manage end users on their network, and also allow them additional access to the value of the large amounts of subscriber data under their control. In emerging markets, the principal factors influencing operator investments are the continued growth in customer demand for telecommunications services, including data, as well as new subscriber growth. In many emerging markets, this continues to drive growth in network coverage and capacity requirements.
The telecommunications infrastructure market is characterized by intense competition and price erosion caused in part by the entry into the market of vendors from China, Huawei and ZTE, which have gained market share by leveraging their low cost advantage in tenders for customer contracts. In recent years, the technological capabilities of those vendors, particularly Huawei, have improved significantly, resulting in competition not only on price but also on quality.
The pricing environment remained intense in 2011. In particular, the wave of network modernization that has taken place, particularly in Europe but increasingly in other regions including Asia Pacific, has experienced some aggressive pricing as all vendors fight for market share.
Nokia Siemens Networks’ net sales are impacted by those pricing developments, which show some regional variation, and in particular by the balance between sales in developed and emerging markets. While price erosion is evident across most geographical markets, it continues to be particularly intense in a number of emerging markets where many operator customers have been subject to financial pressure, both through lack of availability of financing facilities during 2011 as well as profound pricing pressure in their domestic markets.
Pricing pressure is evident in the traditional products markets, in particular, where competitors may have products with similar technological capabilities, leading to commoditization in some areas. Nokia Siemens Networks’ ability to compete in those markets is determined by its ability to remain price competitive with its industry peers and it is therefore important for Nokia Siemens Networks to continue to reduce product costs to keep pace with price attrition. Nokia Siemens Networks continued to make progress in reducing product and procurement costs in 2011, and will need to continue to do so in order to provide its customers with high-quality products at competitive prices. There is currently less pricing sensitivity in the managed services market, where vendor selections are often largely determined by the level of trust and demonstrated capability in the field.
In November 2011, Nokia Siemens Networks articulated its regional strategy, identifying three markets, Japan, Korea and the United States, as its priority countries where it will target growth. The Middle East and Africa, where political, financial and competitive pressures have led to particular weakness in 2011, will be the focus of turnaround efforts. In the remaining regions, Latin America, China, Asia-Pacific, Canada and Europe, Nokia Siemens Networks’ goal will be to defend market share and find areas for future profitable growth.
Over recent years, the telecommunications infrastructure industry has entered a more mature phase characterized by the completion of the greenfield roll-outs of mobile and fixed network infrastructure across many markets, although this is further advanced in developed markets. Despite this, there is still a significant market for traditional network infrastructure products to meet coverage and capacity requirements, even as older technologies such as 2G are supplanted by 3G and LTE. As growth in traditional network products sales slows, there is an emphasis on the provision of network upgrades, often through software, as well as applications, such as billing, charging and subscriber management, and services, particularly the outsourcing of non-core activities to companies such as Nokia Siemens Networks.
The competitive landscape for this market is as follows:
[70] Conditions in the market for mobile and fixed network infrastructure and related services improved, but remained challenging and intensely competitive in 2011. The market continued to be characterized by mixed trends as growth in mobile broadband and services was offset by equipment price erosion, a maturing of legacy industry technology and intense price competition.
Industry participants have changed significantly in recent years. Substantial industry consolidation occurred in 2007 with the emergence of three major European vendors: Alcatel-Lucent, Ericsson and Nokia Siemens Networks. The break-up of Nortel occurred in 2009 when it entered bankruptcy protection and many parts of the business were sold, including the wireless carrier unit, Metro Ethernet Networks, and its GSM business. In January 2011, Motorola Solutions completed its separation from Motorola Mobility Holdings Inc. In April 2011, Nokia Siemens Networks acquired the majority of Motorola Solutions’ wireless network infrastructure assets.
During 2011, the competitive environment in the telecommunications infrastructure market was characterized by continued overall growth in global network operators’ capital expenditures in Euro terms, mainly attributable to the Japanese, Chinese, APAC, North East Europe and Latin American markets. Growth in capital expenditures declined in the Middle East and remained relatively unchanged in the European and North American markets in Euro terms in 2011. Increased smartphone usage drove increased investments in the United States and European wireless markets. The vendors from China, Huawei and ZTE, continued to grow their market share but at a slower pace than in previous years and continued to challenge Alcatel-Lucent, Ericsson and Nokia Siemens Networks. Nokia Siemens Networks’ ability to compete with low-cost vendors primarily depends on its ability to be price competitive and, in certain circumstances, its ability to provide or facilitate vendor financing. In recent years, the technological capabilities of the Chinese vendors, particularly Huawei, have improved significantly, resulting in competition not only on price but also on quality. In addition to the major infrastructure providers, Nokia Siemens Networks also competes with Cisco and NEC.
In the Networks Systems business, the decline of 2G (GSM, CDMA) continued in 2011, whereas investments in 3G continued and increased worldwide. Also, fourth generation (4G) LTE trials and pilots continued strongly as operators continued to migrate towards next generation LTE and all-IP networks. Within the LTE segment, leading vendors are competing based on factors including technology innovation, network topology and less complex network architectures as well as integration towards all-IP networks.
Growth in wireline and wireless broadband services sped up optical and wireless network upgrades in developed markets. In addition, the related investment in mobile backhaul networks continued to increase due to data traffic increases in the operator networks.
In services, which remained the fastest growing part of the industry, competition is generally based on a vendor’s ability to identify and solve customer problems rather than their ability to supply equipment at a competitive price. Competition in services is from both traditional vendors such as Alcatel-Lucent, Ericsson and Huawei, as well as non-traditional telecommunications entities and system integrators, such as Accenture and IBM. In addition to these companies, there are also local service companies competing, which have a narrower scope in terms of served regions and business areas.
Nokia Siemens Networks’ Business Solutions business unit assists network operators in transforming their business, processes and systems to enhance the customer experience, drive new revenue and improve operational efficiency to enable them to successfully address the challenges and opportunities of mobile broadband, smartphones, tablet computers, multi-play offerings, service innovation and new growth areas. In this area, Nokia Siemens Networks faces competition also from information technology and software businesses like Accenture, Amdocs, HP, IBM and Oracle, which are active in areas such as the service delivery platform market and business insight and analysis services.
Certain competitors may receive governmental support allowing them to offer products and services at substantially lower prices. Further, in many regions restricted access to capital has caused network operators to reduce capital expenditure and has produced a stronger demand for vendor financing. Certain of Nokia Siemens Networks’ competitors may have stronger customer financing possibilities due to internal policies or government support. While the amount of financing Nokia Siemens Networks provided directly to its customers in 2011 remained at approximately the same level as in 2010, as a strategic market requirement it plans to offer this financing option only to a limited number of customers and primarily to arrange and facilitate such financing with the support of export credit or guarantee agencies.
Strategy for the trends in: Mobility and Data Usage
[100] Over recent years the two most evident trends in the telecommunications market – the rise in use of mobile services and the exponential increase in data traffic – have converged. One result is that services once regarded as available primarily, if not exclusively, through fixed or wireline network are increasingly in demand from wireless networks also.
Alongside traditional voice and data services, such as text messaging, end-users access a wealth of media services through communications networks, including email and other business data; entertainment services, including games and music; visual media, including high definition films and television programming; and social media sites. End-users increasingly expect such services to be available to them everywhere, through both mobile and fixed networks, and a wealth of new devices optimized to allow them to do so has become available, including tablet computers, highly sophisticated multimedia smartphones, mobile broadband data dongles, set-top boxes and mobile and fixed-line telephones.
The widespread availability of devices has been matched by a proliferation of products and services in the market that both meet and feed end-user demand. These continue to drive dramatic increases in data traffic and signaling through both mobile access and transport networks that carry the potential to cause network congestion and complexity. During 2011, this increase continued to gain momentum as more users moved towards smartphones and tablets and even more devices that require constant connectivity were introduced to the market.
While the growth in traffic is clear, it has not been met by corresponding growth in operators’ revenues from data traffic, where growth appears to be slowing. This presents operators with a challenge: to cope with the growing traffic load within networks, it is fundamental that operators continue to invest in their networks, but within the financial constraints that their current business models dictate.
This means that while the addition of capacity, speed and coverage is crucial, it is critical that networks are built efficiently and effectively in a manner that optimizes capital investment and delivers networks with architecture sufficiently flexible to cope with evolving requirements.
During 2011, Nokia Siemens Networks recognized the centrality of mobile networks to the future development of telecommunications and announced that it would place mobile broadband at the heart of its strategy, articulating an ambition to provide the world’s most efficient mobile networks, the intelligence to maximize the value of those networks and the services capability to make all elements work together seamlessly. Nokia Siemens Networks said it expected to increase investment in mobile broadband.
Also during 2011, Nokia Siemens Networks launched the network architecture designed to equip operators to meet the challenges they are facing. “Liquid Net” architecture provides flexibility across networks to adapt to changing customer needs instantly, using existing resources more efficiently. This optimizes capital investment and allows operators to seek new revenue opportunities. Liquid Net uses automated, self-adapting broadband optimization to remain constantly aware of the network’s operational status, as well as the services and content being consumed, to ensure the best user experience. Liquid Net consists of three areas: Liquid Radio, Liquid Core and Liquid Transport.
Strategy for the trends in: Managed Services and Outsourcing
[101] There has been an acceleration in the development of the managed services market as operators increasingly look to outsource network management to infrastructure vendors. The primary driver for this trend is that managed services providers are able to offer economies of scale in network management that allow the vendor to manage such contracts profitably while operators can reduce the cost of network management. The outsourcing trend is also underpinned by many operators taking the view that network management is no longer either a core competence or requirement of their business and are increasingly confident they can find greater expertise by outsourcing this activity to a trusted partner that can also improve quality and reliability in the network.
Nokia Siemens Networks believes that this trend will continue and that it could in future be driven by financial imperatives of its customers facing slowing revenue growth but a continuing requirement for capital investment in their networks, a dynamic that has the potential to threaten their profitability levels. This results in some operators aiming to control their operating expenditure. In those circumstances, the outsourcing of the management of their network to infrastructure vendors, such as Nokia Siemens Networks, can be an attractive option.
In emerging markets, such as Africa and India, price pressure and competition in the end-user market has increased the financial pressure on many operators, which in turn has resulted in a similar trend as operators have looked to control and cut costs through outsourcing network management.
The trend towards network management outsourcing is evident in every region of the world and has intensified. Nokia Siemens Networks believes that this trend generates its own momentum in the market as vendors can increasingly demonstrate their capabilities with reference accounts and operators are exposed to their competitors taking steps that can enhance profitability and improve network quality and reliability.
In the announcement of its new strategy in November 2011, Nokia Siemens Networks reaffirmed its commitment to services, and will continue to aim to support mobile operators with high end services and will seek to maximize the potential of its global delivery model, with its global network solution centers in Portugal and India which offer the benefits of scale and efficiencies both to Nokia Siemens Networks and its customers.
Strategy for the trends in: Customer Experience Management
As operators in many markets see the growth of net new subscribers slowing or even stopping, they are increasingly focused on leveraging the value of the subscribers they have. As the acquisition of new subscribers to networks in such markets can be both difficult and expensive, customers look to limit “churn”, where end users transfer to a rival service provider, as well as to increase the revenue derived from each user through the addition of value-added services, such as access to media and entertainment and social networking services. This often requires that operators invest in software and solutions that allow customers to enjoy an improved experience. One of the key foundations for this improved end-user experience is understanding an end user’s behavior and preferences, which in turn allows the operator to tailor service offerings to the individual consumer. This not only includes services and applications, but also bespoke billing platforms and identity management solutions.
Nokia Siemens Networks continues to develop and enhance its offerings in this area, and in November 2011 announced that its Customer Experience Management unit would be a lead business area in its new strategy. Nokia Siemens Networks believes it has the industry’s leading subscriber database management platform, complemented by flexible billing and charging platforms and other software and solutions that provide its customers with the tools, flexibility and agility required to respond to a rapidly changing end-user market. Nokia Siemens Networks also provides business process and consulting services that help to lead its customers through business transformation opportunities.
Strategy related to: Motorola Solutions Acquisition
[102] In April 2011, Nokia Siemens Networks acquired the majority of the wireless network infrastructure assets of Motorola Solutions for a total consideration of EUR 642 million. The acquisition increased Nokia Siemens Networks’ global presence and expanded its position and product offerings in key markets. See Item 4B. “Business Overview – Nokia Siemens Networks – Motorola Solutions Acquisition.”
Transition to a: New Strategy and [the corresponding] Restructuring Program
[103] Nokia Siemens Networks’ focus is on becoming the strongest, most innovative and highest quality mobile broadband and services business in the world. Rather than targeting the full spectrum of telecommunications equipment and services, Nokia Siemens Networks is the first of the telecommunications companies to refocus on providing the most efficient mobile networks, the intelligence that maximizes the value of those networks and the services that make it all work seamlessly.
In November 2011, Nokia Siemens Networks announced a new strategy, including changes to its organizational structure and an extensive restructuring program, aimed at maintaining and developing Nokia Siemens Networks’ position as one of the leaders in mobile broadband and services and improving its competitiveness and profitability. Nokia Siemens Networks expects substantial charges related to this restructuring program in 2012. See Item 4B. “Business Overview—Nokia Siemens Networks—New Strategy and Restructuring Program” for a description of the main elements of the new strategy.
Year 2012 will be a year of transition for Nokia Siemens Networks as it implements its new strategy and restructuring program. Accordingly, Nokia and Nokia Siemens Networks believe it is currently not appropriate to provide annual targets for Nokia Siemens Networks for 2012. Additionally, the macroeconomic environment is making it increasingly difficult to estimate the outlook for 2012.
Longer-term, Nokia and Nokia Siemens Networks target Nokia Siemens Networks’ operating margin to be between 5% and 10%, excluding special items and purchase price accounting related items.
Nokia Siemens Networks targets to reduce its annualized operating expenses and production overheads, excluding special items and purchase price accounting related items, by EUR 1 billion by the end of 2013, compared to the end of 2011. While these savings are expected to come largely from organizational streamlining, the company will also target areas such as real estate, information technology, product and service procurement costs, overall general and administrative expenses and a significant reduction of suppliers in order to further lower costs and improve quality.
Nokia Siemens Networks plans to reduce its global workforce by approximately 17 000 by the end of 2013. These planned reductions are designed to align the company’s workforce with its new strategy as part of a range of productivity and efficiency measures. These planned measures are expected to include elimination of the company’s matrix organizational structure, site consolidation, transfer of activities to global delivery centers, consolidation of certain central functions, cost synergies from the integration of Motorola’s wireless assets, efficiencies in service operations and company-wide process simplification.
Nokia Siemens Networks has begun the process of engaging with employee representatives in accordance with country-specific legal requirements to find socially responsible means to address these reduction needs. Nokia Siemens Networks will continue to share information in affected countries as the process proceeds. In order to reduce the impact of the planned reductions, Nokia Siemens Networks intends to launch locally led programs at the most affected sites to provide re-training and re-employment support.
MWC 2012 day 1 news [Feb 27, 2012]: Samsung and Nokia
February 28, 2012 3:21 pm / 2 Comments on MWC 2012 day 1 news [Feb 27, 2012]: Samsung and Nokia
Samsung had a number of enhanced GALAXY products (see them in the “Details for Samsung” section below). The really strong message from an innovation point of view, however, has been what Samsung itself considers its “hidden gems”:
Samsung Mobile – Beyond Product [SAMSUNGmobile YouTube Channel]
Tour the Samsung Mobile booth at Mobile World Congress 2012 in Barcelona. Find out more about our new innovations, from AllShare Play and Control through Smart Driving and Smart School to NFC mobile payments.
UPDATE: for Nokia the major competition is the overall Android ecosystem, and not only in the proper smartphone market as:
– repeatedly stressed by Stephen Elop, the CEO of Nokia:
Our number-one focus is competing with Android. [see here and here]
The principal competition is Android, and then Apple. [see here]
– indicated in relevant excerpts from the Nokia 2011 fiscal year report [March 8, 2012] as:
Market overview
… Today, however, the distinction between these two classes of products is blurring. Increasingly, basic feature phone models, supported by innovations in both hardware and software, are also providing people with the opportunity to access the Internet and applications and, on the whole, offering them a more smartphone-like experience.
Whether smartphones or feature phones, mobile devices geared for Internet access and their accompanying Internet data plans are also becoming increasingly affordable and, consequently, they are becoming attractive to a broader range of consumer groups and geographic markets. A notable recent development has been the increased affordability of devices based on the Android platform, which has enabled some vendors to offer smartphones for below EUR 100, excluding taxes and subsidies, and thus address a portion of the market which has been dominated by more basic feature phone offerings.
….
Competition
… some competitors’ offerings based on Android are available for purchase by consumers for below EUR 100, excluding taxes and subsidies, and thus address a portion of the market which has been traditionally dominated by feature phone offerings, including those offered by Nokia. Accordingly, lower-priced smartphones are increasingly reducing the addressable market and lowering the price points for feature phones. …
Principal Factors & Trends Affecting our Results of Operations
Devices & Services
…
Increased Pervasiveness of Smartphones and Smartphone-like Experiences Across the Price Spectrum
During the past year, we saw the increasing availability of more affordable smartphones, particularly Android-based smartphones, connected devices and related services which were able to reach lower price points, contributing to a decline in the average selling prices of smartphones in our industry.
This trend affects us in two ways. First, it puts pressure on the price of our smartphones and potentially our profitability, as we need to price our smartphones competitively. We currently partially address this with our Symbian device offering in specific regions and distribution channels, and we plan to introduce and bring to markets new and more affordable Nokia products with Windows Phone in 2012, such as the Nokia Lumia 610 announced in February 2012. Second, lower-priced smartphones put pressure on our higher-end feature phone offering from our Mobile Phones unit. We are addressing this with our planned introductions in 2012 of smarter, competitively priced feature phones with more modern user experiences, including software, services and application experiences. In support of our Mobile Phones business, we also plan to drive third party innovation through working with our partners to engage in building strong, local ecosystems.
…
Full information is in the Nokia’s strategy for “the next billion” based on software and web optimization with super low-cost 2.5/2.75G SoCs [Feb 14 – March 8, 2012] post on this blog.
END OF UPDATE
Nokia, accordingly, has already introduced a number of innovations at MWC 2012, from the hardware level up to the services that surround it all. So for Nokia I will provide a video-based overview here, well before the “Details for Nokia” section at the very end:
Nokia Press Conference Highlights from MWC 2012 [nokia YouTube channel]
Key points: Nokia Lumia 610 is announced. Award-winning Nokia Lumia 900 will become available in various markets outside the US. Nokia PureView elevates industry standard in smartphone imaging. New Asha feature phones and services grow increasingly ‘smarter’.
Nokia Lumia 610 Hands-On Video [nokia YouTube channel]
The funky Nokia Lumia 610 http://nokia.ly/AztJvZ is the most affordable Lumia phone yet, but it delivers everything you need in a smartphone. The People Hub pulls family and friends’ contact details in one place, along with Facebook and Twitter feeds. A choice of colours, with metallic trim, makes the phone an individual style statement. [$254 (€189). Has a 3.7” 800 x 480 WVGA LCD display.]
The Windows Phone Xbox tie-in and 5-megapixel camera add to the funky package. And Nokia Music, with Mix Radio (availability may vary by market), Nokia Maps, Nokia Drive, Nokia Transport and Nokia Reading – make this phone unbeatable value.
UPDATE: the Nokia Lumia 610 won Tom’s Hardware Best in show and Best Budget Smartphone from Laptop. See here.
Introducing the White Nokia Lumia 900 – Live Large [nokia YouTube channel]
Meet the new Nokia Lumia 900 with Windows Phone http://nokia.ly/zoyq6L Find out how fast amazing can be. And social. And beautiful. With its award winning design including front facing camera and Live Tiles, keeping in touch with friends, and the entire Internet, has never been so easy. [$645 (€480). Has a 4.3” 800 x 480 WVGA AMOLED ClearBlack display with Gorilla Glass.]
Experience The Amazing Everyday.
First Look at Nokia Reading on Nokia Lumia [nokia YouTube channel]
In this hands on video, Rhidian from Nokia talks about Nokia Reading, a premium e-book and audio experience service announced at Mobile World Congress 2012, and shows how it works on Nokia Lumia.
Nokia Reading will be available for Nokia Lumia handsets from April and will first launch in six markets (UK, France, Germany, Italy, Spain and Russia) with more to follow.
UPDATE: Nokia Reading: Get gripped by a great book [Nokia Conversations blog, Feb 28, 2012]
Nokia Reading follows the same simple and elegant panorama design we’ve become used to with other services, delivering the whole experience through a beautifully designed “reading hub.”
Nokia is working with some of the world’s biggest publishers, including Penguin, Hachette and Pearson, to launch a world class e-book and audiobook experience that’s been designed specifically for the Nokia Lumia.
Using a single, simple app you can choose your own favourite authors, or select bestselling novels and the top local books in your own language. If you’re not sure that you’ll like a book, Nokia Reading lets you browse some sample pages before you buy. Or you can download and read one of the thousands of classic works of literature that will be available for free.
Once you have chosen a book, large, clear smartphone screens like those on the Nokia Lumia make reading an enjoyable experience – and you can switch to ‘night mode,’ change the font or adjust brightness, if your eyes get tired in the evening. It’s also great on an underground train or plane, because you can read everything offline after downloading beforehand over WiFi or mobile network.
In coming months you’ll also be able to create a personalized magazine page (called “news stream”) that updates content across the most popular categories, and adds web content from your chosen sites.
Nokia 808 PureView – The next breakthrough in photography [nokia YouTube channel]
The game changer! Nokia 808 PureView http://nokia.ly/xz6mhS takes every bit of image goodness captured by a 41MP sensor and Carl Zeiss lens and turns it into beautifully detailed images and Full HD videos. Be ready to shoot and share with friends in an instant. [$605 (€450). Has a 4” 640 x 360 16:9 nHD AMOLED display.]
The Nokia 808 PureView also features exclusive Dolby Headphone technology, transforming stereo content into a personal surround sound experience over any headphones and Dolby Digital Plus for 5.1 channel surround sound playback.
UPDATE: Zooming in on Nokia PureView [article on the Nokia Conversations blog, Feb 29, 2012]: meet the brains behind Nokia PureView, Eero Salmelin and Juha Alakarhu, and also learn the history of the five-year journey that led to the delivery at MWC 2012
UPDATE: Nokia 808 PureView partner makes it unbeatable [Nokia Conversations blog, March 1, 2012]
Dolby reveals audio secret of new phone’s success
…
Taking pride of place at their stand, the world’s best camera phone owes much to Dolby technologies for helping to make it an HD mobile entertainment device.
For the PureView is also about pure audio thanks to its high-definition Dolby Digital Plus 5.1-channel surround sound which plays on HD TVs, and home theatre systems, and when combined with Dolby Headphone technology – also built into the PureView – provides a personal 5.1 surround experience over any headphones.
Nokia is also bringing the Dolby experience to other smartphones with Nokia Belle Feature Pack 1 software upgrade for the Nokia 700, Nokia 701, and Nokia 603, also displayed on the Dolby stand.
Mobile Sales Director Shawn Richards talked us through the tech on a Nokia 700 with a demo from Batman movie The Dark Knight.
He explained that the Dolby Headphone upgrade transforms stereo content into a personal surround sound.
“You get a more natural, engaging, and authentic sound,” he said. “Good audio is even more important when you are watching a movie on a small screen. And Dolby Headphone creates a totally immersive feel.”
…
UPDATE: Nokia 808 Pureview – Best New Mobile Handset, Device or Tablet at Mobile World Congress 2012 [nokia YouTube channel, March 1, 2012]
Nokia 808 PureView wins top MWC award!
Our awesome camera phone scoops the top award from Mobile World Congress 2012 judges.
UPDATE: Damian Dinning explains Nokia PureView technology [nokia YouTube channel, Feb 29, 2012]
Nokia’s imaging expert Damian Dinning explains the breakthrough camera technology behind Nokia 808 PureView.
You could also check out the gorgeous photos taken with Nokia 808 PureView from the flickr.
UPDATE: Nokia PureView Q&A with Damian Dinning [interview on the Nokia Conversations blog, March 1, 2012]
Nokia Stereo Bluetooth Headset BH-221 – See what you hear [nokia YouTube channel]
The new Nokia Stereo Bluetooth Headset BH-221 comes with an integrated FM radio and OLED display. It has excellent audio quality and NFC for easy pairing with your phone. Learn more at: www.accessories.nokia.com
Nokia Asha 302: Meet the designer [nokia YouTube channel]
Nokia Asha 302 http://nokia.ly/xXK4kV was designed with one simple goal in mind – to design the best looking QWERTY phone for today’s urban professionals. The metallic touch points, bold and sophisticated colors and smooth edges help users stand out and project success giving the phone a great premium feel. [$128 (€95). Has a 2.4” 320 x 240 QVGA TFT display.]
UPDATE: The Nokia C3-00 won Best Feature Phone or Entry Level Phone at the GSMA Awards 2012 in Barcelona. Blanca Juti, VP for Mobile Phones Product Marketing said to Nokia Conversations after collecting the prize: “It’s great for our products going forward, because the Nokia Asha 302 we launched yesterday is pretty much the successor to C3 which has had an amazing run in the market.” See here.
Nokia Asha 302: Premium All Round QWERTY [nokia YouTube channel]
Nokia Asha 302 http://nokia.ly/x5m2zm is a QWERTY phone with great value for money. It is packed with a 1 GHz processor and is great for social networking, email and instant messaging, supports Mail for Exchange and has a premium design with stunning looks.
Nokia Asha 203: Simply touch, connect and play [nokia YouTube channel]
The Nokia Asha 203 http://nokia.ly/x78ZBe is a touch phone with a traditional keypad, offering fast and affordable access to the internet, easy access to email and social networks as well as a 40 EA games gift offering. [$81 (€60). Has a 2.4” QVGA display.]
Nokia Asha 202 Dual SIM: Simply touch, connect and play [nokia YouTube channel]
The Nokia Asha 202 http://nokia.ly/yOGbDA is a touch phone with a traditional keypad, offering fast and affordable access to the internet, easy access to email and social networks as well as a 40 EA games gift offering. Plus it comes with Easy Swap Dual SIM. [$81 (€60). Has a 2.4” QVGA display.]
Exactly a year after the announcement of their new strategic set-up and direction, it is quite obvious from all of the above that Nokia is well on its way to realizing the corresponding transition. In fact, the company is redefining itself, as is well captured by this video published just two days before the start of MWC 2012:
The New Essence of Nokia [nokia YouTube channel]
We believe that everybody can have a richer, fuller life every day, everywhere. That means upgrading an ordinary moment to an exciting one or finding an unexpected experience to share with others. Intuitively, fast and easy. This is Nokia’s new mantra, this is the new essence of Nokia.
I see this overall brand message fitting rather well with their new and enhanced portfolio, as you can judge for yourself from the above video presentations. In this way they have come a long way from the disastrous situation they were in a year ago, which was described quite extensively in the following post on this blog: Be aware of ZTE et al. and white-box (Shanzhai) vendors: Wake up call now for Nokia, soon for Microsoft, Intel, RIM and even Apple! [Feb 21 – March 25, 2011].
Details for Samsung
- Samsung’s new GALAXY Tab 2 (7.0) offers optimal multimedia experiences in life [Samsung global news, !! on Samsung Forum 2012 in Europe, Feb 13, 2012 !!]
- Samsung’s new GALAXY Tab 2 series offers optimal multimedia experiences in life [Samsung Mobile Press release on MWC 2012]: the content is different, with detailed product specifications
- Samsung GALAXY Beam [global Samsung microsite]: “Beam Projector Smartphone”
- Share the Fun with Samsung GALAXY Beam [Samsung Mobile Press release on MWC 2012 with product specification included]
- Game On With Samsung GALAXY S WiFi 4.2 [Samsung Mobile Press release on MWC 2012]: “Samsung introduces GALAXY S WiFi 4.2, the best of Android experiences with powerful gaming on the go. This experience is further enhanced with the device’s superior SoundAlive audio system, offering great sound either through a front stereo speaker or headphones. A gyroscope sensor enables the user to control the device by moving it, providing truly engaging and intuitive gaming.”
- Boost your Creativity and Productivity with GALAXY Note 10.1 [Samsung global news, on MWC 2012], also as a Samsung Mobile press release
- Samsung Enriches SAP’s Mobile Offerings with Android Devices [Samsung Mobile Press release on MWC 2012]: “Samsung today announced that SAP AG has selected Samsung’s GALAXY S II and Galaxy Tab 10.1 for internal use. Today’s announcement marks the first time that Android-based smartphones and tablets have been deployed by SAP to help enhance employee productivity.”
- Related new product pages on Samsung Mobile Press:
– GALAXY Tab 2 (7.0)
– GALAXY Tab 2 (10.1)
– GALAXY Beam
– GALAXY S WiFi 4.2
– GALAXY Note 10.1
- MWC 2012: The Android arms race is heating up: Case in point, Samsung
- Samsung Galaxy S III full specs [leaked]: 1.5GHz quad-core, 1080p display, ceramic case
- UPDATE: Chinese closing gap in hardware [Korea JoongAng Daily, Feb 29, 2012]: “Samsung decided not to reveal its third-generation Galaxy S smartphone at this year’s MWC due to fears it could give Chinese handset makers an advantage, insiders say. Samsung showcased its Galaxy S II smartphone at last year’s event, and launched it commercially three months later. Chinese companies are believed to have taken cues from the device, and they subsequently made strides in developing their own gadgets. To avoid a repeat this time round, Samsung plans on revealing the latest Galaxy S smartphone at the same time that it drops it on the market.”
- MWC 2012: Samsung’s Galaxy Tab 2, Galaxy Beam, Galaxy Note 10.1
- MWC 2012: Samsung Galaxy Note 10.1 unveiled
- Video: [GALAXY Beam] First Hands-on Video [SAMSUNGmobile YouTube Channel]
- all 1080+ news items for Feb 27-28 on: Samsung “Galaxy S III”»
- all 879+ news items for Feb 27-28 on: Samsung “Galaxy Tab 2″»
- all 720+ news items for Feb 27-28 on: Samsung “Galaxy Beam”»
- all 651+ news items for Feb 27-28 on: Samsung “Galaxy Note 10.1″»
Details for Nokia
All the launches: Nokia at Mobile World Congress [Nokia Conversations blog]
BARCELONA, Spain – Nokia announces six new phones and an array of new and updated services, advancing its new strategy and setting the pace for 2012.
Here’s our star-studded line-up for Barcelona 2012.
Nokia Lumia 610
The Nokia Lumia 610 is our most affordable Windows Phone to date – and the fourth we’ve brought to market. It’s aimed at young people who want access to a smartphone experience at the right price. Offering access to social networking, games, Nokia Maps and navigation, web-browsing and Nokia Music, the Lumia 610 comes in four bright colours. It will cost just €189 [$254] before taxes and subsidies, and starts shipping in April.
Nokia Lumia 900
First announced in January for AT&T’s LTE network in the US, the Nokia Lumia 900 will now be available worldwide in an HSPA+ edition. The Dual Carrier HSPA phone will allow for downloads of up to 42.2 Mbps. With a 4.3-inch ClearBlack AMOLED display, mobile media never looked so good, while an upgraded battery means there’s no compromise on longevity.
[Lumia 900 [DC-HSPA variant] $645 (€480) according to the press release]
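As a rough back-of-the-envelope check on that 42.2 Mbps figure: Dual Carrier HSPA (DC-HSDPA) bonds two adjacent 5 MHz carriers, each peaking at about 21.1 Mbit/s with 64QAM, so the theoretical peak simply doubles. The 700 MB file size below is an assumed example, and real-world throughput is far below the theoretical peak:

```python
# Dual Carrier (DC-HSPA) bonds two adjacent 5 MHz HSPA+ carriers.
# A single carrier with 64QAM peaks at roughly 21.1 Mbit/s, so
# bonding two of them doubles the theoretical peak downlink rate.
single_carrier_mbps = 21.1
carriers = 2
peak_mbps = carriers * single_carrier_mbps
print(f"Peak downlink: {peak_mbps} Mbit/s")  # 42.2

# Best-case time to pull an assumed 700 MB video at that peak
# (megabytes * 8 bits/byte, divided by megabits per second):
download_seconds = 700 * 8 / peak_mbps
print(f"Best case for 700 MB: {download_seconds:.0f} s")
```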
Nokia 808 PureView
The Nokia 808 PureView extends our leadership in camera phones, with an amazing 41-megapixel sensor, Carl Zeiss optics and brand new pixel over-sampling technology. This means pin-sharp pictures, great low-light performance, yet with the ability to save your images in a suitable file size for social media, MMS and email. Also watch out for full 1080p video recording and exclusive Dolby Headphone technology to enrich the sound of any stereo content.
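As a rough illustration of the idea (not Nokia's actual algorithm, which the PureView whitepaper linked below describes), pixel over-sampling can be sketched as averaging blocks of sensor pixels into one output "super-pixel": the file shrinks to a shareable size while random sensor noise is suppressed, since averaging N samples cuts uncorrelated noise by roughly the square root of N:

```python
import numpy as np

def oversample(sensor, factor):
    """Downscale a high-resolution sensor image by averaging each
    factor x factor block of pixels into one output pixel."""
    h, w = sensor.shape
    h -= h % factor  # trim edges that don't fill a whole block
    w -= w % factor
    blocks = sensor[:h, :w].reshape(h // factor, factor, w // factor, factor)
    return blocks.mean(axis=(1, 3))

# A synthetic noisy "sensor": a flat scene plus random read noise.
rng = np.random.default_rng(0)
scene = 100.0 + rng.normal(0, 10, size=(64, 64))

small = oversample(scene, 4)  # 64x64 -> 16x16, averaging 16 pixels each
print(scene.std(), small.std())  # noise drops roughly 4x (sqrt of 16)
```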
[The Nokia 808 PureView has a current price of €450 [$605]. It will be hitting stores in Q2 2012. – according to a press report]
Nokia Asha 302, 202 and 203
We’re also introducing three new Nokia Asha mobile phones with new capabilities to bring them to smarter heights than ever. Aimed at urban consumers across the world, the Nokia Asha 302, 202 and 203 offer more than ever in terms of work and play. The Asha 302 is a QWERTY phone with support for Microsoft Exchange synchronisation, a first for Series 40 phones. The Asha 202 and 203 bring touch screens to a lower price point than ever and come with a massive entertainment bundle.
[Asha 202/203 $81 (€60), Asha 302 $128 (€95) according to the press release]
Super Services
Not satisfied with six new phones, there’s a whole raft of new and improved services. Nokia Drive for Windows Phone will now offer full, offline maps and turn-by-turn navigation. In addition, there’s Nokia Reading, the best e-book experience for Nokia Lumia. And Nokia Life brings life skills, parenting, education, agriculture and entertainment services to Series 30 and Series 40 phones in India, China, Indonesia and Nigeria.
Click through for all the in-depth stories from today’s press conference. We’ll be bringing you even more detail, hands-on experiences and interviews with the brains behind these beauties over the course of the week.
Nokia 808 PureView
- Nokia 808 PureView – revolutionary camera technology; great smartphone [Nokia Conversations blog]
- Nokia 808 PureView ushers in a revolution in smartphone imaging [Nokia press release]
- Nokia PureView imaging technology [Nokia whitepaper, 10 pp.]
- Nokia 808 PureView creates a stir at MWC [Nokia Conversations blog]
- Nokia 808 PureView is star of the booth [Nokia Conversations blog]
- Nokia 808 PureView – The next breakthrough in photography [Nokia Europe microsite]: How it works – It’s a smart smartphone – link to the Product page
- MWC 2012: Nokia’s surprising 41MP Symbian phone
- MWC 2012: Nokia unveils “PureView”, stunning 41-megapixel smartphone
- Nokia 808 PureView tech specs and official photos: “The Nokia 808 PureView has a current price of €450 [$605]. It will be hitting stores in Q2 2012.”
- Video: Nokia 808 PureView [Hands-on][41-Megapixel Camera][MWC 2012][HD]
- MWC 2012: Nokia 808 PureView Ushers in a Revolution in Smartphone Imaging with 41MP Camera (VIDEO)
- all 128+ news items for Feb 27-28 on: Nokia Symbian “41 Megapixel” Camera »
Nokia Lumia 610 and 900 [DC-HSPA variant]
- New Nokia Lumia 610 and Nokia Lumia 900 [DC-HSPA variant] for all [Nokia Conversations blog]
- Nokia expands Lumia experience to new price points and geographies [Nokia press release]: Lumia 610 $254 (€189), Lumia 900 [DC-HSPA variant] $645 (€480)
- Nokia and Microsoft: 12 months on [Nokia Conversations blog]
- Product pages [Nokia Europe]:
– Lumia 900
– Lumia 610
- Data sheet: Lumia 900
- MWC 2012: Nokia Shows Low-Cost Lumia 610 Smartphone
- MWC 2012: Nokia Lumia 610, Skype Beta for Windows Phone, Tango
- Video: Nokia Lumia 610 demo
- all 1640+ news items for Feb 27-28 on: Nokia “Lumia 610” »
- all 2250+ news items for Feb 27-28 on: Nokia “Lumia 900” »
Nokia Asha 302, 202 and 203
- Asha Monday at Mobile World Congress[Nokia Conversations blog]
- Nokia expands its Asha range with smarter feature phones that improve ways to work, learn and play [Nokia press release]: Asha 202/203 $81 (€60), Asha 302 $128 (€95)
- Product pages [Nokia Europe]:
– Asha 202 with dual SIM and touch screen [2.75G connectivity]
– Asha 203 with easy web browsing on an intuitive touchscreen [2.75G connectivity]
– Asha 302 with fast 3.5G connectivity in an elegant design
- Nokia Browser for Series 40 [Nokia data sheet]: “Even faster and better for more affordable access to the internet. Now available on Nokia Asha 202, Nokia Asha 203 and Nokia Asha 303”
- Nokia Asha for business [Nokia UK page]: “Mail for Exchange … compatible phones include the Nokia Asha 302 and Nokia Asha 303”
- Nokia launches three new Asha phones
- MWC 2012: Nokia Focuses on Developing World
- Hands-on: Nokia Asha 202, 203 and 302 at MWC 2012
- Video: Nokia Asha 302 Hands-on Review
Super Services
- New Nokia Services at Mobile World Congress [Nokia Conversations blog]:
– [Nokia Maps updated for Windows Phone: make the experience simpler, with fewer intrusive objects and signs to get in the way of where you want to go – and a reduced colour palette to allow the brain to process information more easily]
– Nokia Drive for Windows Phone: full, offline maps and turn-by-turn navigation
– [Nokia Public Transport now available as an app on the Nokia Lumia: to plan fast inner-city routes from point to point, and work out your time of arrival]
– Nokia Reading: the best e-book experience for Nokia Lumia
– Nokia Life: life skills, parenting, education, agriculture and entertainment services to Series 30 and Series 40 phones in India, China, Indonesia and Nigeria
- UPDATE: New Maps, Drive and Transport in depth [Nokia Conversations blog, March 1, 2012]
- UPDATE: Mapping the new digital world [Nokia Conversations blog, Feb 28, 2012]
- UPDATE: Bing Maps and Nokia Release Unified Map Design [Bing Maps Blog, Feb 28, 2012]
- UPDATE: Nokia Transport Beta [NokiaBetaLabs, Feb 29, 2012]
- Nokia Services debut at MWC 2012
- MWC 2012: Nokia Reading, Nokia Transport, Nokia Maps and Nokia Drive 2.0/3.0 demoed
- Nokia updates Drive app, intros Nokia Reading and Nokia Transport
- Nokia Drive upgrade, new Nokia Reading and Groupon deal announced
- Nokia announces Reading, Transport and Life services
- Nokia Reading: Get gripped by a great book [Nokia Conversations blog]
- Nokia announces Reading app for Lumia phones
- Nokia Drive 2.0, Reading, and Transport apps coming to Lumia range
- Nokia Drive 3.0: first look at future features
- all 518+ news items for Feb 27-28 on: Nokia Drive »
- all 315+ news items for Feb 27-28 on: Nokia Reading »
- all 812+ news items for Feb 27-28 on: Nokia Life »
- all 323+ news items for Feb 27-28 on: Nokia Maps »
- all 258+ news items for Feb 27-28 on: Nokia Transport »
AMD 2012-13: a new Windows 8 strategy expanded with ultra low-power APUs for the tablets and fanless clients
February 3, 2012 2:31 pm / 3 Comments on AMD 2012-13: a new Windows 8 strategy expanded with ultra low-power APUs for the tablets and fanless clients
AMD Strategy Transformation Brings Agile Delivery of Industry-Leading IP to the Market [AMD press release, Feb 2, 2012]
At its annual Financial Analyst Day, AMD (NYSE: AMD) detailed a new “ambidextrous” strategy that builds on the company’s long history of x86 and graphics innovation while embracing other technologies and intellectual property to deliver differentiated products.
AMD is adopting an SoC-centric roadmap designed to speed time-to-market, drive sustained execution, and enable the development of more tailored customer solutions. SoC design methodology is advantageous because it is a modular approach to processor design, leveraging best practice tools and microprocessor design flows with the ability to easily re-use IP and design blocks across a range of products.
“AMD’s strategy capitalizes on the convergence of technologies and devices that will define the next era of the industry,” said Rory Read, president and CEO, AMD. “The trends around consumerization, the Cloud and convergence will only grow stronger in the coming years. AMD has a unique opportunity to take advantage of this key industry inflection point. We remain focused on continuing the work we began last year to re-position AMD. Our new strategy will help AMD embrace the shifts occurring in the industry, marrying market needs with innovative technologies and become a consistent growth engine.”
Roadmap Updates Focus on Customer Needs
Additionally, AMD today announced updates to its product roadmaps for AMD Central Processing Unit (CPU) and Accelerated Processing Unit (APU) products it plans to introduce in 2012 and 2013. The roadmap modifications address key customer priorities across form factors including ultrathin notebooks, tablets, all-in-ones, desktops and servers with a clear focus on low power, emerging markets and the Cloud.
AMD’s updated product roadmap features second generation mainstream (“Trinity”) and low-power (“Brazos 2.0”) APUs for notebooks and desktops; “Hondo,” an APU specifically designed for tablets; new CPU cores in 2012 and 2013 with “Piledriver” and its successor “Steamroller,” as well as “Jaguar,” which is the successor to AMD’s popular “Bobcat” core. In 2012, AMD plans to introduce four new AMD Opteron™ processors. For a more in-depth look at AMD’s updated product roadmap, please visit http://blogs.amd.com.
Next-generation Architecture Standardizes and Facilitates Software Development
AMD also provided further details on its Heterogeneous System Architecture (HSA), which enables software developers to easily program APUs by combining scalar processing on the CPU with parallel processing on the Graphics Processing Unit (GPU), all while providing high bandwidth access to memory at low power. AMD is proactively working to make HSA an open industry standard for the developer community. The company plans to hold its 2nd annual AMD Fusion Developer Summit in June 2012.
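HSA itself is a hardware and runtime specification rather than a programming API, but the scalar-plus-parallel model it targets can be sketched in miniature. In this hypothetical illustration, NumPy's vectorized operation stands in for a data-parallel GPU kernel, and the point about HSA's shared memory is noted in the comments:

```python
import numpy as np

# Scalar path: the element-by-element loop a CPU core would run.
def saxpy_scalar(a, x, y):
    out = np.empty_like(x)
    for i in range(len(x)):
        out[i] = a * x[i] + y[i]
    return out

# Data-parallel path: one vectorized "kernel" over the whole array,
# standing in here for the kind of work an HSA runtime would dispatch
# to the GPU. Under HSA's shared memory model, x and y would not need
# to be copied into separate device memory before the kernel runs.
def saxpy_parallel(a, x, y):
    return a * x + y

x = np.arange(1_000_000, dtype=np.float64)
y = np.ones_like(x)

cpu = saxpy_scalar(2.0, x[:8], y[:8])  # small slice: scalar loop is slow
gpu = saxpy_parallel(2.0, x, y)        # whole array in one data-parallel shot
```

The design point HSA argues for is that both paths should see the same memory at the same addresses, so the developer picks whichever unit suits each piece of work without paying a copy cost.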
New Company Structure Strengthens Execution
In conjunction with announcing its restructuring plan in November 2011, AMD has strengthened its leadership team with the additions of Mark Papermaster as senior vice president and chief technology officer, Rajan Naik as senior vice president and chief strategy officer, and Lisa Su as senior vice president and general manager, Global Business Units. These executives will help ensure that sustainable, dependable execution becomes a hallmark of AMD.
Supporting Resources
- Visit the AMD Financial Analyst Day website for webcast replay, presentations, updated roadmap, and more
- Visit AMD Blogs for more details on AMD’s product roadmap changes
- Follow AMD on Twitter at @AMD_Unprocessed
- Like AMD on Facebook
AMD started talking about ‘Trinity’ and ‘Hondo’ last summer. See in Acer repositioning for the post Wintel era starting with AMD Fusion APUs [June 17, 2011]
What AMD could definitely be proud of for 2011 is A “Brazos” Story: The Little Chip That Could (And Then Just Kept On Going) [AMD Fusion blog, Feb 1, 2012]:
In late 2010, AMD shipped its first-ever Accelerated Processing Units (APUs), internally codenamed “Brazos,” which combined the tremendous processing power of graphics and x86 on a single chip.
We had high expectations for the low-voltage “Brazos” APU: great computing, HD, long battery life and DirectX 11 capable graphics, all on a single chip. Yet still we were blown away by the initial industry reception. It was only a year ago we left CES with seven highly-sought after innovation and technology awards for the little product we ultimately named the C- and E-Series APUs, including:
- 2010 PC Magazine Technical Excellence Award
- CES 2011 Design & Engineering Innovations Award
- CHIP China 2010 Highlight Awards
- Computer World China Innovation Award
- Notebooks.com Best Innovation CES 2011
- Popular Mechanics Editors’ Choice Award
- Shopping Guide China Most Advanced Digital Product Award
After CES we should have re-nicknamed “Brazos” the “Little Chip That Could.” And all throughout 2011, “Brazos” kept on chugging. We added the “Best in Show” Award at Embedded Systems Conference and the “2011 Best Choice of Computex TAIPEI Award” to the list of accolades. In the second quarter we sold more than five million C- and E-Series APUs. What a tremendous start to a new way of processing for AMD and the industry.
But “Brazos” kept on impressing, showing up in a variety of form factors – notebooks, netbooks, small desktops and all-in-ones– from top global OEM partners.
So it was no surprise or mistake that we ended 2011 with more than 30 million APUs shipped. It all started with little “Brazos,” which has now earned its place in history as AMD’s fastest ramping platform ever.
John Taylor, Director of Worldwide Product Marketing at AMD
CES 2012 Consumer Showcase Tour [amd, Jan 11, 2012]
AMD Codename Decoder – November 9, 2010 [AMD Business blog]
APU
An APU is an accelerated processing unit, a new generation of processors that combine either low-power or high-performance x86 CPU cores with the latest GPU technology (such as DirectX® 11) on a single die.
Planned for introduction: Q1 2011
…
“Bobcat”
Market: Multiple devices, including notebook ultrathins, HD netbooks and small form factor desktops.
What is it? A sub-one watt capable x86 CPU core that first comes to market in the “Ontario” and “Zacate” Accelerated Processing Units (APU) for mainstream, ultrathin, value, and netbook form factors as well as small form factor desktop solutions. “Bobcat” is designed to be an extremely small, highly flexible, out-of-order execution x86 core that easily can be scaled up and combined with other IP in SoC configurations.
Planned for introduction: Q1 2011
“Brazos”
Markets: Value Mainstream Notebooks, HD Netbooks and Small Form Factor Desktops
What is it? “Brazos” is AMD’s 2011 low-power platform, available with two APUs: “Zacate” – currently planned to be marketed as the E Series – is an 18-watt TDP APU for ultrathin, mainstream and value notebooks as well as desktops and all-in-ones. “Ontario” – currently planned to be marketed as the C Series – is a 9-watt APU for netbooks and small form factor desktops and devices. Both “Brazos” platform APUs include a DirectX® 11-capable GPU.
Planned for introduction: Q1 2011
“Bulldozer”
Market: Server and Client
What is it? A multi-threaded high-performance x86 CPU core contained in the “Zambezi” processor for client PCs and “Interlagos” and “Valencia” processors for servers. Included in the “Scorpius” enthusiast desktop PC platform and “Maranello,” “Adelaide,” and “San Marino” server platforms, “Bulldozer” is designed to be a completely new, high performance architecture that employs a new approach to multithreaded compute performance for achieving advanced efficiency and throughput. “Bulldozer” is designed to give AMD an exceptional CPU option for combining with GPUs in highly scalable, single-chip APU configurations, beginning in 2012 APU designs.
Planned for introduction: Client (1H 2011); Server (2H 2011)
…
“Llano”
Market: Notebooks and Desktops
What is it? Part of the “Sabine” platform, “Llano” is a 32nm APU including up to four x86 cores and a DirectX® 11-capable GPU, primarily intended for performance and mainstream notebooks and mainstream desktops. “Llano” is engineered to deliver impressive visual computing experiences, outstanding performance with low power and long battery life.
Planned for introduction: Mid-2011
…
“Ontario”
Market: Primarily ultrathin notebooks and HD netbooks
What is it? A 9W APU featuring dual or single “Bobcat” x86 cores currently planned to be marketed as the C Series, and primarily intended to serve the low power and highly portable PC markets for netbooks and small form factor desktops and devices.
Planned for introduction: Q1 2011
…
“Zacate”
Market: Notebook/Desktop
What is it? “Zacate” is AMD’s 18W APU designed for the mainstream notebook and desktop market. Zacate will feature low-power “Bobcat” CPU cores and support DirectX 11 technology.
Planned for introduction: Q1 2011
…
More on the 2011 AMD APU story in earlier posts on this blog:
– Acer repositioning for the post Wintel era starting with AMD Fusion APUs [June 17, 2011]
– Supply chain battles for much improved levels of price/performance competitiveness [Aug 16, 2011]
– Acer & Asus: Compensating lower PC sales by tablet PC push [March 29 – Aug 2, 2011]
– CES 2011 presence with Microsoft moving to SoC & screen level slot management that is not understood by analysts/observers at all [Jan 7, 2011]
– Changing purchasing attitudes for consumer computing are leading to a new ICT paradigm [Jan 5, 2011]
AMD 2012 APU, code name “Trinity” [amd, Jan 11, 2012]
AMD started talking about ‘Trinity’ last summer. See in Acer repositioning for the post Wintel era starting with AMD Fusion APUs [June 17, 2011]
Advanced Micro Devices’ CEO Discusses Q4 2011 Results – Earnings Call Transcript [Seeking Alpha, Jan 24, 2012]
We are seeing particularly strong customer interest in our expanded low-power APUs for 2012. The low-power versions of our next-generation chip, Trinity APU, delivers mainstream performance while using half the power of our traditional notebook processor. This processor fits into an ultrathin notebook design, as thin as 17 millimeters, providing industry-leading visual performance and battery life at very attractive price points. Trinity remains on track to launch for midyear.
…
We achieved record quarter client revenue driven by an increase in supply of Llano APUs. And in Q4 of 2011, APUs accounted for nearly 100% of mobile microprocessors shipped and more than 60% of the total client microprocessors shipped. Microprocessor ASP increased sequentially due to an increase in mobile microprocessor ASP and an increase in server units shipped.
There is no doubt that the customer acceptance of our APU architecture is quite strong. We’ve now shipped over 30 million of these APUs to date. And we’re seeing a strong uptake in terms of that architecture, what it means to the customer. They are looking for a better experience, and I think that’s a key reason why we’ve seen the momentum in our business and the ability to deliver on that. Our focus on execution around the APUs and around Llano is definitely paying off. And I think as we move forward, we should be able to continue to build on that momentum.
…
We’ve actually increased our Llano 32-nanometer product delivery by 80% from the third quarter, and now Llano makes up almost 60% of the mobile microprocessing revenue. … We’re going to continue to build on the strong relationships that we’ve been developing with GLOBALFOUNDRIES as we move forward.
…
The movement to thin and light is nothing new. Customers want mobility. And the idea of ultrathin is something that we’re very focused on. And if you think about it with our APU strategy that I mentioned, with the next-generation product, Trinity APU, we already are well ahead of the pace last year when we set a record-setting year for design wins with the Trinity product in 2012. With that product, we can deliver ultrathin in the range of 17 millimeters. And what’s really important and I think we have to all focus on is ultrathin and mobility, the ability for computing to reach customers across the planet. … And I’ll add that the improvements that we’ve made in Trinity in both our CPU and the GPU are really delivering outstanding results in performance per watt. So as well for the ultrathins being able to hit the 17-millimeter low-profile, we’re also getting a doubling of the performance per watt. So it’s an exciting application of our APU technology.
…
… as you think of the industry trends around consumerization, cloud and convergence, there’s no doubt, as we’ve seen these kinds of inflection points in the industry, there’s always a significant downward pressure in terms of the price points. So if you’re dragging huge asset base along with you and there comes pressure into the market around those price points, that could put pressure into their [Intel’s] — into a business model. … We think the emerging market and the entry — and the high-growth markets around entry and mainstream will be the hottest segment, and I think that’s playing to our hand. We’re going to emphasize this strategy. We want to embrace this inflection point that’s emerging. We want to accelerate it, because shift happens when there are these inflection points.
Of course, we see the investment of our competitor, but the fabless ecosystem is not sitting still. And if you look at the investments that are done on their side – at TSMC, and at a GLOBALFOUNDRIES-and-alliances level – then the numbers are very comparable. GLOBALFOUNDRIES and their partnership models invest about $9 billion this year. TSMC is at around $6 billion, if I recall the number correctly. So this, in terms of scale and absolute numbers, is very comparable to what Intel is putting on the table.
… I feel pretty good about where we are in terms of the transition around 32 nm. … And I want to emphasize, we’ve made real progress, but we’re not finished with that. And we need to continue to work every day with those tiger teams we’ve put in place. We’re tracking the test vehicles through the lines to make sure that we’re getting that consistent improvement, because that will reduce our consumption of wafers and give us far more flexibility in our supply chain. So while we have improved by 80% from the third quarter, we’re not all the way there yet … there’s more yield improvements possible on that 32-nanometer line. … And those same techniques and practices that the teams — the tiger teams applied on 32-nanometer, that momentum continues in the 28-nanometer. And so that poises us well going into the coming 2012.
… I think it’s fair to say from the improvements we have seen and the — and our foundry partners that we are not going to be supply-constrained in the first quarter. … I think the progress we have seen on Trinity has impressed us. And of course, all the learnings that have been done on 32-nanometer with the Llano product will be transferred to Trinity. So the start-off pace with Trinity is going to be significantly better from a yield perspective compared to where we were at Llano launch. So that makes us quite optimistic looking forward.
…
Here are also a couple of illustrations highlighting that 2011 APU success with the details of new APU strategy additions from Lisa Su’s (Senior Vice President and General Manager, Global Business Units) presentation for the 2012 Financial Analyst Day held on February 2, 2012 (see her full presentation in PDF):
APUs BRING LEADERSHIP GRAPHICS/COMPUTE IP TO MAINSTREAM [#10]
2011: AMD first to introduce heterogeneous computing to mainstream applications
“Llano” APU offers nearly 3X the performance in the same power envelope over conventional CPUs (2)
Fully leverages the growing ecosystem of GPU-accelerated apps
Source: AMD Performance labs
(1) Testing performed by AMD Performance Labs. Calculated compute performance or Theoretical Maximum GFLOPS score for 2013 Kaveri (4C, 8CU) 100w APU, use standard formula of (CPU Cores x freq x 8 FLOPS) + (GPU Cores x freq x 2 FLOPS). The calculated GFLOPS for the 2013 Kaveri (4C, 8CU) 100w APU was 1050. GFLOPs scores for 2011 A-Series “Llano” was 580 and the 2013 [2012] A-Series “Trinity” was 819. Scores rounded to the nearest whole number.
(2) Testing performed by AMD Performance Labs. Calculated compute performance or Theoretical Maximum GFLOPS score (use standard formula of CPU Cores x freq x 8 FLOPS) for conventional CPU alone in 2011 was 210 GFLOPs while the calculated GFLOPs for the 1st Gen APU using standard formula (CPU Cores x freq x 8 FLOPS) + (GPU Cores x freq x 2 FLOPS) was 580 or 2.8 times greater compute performance.
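The footnote formula is simple arithmetic, so it can be checked directly; here is a minimal Python sketch (the core counts and clock speeds passed to the function below are illustrative placeholders, not AMD's published configurations):

```python
def theoretical_gflops(cpu_cores, cpu_ghz, gpu_cores=0, gpu_ghz=0.0):
    """AMD's footnote formula:
    (CPU cores x freq x 8 FLOPS) + (GPU cores x freq x 2 FLOPS)."""
    return cpu_cores * cpu_ghz * 8 + gpu_cores * gpu_ghz * 2

# The footnotes give 210 GFLOPS for the 2011 conventional CPU alone and
# 580 GFLOPS for the first-generation "Llano" APU, i.e. roughly 2.8x:
print(round(580 / 210, 1))  # -> 2.8

# Applying the formula to a hypothetical 4-core 2.9 GHz CPU paired with
# 400 GPU cores at 0.6 GHz (placeholder numbers):
print(round(theoretical_gflops(4, 2.9, 400, 0.6), 1))  # -> 572.8
```

Note this is a theoretical peak, not a measured benchmark: it simply sums the maximum FLOPS per clock across all CPU and GPU cores.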
Related new codenames (from the AMD provided At-a-Glance Codename Decoder [Feb 2, 2012]):
“Trinity” APU (Traditional Notebooks, Ultrathin Notebooks and Desktops)
- “Trinity” is AMD’s second generation APU and improves the power and performance of AMD’s A-Series APU lineup for mainstream and high-performance notebooks and desktops. “Trinity” will feature next-generation “Piledriver” CPU cores and new, DirectX® 11-capable, second generation AMD Radeon™ HD 7000 series graphics.
- New for 2012, AMD will offer a BGA or pin-less format, low power “Trinity” APU specifically designed for ultrathin notebooks.
- Planned for introduction: Mid-2012
“Piledriver” Core Micro Architecture
- “Piledriver” is the next evolution of AMD’s revolutionary “Bulldozer” core architecture.
- The “Trinity” line-up of APUs will be the first introduction of “Piledriver.”
“Kaveri” APU (Notebooks and Desktops)
- “Kaveri” is AMD’s third generation APU for mainstream desktop and notebooks.
- These APUs will include “Steamroller” cores, and new HSA-enabling features for easier programming of accelerated processing capabilities.
- Planned for introduction: 2013
“Steamroller” Core Micro Architecture
- “Steamroller” is the evolution of AMD’s “Piledriver” core architecture.
AMD OPTERON™ FUTURE TECHNOLOGY [#26]
Additional new codename (from the AMD provided At-a-Glance Codename Decoder):
“Excavator” Core Micro Architecture
- “Excavator” is the evolution of AMD’s “Steamroller” core architecture.
APU ADOPTION: RECORD DESIGN WINS, STRONG END-USER DEMAND [#11]
Shipped > 30m APUs to date
11 of the world’s top 12 OEMs shipping AMD APU-based platforms
“Brazos” APUs shipped more units in its first year than any previous mobile platform in AMD history
“Llano” APUs ramped to represent nearly 60% of mobile processor revenue by Q4 2011
Additional new codenames (from the AMD provided At-a-Glance Codename Decoder):
“Southern Islands” Discrete Graphics
- Internal codename for the entire family of desktop graphics ASICs based on Graphics Core Next architecture and utilizing 28nm process technology.
- “Southern Islands” products include “Tahiti” (AMD Radeon™ HD 7900 series), “Pitcairn,” “Cape Verde” and “New Zealand.”
“Brazos 2.0” APU (Essential Desktop and Notebook, Netbook, All-In-One and Small Desktop)
- The “Brazos 2.0” family of APUs will follow “Brazos”, AMD’s fastest ramping platform ever.
- In addition to increased CPU and GPU frequencies, “Brazos 2.0” will offer additional features and functionality as compared to “Brazos”.
- Planned for introduction: H1 2012
“Hondo” APU (Tablet)
- “Hondo” is AMD’s sub-5W APU designed for tablets. “Hondo” will feature low-power “Bobcat” CPU cores and support DirectX® 11 technology in a BGA or pin-less format.
- Planned for introduction: H2 2012
AMD started talking about ‘Hondo’ (as well as ‘Trinity’) last summer. See in Acer repositioning for the post Wintel era starting with AMD Fusion APUs [June 17, 2011]

(3) Projections and testing developed by AMD Performance Labs. Projected score for 2012 AMD Mainstream Notebook Platform “Comal” on the “Pumori” reference design for PC Mark Vantage Productivity benchmark is projected to increase by up to 25% over actual scores from the 2011 AMD Mainstream Notebook Platform “Sabine”. Projections were based on AMD A8/A6/A4 35w APUs for both platforms.
(4) Projections and testing developed by AMD Performance Labs. Projected score for the 2012 AMD Mainstream Notebook Platform “Comal” on the “Pumori” reference design for 3D Mark Vantage Performance benchmark is projected to increase by up to 50% over actual scores from the 2011 AMD Mainstream Notebook Platform “Sabine”. Projections were based on AMD A8/A6/A4 35w APUs for both platforms.
(5) Testing performed by AMD Performance Labs. Battery life calculations using the “Pumori” reference design are based on average power draw across multiple benchmarks and usage scenarios. For Windows Idle, calculations indicate 732 minutes (12:12 hours) as a resting metric; 421 minutes (7:01 hours) of DVD playback of a Hollywood movie, 236 minutes (3:56 hours) of Blu-ray playback of a Hollywood movie, and 205 minutes (3:25 hours) using 3D Mark ’06 as an active metric.
Projections for the 2012 AMD Mainstream Platform Codename “Comal” assume a configuration of “Pumori” reference board, Trinity A8 35W 4C – highest performance GPU, AMD A70M FCH, 2 x 2G DDR3 1600, 1366 x 768 eDP Panel / LED Backlight, HDD (SATA) – 250GB 5400rpm, 62Whr Battery Pack and Windows 7 Home Premium.
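The battery-life figures in footnote (5) follow from dividing the 62 Whr pack capacity by the average power draw of each scenario; a quick sketch of that arithmetic (the per-scenario wattage is back-calculated here for illustration, AMD did not publish it):

```python
def runtime_minutes(battery_whr, avg_watts):
    """Battery life = energy capacity / average power draw, in minutes."""
    return battery_whr / avg_watts * 60

# Back-calculate the average draw implied by the 732-minute Windows Idle
# figure for the 62 Whr "Pumori" battery pack:
idle_watts = 62 / (732 / 60)
print(round(idle_watts, 2))  # -> 5.08

# Sanity check: that draw reproduces the quoted 12:12-hour idle figure.
print(round(runtime_minutes(62, idle_watts)))  # -> 732
```

The same division gives roughly 8.8 W for the DVD-playback scenario and about 18 W under the 3D Mark ’06 load, which is consistent with a 35 W APU running well below its TDP on average.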
Additional new codenames (from the AMD provided At-a-Glance Codename Decoder):
“Sea Islands” Graphics Architecture
- New GPU Architecture and HSA Features
- Planned for introduction: 2013
“Kabini” APU (Essential Desktop and Notebook, Netbook, All-In-One and Small Desktop)
- The “Kabini” APU is AMD’s second generation low-power APU and follow-on to “Brazos 2.0.”
- In addition to new “Jaguar” cores, these APUs will be enhanced with new Heterogeneous Systems Architecture (HSA), enabling features for easier programming of accelerated processing capabilities.
- Planned for introduction: 2013
“Temash” APU (Tablet and Fanless Client)
- The “Temash” APU is AMD’s second generation tablet APU and follow-on to “Hondo.”
- In addition to new “Jaguar” cores, these APUs will be enhanced with new Heterogeneous Systems Architecture-enabling features for easier programming of accelerated processing capabilities.
- Planned for introduction: 2013
“Jaguar” Core Micro Architecture
- “Jaguar” is the evolution of AMD’s “Bobcat” core architecture for low-power APUs.
MOBILE MARKET PROJECTIONS [#29]
AMD Direction:
Focus on true productivity and user experience in ultra-low power devices
Leadership graphics, web applications and video processing leveraging APUs
Agile, flexible SoC designs
Ambidextrous solutions across ISAs and ecosystems
Fanless, sealed designs
These APU related strategic moves have been summarized by the same John Taylor as Strengthening our Client Roadmap [AMD Fusion blog, Feb 2, 2012]:
Roadmaps signify our plans to customers and business partners, outlining the new products and technologies that we are bringing online. In an ideal world plans would never change. But in reality, change is a certainty in the tech industry – new form factors emerge, technologies and applications shift and consumer tastes remake technology plans.
Like any technology company, AMD desires to anticipate change in the industry. So we course-correct as we work with customers to ensure that we create products that address the optimal blend of timing, features and performance, cost and form factors.
Today at our Financial Analyst Day in Sunnyvale, AMD senior staff detailed how AMD will focus its investments in R&D and marketing going forward, including roadmaps for 2012-2013. As Phil Hughes summarized, the announced roadmaps are designed to extend platform longevity, accelerate time to market and enhance performance and features. These roadmaps strengthen AMD’s ability to make the most of shifting market dynamics, all the while giving stand-out experience across device categories through our graphics and video IP. This blog provides some insight into our 2012 and 2013 roadmaps – the words in quotes are the codenames for the particular AMD processor offerings discussed today.
2012 Client Roadmap
AMD’s “Brazos 2.0” Accelerated Processing Unit (APU) family will be used for essential desktop and notebook, netbook, tablet, all-in-one and small desktop form factors. This allows us to address a fast-growing segment of the PC market where we have proven success with the original “Brazos” line-up – the C-Series, E-Series and Z-Series APUs. We will add plenty of new features to the “Brazos 2.0” APU family, including increased CPU and GPU performance, longer battery life, a bevy of integrated I/O options and improvements to AMD Steady Video technology. “Brazos 2.0” is scheduled to hit the market in the first half of 2012.
As we demoed at CES, AMD’s “Trinity” APU for desktop and notebook remains on track for introduction in mid-2012, with plans to pack up to four “Piledriver” CPU cores and next-generation DirectX® 11-capable graphics technology, together delivering up to 50% more compute performance than our “Llano” offerings, including superior entertainment potential, longer battery-life and an even more incredibly brilliant HD visual experience.
New for 2012, AMD will introduce a low voltage “Trinity” APU that will be ideal for the next-generation of ultrathin notebook. This “Trinity” APU matches the experience enabled by the AMD 2011 APU in up to half the TDP. As we said, “Trinity” is on track for introduction in mid-2012.
In 2012 we will also introduce the ultra-low voltage “Hondo” APU for tablets. These low-power (power maxes out at 5W TDP) APUs will have “Bobcat” CPU cores and support DirectX 11 technology in a BGA or pin-less, thin processor package. Look for these in the second half of 2012 – more details to come later.
On the desktop platform side of things, the “Vishera” CPU will replace the “Komodo” CPU for desktop. This change enables accelerated time to market for improved performance and next-generation CPU features while maintaining the existing AM3+ motherboards. The “Vishera” CPU ushers in many exciting updates, including 8 “Piledriver” cores, and when compared with the previous generation, provides higher frequencies, improved instruction per clock performance, advanced instruction sets (thus increasing application performance), additional DDR3 memory support and next-generation AMD Turbo Core Technology. We plan to launch “Vishera” in the second half of 2012.
2013 Client Roadmap
2013 brings major evolution to the client roadmaps as the vision presented by Rory, Mark and Lisa today begins to manifest – including moving our low power APUs to a system on a chip (SoC) design with the AMD Fusion Controller Hub integrated right into a single chip design.
In the performance APU category our third-generation APU, “Kaveri,” will employ “Steamroller” (the evolution of AMD’s “Piledriver” core architecture) x86 cores for enhanced instructions per clock and power advantages. Applications that take advantage of GPU acceleration will give users an amazing experience thanks to our Graphics Core Next and new Heterogeneous Systems Architecture (HSA) enabling features for easier programming of accelerated processing capabilities.
In the low power category, the “Kabini” SoC APU takes over for “Brazos 2.0.” This second generation low power APU integrates “Jaguar” x86 cores for augmented performance and energy efficiency. These APUs will also benefit from select HSA features and functionality.
We keep on innovating for the ultra-low power space in 2013. Our second generation, ultra-low-power “Temash” SoC APU will follow “Hondo” for tablet and other fanless form factors. This APU will also leverage the “Jaguar” low-power x86 cores and HSA features.
We at AMD strongly believe these roadmap updates help us time new product introductions with customer design phases to hit key sales cycles across a range of form factors and experiences. We are moving with the market and on the path to deliver exceptional productivity and user experience in a wide array of form factors.
John Taylor, Director of Worldwide Product Marketing at AMD
He also provided the following answers to questions regarding how AMD spells out Windows 8 tablet strategy [CNET, Feb 2, 2012]:
…
Q: Before we go to Windows 8, what is your smartphone strategy, if any?
Taylor: The smartphone market is eight, nine, ten, maybe a dozen players. [They have] lower ASPs (average selling price), lower [profit] margins, different competitive dynamic. So, there is no shift on the smartphone strategy.
Q: And Windows 8?
Taylor: But you will see much more focus on tablets, the convertible or hybrid devices that fit between tablets and notebooks, very thin [designs].
Q: What chips exactly will get you there?
Taylor: For tablets, it will decidedly be the Hondo chip. We’re acknowledging that we still have a couple of watts to shave off to really be a more ideal tablet platform (to achieve optimal power efficiency). But we think that Temash gets us much, much closer to that in 2013.
Q: And Windows 8 convertibles?
Taylor: A 17-watt [power consumption] is the lowest that we’ll offer. That’s called Trinity. It will be unmatched in that [17-watt design] space. Discrete graphics-like performance. All types of dedicated video processing capabilities, better battery life than the competition. And all of these ways that we’re driving the new generation of accelerated applications. If you think about the Web apps that are being built for Win 8, using HTML5 and the graphics engine that drives that higher level experience. …
I will add to that the following two illustrations from the AMD Product and Technology Roadmaps [AMD FAD, Feb 2, 2012]:
“Vishera” CPU (Desktop)
- The “Vishera” desktop CPU incorporates up to eight “Piledriver” cores, advanced instruction sets and other performance enhancing additions
- This next-generation CPU will maintain the AM3+ infrastructure.
- Planned for introduction: H2 2012
In addition to the above described expansion of the original APU strategy for the clients, there is also a naming change: AMD Fusion System Architecture is now Heterogeneous Systems Architecture [AMD Fusion blog, Jan 18, 2012]:
Since its introduction to the public in June 2011 at the AMD Fusion11 Developer Summit, the AMD Fusion System Architecture (FSA) has received widespread support and interest from our business partners and technology industry leaders. FSA was the blueprint for AMD’s overarching design for utilizing CPU and GPU processor cores as a unified processing engine, which we are making into an open platform standard. This architecture enables many benefits, including high application performance and low power consumption.
Our software partners are already taking advantage of the power and performance advantage of APU and GPU acceleration, with more than 200 accelerated applications shipped to date. The combination of industry standards like OpenCL and C++ AMP, alongside FSA, is ushering in the era of heterogeneous computing.
Together with these software partners, we have built a heterogeneous compute ecosystem that is built on industry standards. As such, we believe it’s only fitting that the name of this evolving architecture and platform be representative of the entire technical community that is leading the way in this very important area of technology and programming development.
FSA will now be known as Heterogeneous Systems Architecture or HSA. The HSA platform will continue to be rooted in industry standards and will include some of the best innovations that the technology community has to offer.
Manju Hegde and I will be hosting a breakout session on HSA at AMD’s Financial Analyst Day on February 2nd 2012, which will be webcast live here. More information on the latest advances in HSA design will be released at a future date.
Also, if you haven’t already made plans to attend the AMD Fusion12 Developer Summit in June 2012 in Bellevue, Washington, I encourage you to save the date. Leaders from the technology and programming development communities will converge at the summit to discuss Heterogeneous Computing and the next-generation user experiences that are enabled by this platform.
Phil Rogers, corporate fellow at AMD.
From the Analyst Day breakout session presentation I will include the following illustrations here as food for thought and further interest:
For Windows 8-related HSA, “C++ AMP” (indicated on the last illustration) is worth expanding on via Introducing C++ Accelerated Massive Parallelism (C++ AMP) [MSDN Blogs, June 15, 2011]:
A few months ago, Herb Sutter told about a keynote he was to deliver today at the AMD Fusion Developer Summit (happening these days). He said then:
“Parallelism is not just in full bloom, but increasingly in full variety. We know that getting full computational performance out of most machines—nearly all desktops and laptops, most game consoles, and the newest smartphones—already means harnessing local parallel hardware, mainly in the form of multicore CPU processing. (…) More and more, however, getting that full performance can also mean using gradually ever-more-heterogeneous processing, from local GPGPU and Accelerated Processing Unit (APU) flavors to “often-on” remote parallel computing power in the form of elastic compute clouds. (…)”
In that sense, S. Somasegar, Senior Vice President of the Developer Division made this morning the following announcement:
“I’m excited to announce that we are introducing a new technology that helps C++ developers use the GPU for parallel programming. Today at the AMD Fusion Developer Summit, we announced C++ Accelerated Massive Parallelism (C++ AMP). (…) By building on the Windows DirectX platform, our implementation of C++ AMP allows you to target hardware from all the major hardware vendors. (…)”
C++ AMP, as Soma tells in his post, is actually an open specification. Microsoft will deliver an implementation based on its Windows DirectX platform (DirectCompute, as Daniel Moth specifies in a later post a few minutes ago).
Daniel added that C++ AMP will lower the barrier to entry for heterogeneous hardware programmability, bringing performance to the mainstream. Developers will get an STL-like library as part of the existing concurrency namespace (whose Parallel Patterns Library – PPL – and its Concurrency Runtime – ConcRT – are also being enhanced in the next version of Visual C++; check references at the end of this post for further details) in a way that developers won’t need to learn a different syntax, nor use a different compiler.
Update (6/16/2011): “Heterogeneous Parallelism at Microsoft”, the keynote where Herb Sutter and Daniel Moth introduced this technology with code and graphic demos is available for on-demand watching.
Update (6/17/2011): Daniel Moth’s session “Blazing-fast Code Using GPUs and More, with C++ AMP” is available as well! Besides, Dana Groff tells what’s new in Visual Studio 11 for PPL and ConcRT.
Pedal to the metal, let’s go native at full speed!
References:
- S. Somasegar’s announcement: http://blogs.msdn.com/b/somasegar/archive/2011/06/15/targeting-heterogeneity-with-c-amp-and-ppl.aspx
- Daniel Moth’s blog post: http://www.danielmoth.com/Blog/C-Accelerated-Massive-Parallelism.aspx
- Herb Sutter’s keynote at the AMD Fusion Developer Summit: http://channel9.msdn.com/Events/AMD-Fusion-Developer-Summit/AMD-Fusion-Developer-Summit-11/KEYNOTE
- Daniel Moth: Blazing-fast Code Using GPUs and More, with C++ AMP (session presented at AMD Fusion Developer Summit): http://channel9.msdn.com/Events/AMD-Fusion-Developer-Summit/AMD-Fusion-Developer-Summit-11/DanielMothAMP
- Announcing the PPL, Agents and ConcRT efforts for Visual Studio 11, by Dana Groff: http://blogs.msdn.com/b/nativeconcurrency/archive/2011/06/16/announcing-the-ppl-agents-and-concrt-efforts-for-v-next.aspx
- AMD Fusion Developer Summit Webcasts: http://developer.amd.com/afds/pages/webcast.aspx
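C++ AMP itself targets the Visual C++ toolchain, but the programming model Daniel Moth describes above (an elementwise kernel applied once per index of an iteration space, as in AMP's parallel_for_each) can be illustrated language-neutrally. Here is a rough Python analogue; the helper name mirrors the AMP entry point but is otherwise my own, not any real API:

```python
from concurrent.futures import ThreadPoolExecutor

def parallel_for_each(extent, kernel):
    """Rough analogue of C++ AMP's parallel_for_each: invoke the kernel
    once per index in a 1-D iteration space, potentially in parallel."""
    with ThreadPoolExecutor() as pool:
        # list() forces all kernel invocations to complete before returning
        list(pool.map(kernel, range(extent)))

# Elementwise vector add, the canonical C++ AMP introductory example:
a = [1.0, 2.0, 3.0, 4.0]
b = [10.0, 20.0, 30.0, 40.0]
c = [0.0] * len(a)

def vector_add_kernel(i):
    c[i] = a[i] + b[i]

parallel_for_each(len(c), vector_add_kernel)
print(c)  # -> [11.0, 22.0, 33.0, 44.0]
```

In real C++ AMP the kernel would be a restrict(amp) lambda over array_view objects so the runtime can stage the data on the GPU; the sketch above only conveys the per-index kernel shape, not the heterogeneous execution.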
With that in mind the upcoming 2012 AMD Fusion Developer Summit will definitely bring quite important updates as promised by the last breakout session illustration:
More on that: Adobe and Cloudera among Keynotes at AMD Fusion12 Developers Summit [AMD Fusion blog, Feb 3, 2012]
Finally, regarding the ‘ambidextrous’ strategy mentioned in the first sentence of the press release:
- ‘ambidextrous’ generally means ‘very skillful and versatile’ coming from ‘able to use the right and the left hand with equal skill’
- it is described in the press release as:
… adopting an SoC-centric roadmap designed to speed time-to-market, drive sustained execution, and enable the development of more tailored customer solutions. SoC design methodology is advantageous because it is a modular approach to processor design, leveraging best practice tools and microprocessor design flows with the ability to easily re-use IP and design blocks across a range of products. …
- and detailed in Mark Papermaster‘s (Senior Vice President and Chief Technology Officer) presentation for the 2012 Financial Analyst Day held on February 2, 2012 (see his full presentation in PDF) via the following illustrations:
as the Go-to-market approach together with ODM / OEM relationships
specifically highlighting the differentiation with it for the datacenter
related to MDC [Multi-DataCenter] workloads and HSA.
But also mentioning it in more generic terms as:
”Flexible around ISA [Instruction Set Architecture]” and
“Flexible around combination of AMD IP and third party IP”
This probably drew the greatest interest and the most questions from the participating analysts, even prompting The Wall Street Journal to report AMD Will Incorporate Others’ Technology in Its Chips [Feb 3, 2012]:
Advanced Micro Devices Inc., the microprocessor maker whose fortunes have long been closely tied to the same technology as bigger rival Intel Corp., is planning a more flexible future.
The company on Thursday said it may pursue what it calls an “ambidextrous” strategy that would allow it to offer chips that include circuitry developed by other companies as well as its own. One obvious option would be low-power microprocessor technology from ARM Holdings PLC that now dominates chip markets for cellphones and tablet computers.
AMD Chief Executive Rory Read, at a meeting with analysts here and in a subsequent interview, stopped short of saying that AMD would definitely add ARM-based technology to its chips in the future. But he noted that the company is laying the technical groundwork for modular chips that could accept blocks of circuitry developed by ARM as well as other companies.
“We have a relationship with ARM, and we will continue to build on it,” Mr. Read said in an interview. “We will continue to evolve that relationship as the market continues to evolve.”
Such possibilities are a sign of how the exploding market for mobile devices is causing many companies to alter their strategies. The x86 design used by AMD and Intel is the foundation of virtually all personal and most server computers.
But the two companies have struggled to make headway in the mobile-device market, in large part because of the lower power consumption of ARM-based designs. Meanwhile, ARM licensees—which include Qualcomm Inc., Texas Instruments Inc. and Nvidia Corp.—are adding to the pressures by edging toward the PC market, as Microsoft Corp. finishes development of a new operating system that supports ARM and x86 chips.
AMD’s management team, in a meeting with analysts here, took pains to dispute the notion that AMD may become marginalized as ARM-powered competitors enter the PC market. Rather, they argued, AMD’s strength in graphics and microprocessors—and a strategy of customizing chips for large customers—will expand AMD’s opportunities.
Indeed, Mr. Read argued, it is Intel’s outsize influence on the tech industry that will tend to decline. “We will see the breakdown of proprietary control points,” Mr. Read said.
Though Mr. Read didn’t commit to embracing ARM’s designs, others who heard his presentation said the direction is clear. “AMD was very deliberate today about their goal to integrate more third-party intellectual property,” said Patrick Moorhead, a former AMD vice president and now principal analyst at Moor Insights & Strategy. “Nothing they communicated excluded the potential for ARM.”
AMD’s remarks also underscore an industry shift—driven largely by the mobile market—away from separate chips and toward multi-function products that the industry calls SoCs, for systems on a chip, which save space and power in mobile devices and other hardware.
Intel and AMD have begun offering SoCs for laptop computers. But AMD discussed extensive plans to create more such products at a faster rate, using a flexible design scheme that can accommodate technology submitted by other companies.
Mr. Read, who previously served as a senior executive at PC maker Lenovo Group Ltd., has recruited others who also worked at IBM and have experience with chip technologies other than x86.
One is Mark Papermaster, AMD’s senior vice president and chief technology officer, who worked at Apple Inc. and Cisco Systems Inc. after leaving IBM in 2008. Another is Lisa Su, a senior vice president and general manager of AMD’s global business units, who most recently worked at Freescale Semiconductor Holdings Ltd., an ARM user.
Ms. Su gave an updated road map for a series of future chips, including products that AMD expects to be used in tablets that are powered by Microsoft’s forthcoming Windows 8 operating system. But Mr. Read said AMD would likely stay away from trying to sell chips for smartphones soon, characterizing the market as too crowded with competitors.
Marvell SoCs to win both Microsoft and Nokia for Windows Phone and Windows 8 platforms (after the Kinect success)
February 1, 2012 1:13 pm / 2 Comments on Marvell SoCs to win both Microsoft and Nokia for Windows Phone and Windows 8 platforms (after the Kinect success)
Update: – Marvell licenses VeriSilicon DSP cores [Feb 13, 2012]
SAN FRANCISCO—Marvell Technology Group Ltd. has signed a licensing agreement for VeriSilicon Holdings Co. Ltd.’s ZSP G3 intellectual property cores, including the dual-MAC ZSP800M and ZSP880M synthesizable DSP cores, VeriSilicon said Monday (Feb. 13). Financial terms of the deal were not disclosed.
Marvell is also using VeriSilicon’s quad-MAC ZSP800 core and suite of HD-audio software solutions in the ARMADA 1000 HD media processor SoC and the recently introduced Marvell ARMADA 1500 media processor SoC, VeriSilicon (Santa Clara, Calif.) said. These chips are designed for applications such as Blu-ray players, digital media adapters, HD-STB and HDTVs.
According to VeriSilicon, the dual-MAC ZSP architecture offers a balance of high performance, power efficiency and lower cost to support the increasing feature convergence in mobile and digital entertainment products and enable prolonged battery life. The company claims its products offer ease of use and strong customer support.
“We are quite impressed with the area and power efficiency of the dual-MAC ZSP800M core, combined with the ease of programming on the ZSP architecture,” said Ivan Lee, vice president of mobile products at Marvell, in a statement. “VeriSilicon’s ZSP-based HD-audio and voice software solutions will provide us with faster time-to-market advantages necessary to meet the growing demands of the mobile platform solutions for use in tablets and smartphones.”
Marvell’s Cutting-Edge Application Processors [Jan 10, 2012]
From [2:45] the so-called hybrid multiprocessing technology is mentioned, illustrated with the above architecture. It was introduced back in September 2010 with the ARMADA 628 (see: Marvell ARMADA beats Qualcomm Snapdragon, NVIDIA Tegra and Samsung/Apple Hummingbird in the SoC market [again] [Sept 23, 2010 – Jan 17, 2011]), at a time when Marvell was working on the earlier ARMADA 610 (see also in the indicated post) for the RIM BlackBerry PlayBook. Six months into the project RIM dumped the 610 for a TI SoC, but even so was only able to deliver a stable version of its new QNX software with version 2.0, missing the crucial 2010 holiday season. While rumors at the time blamed Marvell for that, according to a current view: “It appears that the failures are largely RIM’s, and often software related. The Marvell processors, when used, seem to work well.“
The first larger scale win for the ARMADA 610 was the VIZIO VTAB1008 8″ Android tablet, made available in August 2011 (see: Innovative entertainment class [Android] tablet from VIZIO plus a unified UX for all cloud based CE devices, from TVs to smartphones [Aug 21, 2011 – Jan 7, 2012]). This tablet is also shown in the above video (from [0:19] to [1:24]). The ARMADA 628 has yet to arrive in a tablet, which will probably happen only late in 2012 on Android (as “The company looks at the tablets market as ‘saturated’ and is avoiding it for the next couple quarters“, see below) and might happen in Q4 at the earliest on Windows 8, as hinted explicitly by Marvell below. This is just a possibility (but a very big opportunity for OEMs, considering the obvious maturity of the 628), nothing more, as any OEM engagement currently under way might or might not end up in a market-released product (as in the case of the PlayBook with the ARMADA 610).
Note: in the above video the earlier PXA branding is used by Marvell’s Allen Leibovitch instead of ARMADA. Jack Kang, in charge of the Application Processors business, also uses the PXA branding, as you can read below.
After the First real chances for Marvell on the tablet and smartphone fronts [Aug 21, 2011 – Jan 19, 2012], so far in the Android, Google TV, educational (more edu) and OPhone spaces, here is the next large scale opportunity for the company. With the young and entrepreneurial Jack Kang in charge since H2 CY2010, who has an excellent track record with Microsoft via the hugely successful Microsoft Kinect application SoC effort, there is a real chance for the company to turn the new engagements with both Microsoft and Nokia reported below into platform wins in 2012:
Exclusive: Marvell Says it Will Find a Home in Chinese Windows Phones [DailyTech, Jan 31, 2012]
Marvell also hints at possible Windows 8 tablets/laptops
We had an interesting chat with the Marvell Technology Group, Ltd. (MRVL).
Marvell is perhaps best known as the company that took the Xscale ARM division off of Intel Corp.’s (INTC) hands in 2006. During the modern smartphone era, Marvell has been a quiet competitor, overshadowed by companies like Qualcomm Inc. (QCOM) and Samsung Electronics Comp., Ltd. (KS:005930) which have pushed the smartphone processing power envelope more aggressively.
By contrast Marvell has focused on budget smartphones. It is in most of Research in Motion, Ltd.’s (TSE:RIM) BlackBerry smartphones. These budget smartphones have led it to strong sales in Indonesia and China.
Marvell has done well in China, thanks to close ties with RIM and Nokia.
[Image Source: BlackBerry Rocks]
Interestingly, the American company sees China as perhaps its most valuable market. Jack Kang, director of Marvell’s applications processor business unit, states, “China was a very strategic investment.”
With Windows Phones set to land in China later this year in budget smartphones, Mr. Kang is making a bold prediction — “If there’s Windows Phones in China, there will probably be Windows Phones with Marvell in China.”
That would be a major market event as thus far Qualcomm has been the exclusive ARM chipmaker partner of Windows Phone. While Windows Phone has struggled in the U.S. where key Windows Phone partner Nokia Oyj. (HEL:NOK1V) has virtually no market share, in China Nokia is the top smartphone maker, so a switch to Marvell ARM cores would be quite a coup.
Nokia is the top phonemaker in China, thus it’s crucial that Marvell gets into Nokia’s new Chinese Windows Phones when it makes the shift later this year. [Image Source: M.I.C. Gadget]
Mr. Kang feels his firm’s biggest strength is providing “quality low-cost devices”. While it doesn’t bake discrete Wi-Fi circuitry into some of its system-on-a-chip devices, it says this approach works in markets like Indonesia or rural China where there’s plentiful 3G but sparse Wi-Fi coverage.
Marvell currently produces single and dual-core chips, with the smartphone-aimed ARMADA family. Despite competitors like Qualcomm and NVIDIA Corp. (NVDA) jumping to quad core, Marvell says that approach doesn’t make sense. Mr. Kang comments, “We don’t think quad core makes sense at 40 [nm] from a power perspective, from a price perspective.”
Marvell’s ARMADA series ARM CPUs power smartphones and mobile devices like the ARM OLPC variant. [Image Source: OLPC.tv]
He says that Marvell is tentatively slotted to release quad-core designs when it hits 28 nm in mid-2013. The chipmaker uses Taiwan Semiconductor Manufacturing Comp., Ltd.’s (TPE:2330) third-party fabrication services. TSMC has struggled at the 28 nm node, delivering low yields and in turn higher costs — a combination that doesn’t work with Marvell’s business model — hence the delay.
Marvell feels that the fact that it takes its ARM license and builds a unique core from the ground up using the ARM instruction set gives it an advantage over competitors like NVIDIA that simply take the core licensed from ARM Holdings plc (LON:ARM), but don’t do a complete redesign.
The company looks at the tablets market as “saturated” and is avoiding it for the next couple quarters, although it did seem distraught at losing RIM’s PlayBook to Texas Instruments Inc. (TXN), another U.S. chipmaker.
Mr. Kang hinted Marvell may jump on the tablet bandwagon or even release budget ARM laptops in Q4 2012 when Windows 8 arrives — and with it the first-ever ARM CPU support for a Windows main line operating system. He comments, “Microsoft already said Windows 8 will run on ARM. And we build ARM devices, so….”
Marvell hints it may be cooking up ARM Windows 8 tablets/laptops, too.
This move would make sense because Marvell has been involved with the One Laptop Per Child (OLPC) project in producing an ARM (Marvell) powered design. It has also played with low cost Linux laptops for years.
The company also showed off an Android 3.2 (“Honeycomb”) television set, which it plans to position as an introduction to Internet TV in budget markets like China. This was a reference design, whereas Marvell would partner with a traditional TV maker for production designs.
The Honeycomb set uses Marvell’s latest dual-core chip, which contains an extra low-power core to conserve energy during simpler tasks. The power savings approach mirrors that found in Tegra 3. In that sense Marvell’s dual-core is technically a tri-core, much as NVIDIA’s quad-core is technically a penta-core.
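The companion-core arrangement described above can be sketched as a simple load-based core selection. This is purely illustrative: the thresholds and core names below are made up, and this is not Marvell’s (or NVIDIA’s) actual power-management logic:

```python
# Illustrative sketch of the companion-core idea: run light workloads
# on a slow, efficient core and wake the fast cores only under load.
# All names and thresholds are hypothetical, for illustration only.

LOW_POWER_CORE = "lp-core"      # slow but power-efficient companion core
FAST_CORES = ["core0", "core1"]  # full-speed cores

def pick_cores(cpu_load):
    """Return the cores to keep online for a given load (0.0 to 1.0)."""
    if cpu_load < 0.2:
        # near-idle: the companion core alone handles background tasks
        return [LOW_POWER_CORE]
    # busy: power up the fast cores and park the companion core
    return FAST_CORES

print(pick_cores(0.05))  # ['lp-core']
print(pick_cores(0.90))  # ['core0', 'core1']
```

The point of the design is that idle-time tasks (standby video buffering, notifications) never wake the power-hungry cores at all, which is where most of the battery or standby-power savings come from.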
There could indeed be a real 2012 opportunity for Marvell as Nokia CEO Stephen Elop highlighted in an answer to questions about the Quarter 4 results last week (Nokia Quarter 4 results 2011 webcast [Nokia, Jan 26, 2012]):
on China dynamics:
… The Chinese operators are increasingly, on an accelerated basis, entering into structures where effectively retail rate plan bundling is going on at the store. The operators are driving very hard for the volume of 3G data subscribers. And this is not necessarily an economic measure as much as it is driving volume on certain networks for certain technologies. I think those targets are probably set more broadly for all of the operators [he could mean: by the state, as all three operators are majority owned by the state]. And the impact of that is that they are discovering that with very low priced devices on certain radio technologies they can drive a lot of volume at those levels. And so we are seeing, for example, a very significant uptake in a number of low-priced devices that are on CDMA; there is also a very significant focus on the Chinese technology TD-SCDMA, again all at the low levels that ought to drive those volumes. My comment in the prepared remarks is that Symbian is not well positioned today against that. We do not have Symbian CDMA products at all, so we are not participating in that part of the market. So as that part of the market grows our addressable market has gone down because of that. In TD-SCDMA we do have some products in that space but not at the price points and configurations that are the real focus of this market. …
… We have not yet announced our specific products for the Chinese market but I will say that when we first announced our launch plans, I think all the way back in October, we did highlight that we would have CDMA based Windows Phone products and TD-SCDMA Windows Phone products. That said, it is the case that we have work to do to successively drive the prices down further and further. That will take a bit of time but this is clearly the pattern you are going to see from us in the months ahead. …
[I have a couple of deep and current analysis on that:
– The new, high-volume market in China is ready to define the 2012 smartphone war [Jan 6, 2012]
– China TD-SCDMA and W-CDMA 3G subscribers by the end of 2011: China Mobile lost its original growth momentum [Jan 21, 2012]
– China becoming the lead market for mobile Internet in 2012/13 [Dec 1, 2011]]
High performance SOC handles HD media [Jan 6, 2012]
The ARMADA 1500 HD media SOC decodes high-definition advanced multi-format video and audio using its dual ARMv7-compatible PJ4B 1.2 GHz processors with symmetric multi-processing and DSP accelerators. The chip targets IP/cable/satellite set-top boxes, advanced Blu-ray players, digital media adapters, Google TV, and DTV applications.
The SOC’s processors yield 6,000 DMIPS. The chip includes a secure boot ROM and USB, Fast Ethernet, HDMI, SATA, and SDIO interfaces, plus a 32-bit DDR3-800 memory interface. Its security engine handles OTP, RNG, AES, (3)DES, RSA, SHA-1 and MD5, and a comprehensive software development kit is available. (No price given – available now.)
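As a quick sanity check on the quoted performance figure, here is a back-of-envelope calculation. It assumes (this is an assumption, not stated in the announcement) that the 6,000 DMIPS is the aggregate of both 1.2 GHz cores:

```python
# Back-of-envelope check of the ARMADA 1500 DMIPS figure.
# Assumption (not from the announcement): 6,000 DMIPS is the
# combined figure for both cores, scaling linearly with clock.

CORES = 2
CLOCK_MHZ = 1200        # dual PJ4B cores at 1.2 GHz
TOTAL_DMIPS = 6000      # figure quoted above

dmips_per_core = TOTAL_DMIPS / CORES        # 3000 DMIPS per core
dmips_per_mhz = dmips_per_core / CLOCK_MHZ  # 2.5 DMIPS/MHz

print(f"{dmips_per_core:.0f} DMIPS per core, {dmips_per_mhz:.2f} DMIPS/MHz")
```

The resulting 2.5 DMIPS/MHz per core is in the range typical of ARMv7 application cores of that era, which makes the aggregate reading of the 6,000 DMIPS figure plausible.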
See also my other posts regarding the other high volume opportunities for Marvell:
– Marvell® ARMADA® PXA168 based XO laptops and tablets from OLPC with $185 and target $100 list prices respectively [Jan 8, 2012]
– Google’s revitalization of its Android-based TV effort via Marvell SoC and reference design [Jan 5, 2012]
(the VIZIO VAP430 Stream Player, introduced below, is likely based on that)
– VIZIO’s two pronged strategy: Android based V.I.A. Plus device ecosystem + Windows based premium PC entertainment [Jan 11, 2012]
Background on Marvell’s relationship with Microsoft
A Cal ‘Kinect-ion’ [Innovations by UC Berkeley College of Engineering, Nov 9, 2011]
Some engineers wait a lifetime for a project like the one that Jack Kang (B.S.’04 EECS) landed when he was barely 26.
In the fall of 2008, Kang was settling into a new marketing position [Technical Marketing Manager] at Marvell, a Santa Clara-based semiconductor company, when Microsoft came knocking with a mysterious assignment for the company. Working on an undisclosed product, the computing giant needed a team to design a complex chip for manufacture on a massive scale.
“This project was very secretive,” recalls Kang, who had shifted from hands-on chip design to marketing management at Marvell. Marvell got the Microsoft contract, but “we didn’t really know what it was for,” says Kang. Many months into the development of a specialized microprocessor—often touted as a system’s “brains”—he got his answer. The mystery chip was destined for Kinect, Microsoft’s controller-free and immensely popular electronic game sensor device.
Introduced last November, Kinect uses sophisticated visual and voice recognition to run electronic games, movies and other entertainment. A companion to Microsoft’s Xbox 360 video gaming system, it became the fastest-selling consumer electronics gadget in history, selling 8 million devices in 60 days.
Kinect’s appeal came as no surprise to Kang. “It was a giant leap,” he says of the technology that lets users interact with media through body motions and voice commands. In fact, when Kang first learned about Kinect, he was so dazzled by the concept that he wondered if it could actually be pulled off.
His work on the Kinect chip spanned two years. Acting as the project champion in a “do-whatever-it-takes” capacity, Kang managed the effort from the earliest negotiations through a series of designs to manufacturing. In all, more than 100 Marvell chip designers, marketing representatives, software engineers and others participated in a process that witnessed its share of evolutionary curveballs.
For the first six months, the Marvell team focused on what Kang believes would have been one of the most powerful mobile or consumer chips on the market. Shortly after the chip was completed, Microsoft asked for an even higher performing version. But the company soon switched course, deciding to put more of the computing functions into the Xbox instead of Kinect, Kang says.
Ultimately, Marvell engineers were asked to build a general purpose chip capable of controlling voice recognition and sending data to the Xbox. The team wound up modifying a chip already in development. That chip, as it turned out, was one that Kang had helped design in his earlier capacity as a Marvell engineer.
PHOTO BY ABBY COHN
Excited by his role in unleashing Kinect, Kang sees many possibilities for human-machine interaction. “We’re just at the tip of the iceberg of what this device can do,” he says, envisioning future Kinect systems that help the disabled and the elderly, and play a role in medical treatment and procedures.
Beyond Kinect’s intended use for home entertainment, the $150 system has already triggered a flood of creative applications for its cameras, 3-D sensing and other features. At UC Berkeley, graduate student Patrick Bouffard installed a Kinect on a small four-rotor robotic helicopter to enable it to sense its height above the floor and detect objects in its way. Other concepts have included video-conferencing, surveillance and a navigational aid for the blind.
With his boyish smile and animated personality, Kang, now 29, is at least a decade younger than most of his professional peers. He has developed 11 patents, mostly in the field of CPU (central processing unit) technology. “Everything I needed to know I learned in CS152!” he quips. Kang took that computer architecture and engineering class at Berkeley Engineering and became a teaching assistant his senior year.
Born in Taiwan and raised in the South Bay, Kang was drawn to a career at the intersection of engineering and business. “I felt you could have more of an impact,” he says. At Berkeley, he minored in business administration and was powerfully influenced by his experience as a TA. Hired as a Marvell engineer in February 2006, he was increasingly tapped to showcase company products in technical presentations for clients. “I had the mindset of marketing,” says Kang, who also enjoyed the social interaction that came with it.
Twice promoted since 2008, Kang now serves as director of Marvell’s application processor business unit. Today, with a 12-member staff, Kang manages Marvell product lines for e-readers, gaming, education, tablets and other devices. Long gone is a work schedule with room for lunchtime volleyball and soccer games. “There’s always someone up in some time zone,” Kang observes.
Kang is eager for the next project of Kinect-like proportions to come his way. “Technology is always evolving,” he says. “I certainly hope I have something that beats it.”
Marvell: Lazard Says Buy On Kinect, TD-SCDMA Opportunities [Tech Trader Daily, June 20, 2011]
… Lazard Capital Markets analyst Daniel Amir raised the stock to Buy from Neutral …
Marvell’s sales of chips into China’s home-brewed TD-SCDMA cellular network standard, which is being developed by China Mobile (CHL) and backed by the government, are perhaps underestimated by the Street.
Marvell could produce $90 million in revenue this year from those chip sales, and $151 million next year, but it could actually go as high as $202 million next year, he thinks. The Street has just $80 million modeled for this year, on average.
Moreover, the company’s sales into Microsoft’s (MSFT) “Kinect” gaming accessory are “opening new doors” for Marvell in the mobile and wireless business, he thinks, which may help Marvell catch up after missing earlier tablet and smartphone bids. Kinect will probably produce $104 million in revenue for Marvell this year, up from $64 million last year, on Kinect units of 16 million, Amir thinks.
[Microsoft Reports Record Revenue of $20.9 Billion in Second Quarter [Microsoft press release, Jan 19, 2012]: “The Xbox 360 installed base now totals approximately 66 million consoles and 18 million Kinect sensors”]
…
Teardown: Kinect has processor after all [EE Times, Nov 15, 2010]
Despite Microsoft Corp.’s claims to the contrary, its new Kinect motion-gaming add-on for the Xbox 360 uses a standalone applications processor marketed by Marvell Technology Group Ltd., according to a teardown analysis of the Kinect performed by UBM TechInsights.
TechInsights’ teardown uncovered within Kinect a Marvell PXA 168 applications processor, a part usually found in notebook computers. In September, Microsoft reportedly said it decided not to use a dedicated processor in Kinect. Instead, the company reportedly said the peripheral would harness the power of the processor within the Xbox.
Microsoft (Redmond, Wash.) did not immediately respond to request for comment about the discrepancy.
TechInsights analysts concluded that Microsoft’s head fake means the company has bigger plans to make Kinect more of a platform for applications beyond gaming, or that the company was simply trying to prevent the device from being hacked. The Kinect has reportedly already been hacked multiple times.
The analysts also believe that Microsoft may have underestimated the resource demand on the 360 console processor and was forced into using a laptop-equivalent processor to integrate the imaging, sensing, motor-drive and control functions and orchestrate I/O and communications between the Kinect and Xbox 360. It’s also possible that the processor was required to support the spatial aspects of Kinect’s multiple microphones, they said.
“It’s difficult to identify exactly what the Marvell processor accomplishes on the Kinect as investigation on how the firmware and software manage all control and processing functions and how they could be localized/virtualized to the Xbox haven’t been investigated yet,” said Allan Yogasingam, a technical marketing manager at TechInsights. “Regardless, Microsoft has created a product that takes full advantage of all its components to provide an innovative gaming experience. The existence of this Marvell processor just opens the door for further innovation down the line and an extension of the Kinect from more than just a sensor-based gaming accessory.”
TechInsights also conducted further study on the sensor unit that works with Kinect’s image processor, made by PrimeSense Ltd. The firm discovered that the CMOS image sensors used were provided by Aptina Imaging (the die markings on the sensors still refer to Micron Imaging, which was spun off into Aptina in 2008). The infrared camera uses the MT9M001 sensor and RGB input from the color camera features the MT9M112 sensor, TechInsights said.
Close up of the Marvell PXA 168 applications processor found inside Kinect.
Source: UBM TechInsights.
TechInsights’ recent teardown of Kinect found chips made by PrimeSense, Marvell, Texas Instruments Inc., STMicroelectronics NV and others. The firm estimates that Kinect carries a bill-of-materials of roughly $56 for the components, not including the price of design, R&D and the $500 million Microsoft plans to spend to market the device.
Teardown of the Microsoft Kinect – Focused on Motion Capture [Chipworks, Dec 23, 2010]
…
Application processor: An ARMADA series 800 MHz application processor by Marvell was also inside the Microsoft Kinect. Interestingly, this device is typically aimed at the e-reader market.
Marvell-88AP1-BJD2
…
Why did MS dump Kinect processor? There was ‘no need’ for it [ComputerAndVideoGames.com, Sept 29, 2010]
Camera tracks fewer points than it did last year
It emerged in January that MS had ditched a standalone processor in the camera – which some have claimed has subsequently affected performance.
Kinect now relies on the processing power of the Xbox itself – although the platform holder has claimed that it uses “less than one per cent” of the 360’s motherboard.
“We didn’t know how much processing Kinect was going to take at the start of development,” Kinect creative boss Kudo Tsunoda told the new Xbox World 360.
“Obviously you don’t want to lose any of the things that are important to Xbox customers. Graphic fidelity is something that Xbox has always been known for, and you want to make sure that you still hit that level.
“Forza is a graphical showpiece, and we had Forza with Kinect at E3… the graphic fidelity has actually improved in some areas from what they shipped with Forza 3. It’s still running at 60 FPS and it’s supporting Kinect, so there’s just no need to have that extra processor.”
When asked why Kinect detected less points on the player’s body than it did last year, Tsunoda added:
“As you start building the stuff, you’re like: ‘Wow, to track everything in the human body we can do less points.’ That’s just normal game development. Anything you do with games, you want the processing power to be used as efficiently as possible to get the experience that you want.”
Kinect launches in the UK on November 10 and the US on November 4.
Microsoft drops internal Natal chip [Jan 7, 2010]
GamesIndustry.biz has learned that Microsoft has dropped a chip from its forthcoming Natal motion control system as the platform holder eyes accessible price points in the build-up to release later this year.
Kinect Downgraded To Save Money, Can’t Read Sign Language [Kotaku, Aug 11, 2010]
The patent for Microsoft’s motion-sensing camera Kinect suggested that the device could understand American Sign Language. Well, it can’t. At least, the version going on sale in November can’t.
Responding to the claims made in the patent, Microsoft has told Kotaku “We are excited about the potential of Kinect and its potential to impact gaming and entertainment. Microsoft files lots of patent applications to protect our intellectual property, not all of which are brought to market right away. Kinect that is shipping this holiday will not support sign language.”
So why did the patent suggest it could? Well, sources close to the evolution of Kinect’s development tell us it’s because the version of the hardware that’ll be available later this year isn’t as capable as was originally intended.
The original Kinect had a much higher resolution (over twice that of the final model’s 320×240), and as such, was able to not only recognise the limbs of a player as the current model version can, but their fingers as well (which the current version can’t). And when the hardware could recognise fingers, it would have been able to read sign language.
But that capability came at a cost, and while Microsoft had always intended Kinect to sell for $150, “dumbing down” the camera would have meant that Microsoft wouldn’t be losing as much money on each unit sold, an important point should Kinect prove to be a failure. So dumb it down they did, reducing the camera’s resolution (which in turn reduced the number of appendages it’d have to track) and placing the burden for some of the device’s processing on the console and not Kinect’s own hardware.
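The impact of halving the camera’s resolution can be put in rough numbers. Only the 320×240 figure comes from the report above; the 640×480 original resolution, the 30 fps frame rate and the 16 bits per depth sample are assumptions for illustration:

```python
# Rough data-rate comparison for the Kinect depth stream before and
# after the downgrade. Only 320x240 is from the report; the 640x480
# original, 30 fps rate and 16 bits/pixel are assumed for illustration.

FPS = 30
BITS_PER_PIXEL = 16  # depth sample padded to 2 bytes (assumed)

def mbits_per_second(width, height):
    """Raw (uncompressed) stream bandwidth in megabits per second."""
    return width * height * BITS_PER_PIXEL * FPS / 1e6

original = mbits_per_second(640, 480)  # ~147 Mbit/s
shipped = mbits_per_second(320, 240)   # ~37 Mbit/s

print(f"original: {original:.0f} Mbit/s, shipped: {shipped:.0f} Mbit/s")
print(f"reduction: {original / shipped:.0f}x")
```

Quartering the pixel count quarters the raw bandwidth; since Kinect also has to push an RGB stream and multi-channel audio over the same USB 2.0 link, a raw full-resolution depth stream would have consumed a sizable share of the practical bus budget, which is consistent with the downgrade story.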
This probably isn’t the first time you’ve heard such a rumour, but this latest time at least explains why Kinect can’t read sign language!
We’ve reached out to Microsoft for comment on the matter, and will update if we hear back.
Background on Jack Kang
Jack Kang, Director, Application Processors at Marvell [LinkedIn profile, excerpted, Feb 1, 2012]
Current
- Director, Application Processors at Marvell Semiconductor, Inc.
Past
- Technical Marketing Manager at Marvell Semiconductor, Inc.
- Logic Design Engineer at Marvell Semiconductor, Inc.
- Design Engineer at Eureka Technology
Education
- University of California, Berkeley
- University of California, Berkeley – Walter A. Haas School of Business
Jack is currently director of Marvell’s Application Processor Business Unit. He has been in the semiconductor business for more than seven years, holding previous positions in design engineering at several leading technology vendors. At Marvell, Mr. Kang manages multiple product lines from design conception to mass market implementation and adoption. These include the industry-leading PXA168, PXA618 and PXA510 processors, which are fueling today’s premier consumer devices.
Additionally, he oversees various market segments, including education, eReaders, gaming, tablets and other connected consumer and embedded devices. Most recently, Mr. Kang was responsible for the processor design powering Microsoft’s gaming console, Microsoft Kinect. This gaming console shattered sales records and was named the fastest-selling tech gadget of all time by the Guinness Book of World Records – totaling more than 10 million units since its launch in November 2010.
[Steve Ballmer, Houston Technology Forum, March 10, 2011: “We shipped those in November. We just announced that we’re over 10 million sold, in what amounts to about two-and-a-half months.”]
Outside of his work at Marvell, Mr. Kang also serves as a technical expert on CPU technology and has more than 11 patents pending in the field of CPU technology. He holds a degree in Electrical Engineering and Computer Science from the University of California, Berkeley, with an emphasis in Computer Architecture.
Jack Kang, Patents and Publications [LinkedIn page, excerpted, Feb 1, 2012]
Jack Kang’s Patents
- United States Patent 7,870,372
- Issued January 11, 2011
Inventors: Jack Kang, Hsi-Cheng Chu, Rich, Yu-Chi Chuang
Method and apparatus for idling and waking threads by a multithread processor
- United States Patent 7,904,703
- Issued March 8, 2011
Inventors: Jack Kang, Rich, Yu-Chi Chuang
Multi-thread processor with multiple program counters
- United States Patent 7,941,643
- Issued May 10, 2011
Inventors: Jack Kang, Rich, Yu-Chi Chuang
Methods, apparatuses, and system for facilitating control of multiple instruction threads
- United States Patent 7,757,070
- Issued July 13, 2010
Inventors: Jack Kang, Hsi-Cheng Chu, Rich, Yu-Chi Chuang
Multithread processor with thread based throttling
- United States Patent 7,886,131
- Issued February 8, 2011
Inventors: Jack Kang
Instruction dispatching method and apparatus
- United States Patent 7,904,704
- Issued March 8, 2011
Inventors: Jack Kang, Rich, Yu-Chi Chuang
Methods and apparatus for handling switching among threads within a multithread processor
- United States Patent 8,032,737
- Issued October 4, 2011
Inventors: Jack Kang, Hsi-Cheng Chu
Event-based bandwidth allocation mode switching method and apparatus
- United States Patent 8,046,775
- Issued October 25, 2011
Inventors: Jack Kang, Rich, Yu-Chi Chuang
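Several of the patents above concern hardware multithreading: per-thread program counters, thread switching, idling and throttling. As a rough illustration of the general idea only (a software sketch, not the patented mechanisms), a barrel-style processor keeps one program counter per thread and issues instructions round-robin:

```python
# Minimal software sketch of hardware-style multithreading: each thread
# keeps its own program counter, and a round-robin scheduler interleaves
# one instruction per active thread per cycle. Illustrative only; the
# instruction names and classes here are invented for the example.

class HardwareThread:
    def __init__(self, program):
        self.pc = 0              # per-thread program counter
        self.program = program   # list of "instructions" (plain strings)
        self.done = False

    def step(self):
        """Fetch the next instruction, or mark the thread done."""
        if self.pc >= len(self.program):
            self.done = True
            return None
        instr = self.program[self.pc]
        self.pc += 1
        return instr

def run_round_robin(threads):
    """Issue one instruction per active thread per cycle (barrel style)."""
    trace = []
    while not all(t.done for t in threads):
        for tid, t in enumerate(threads):
            instr = t.step()
            if instr is not None:
                trace.append((tid, instr))
    return trace

threads = [HardwareThread(["load", "add"]),
           HardwareThread(["mul", "store", "ret"])]
print(run_round_robin(threads))
```

Keeping a separate program counter per thread is what lets such a processor switch threads every cycle without saving and restoring state, which is the broad territory the patent titles above describe.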
Jack Kang’s Publications
- Berkeley Innovations
- November 28, 2011
Authors: Jack Kang, Abby Cohn
Marvell’s processors for embedded systems – Discussion of the PXA510 processor and the D2Plug developer kit
Mr. Jack Kang of Marvell discusses the PXA510, an ARM v7 based 800 MHz application processor with 512 Kbytes of level 2 cache, and its associated developer kit.
From Dewey to Digital [HigherEdTECH, Jan 6, 2011]
No more pencils?! No more books? No more teachers? On-demand digital content, do-it-yourself learning, new generation learning platforms, and new modes of assessment are disrupting traditional textbooks, grading, courses, and degrees. Is technology really a catalyst for change? Let us count the ways.
Moderator:
Kenneth C. Green, Founding Director, The Campus Computing Project
Panel:
- Sean Devine, Chief Executive Officer, CourseSmart
- Felice Nudelman, Executive Director, Education, The New York Times Company
- William D. Rieders, Executive Vice President of Global New Media, Cengage Learning
- Jack Kang, Director, Application Processor Business Unit, Marvell
Video Records (~10 min each) of the From Dewey to Digital (Jan 6, 2011) panel discussion:
[embedded videos]
Apple’s iPhone has been gaining a lot of traction in China recently. As Apple CEO Tim Cook said during 
In connection with the availability of these breakthrough thin clients, Wyse also announced the results of independent testing, recently conducted by 