
Yearly Archives: 2014

Microsoft Azure: Marketable machine learning components capability for “a new data science economy”, and real-time analytics for Azure HDInsight service

Marketable machine learning components capability for “a new data science economy”
Predictive analytics components on the Azure Marketplace (“APIs”) consisting of predictive models that can plug into Azure Machine Learning as a web service

Introduced this summer and available now in preview, Microsoft Azure Machine Learning helps customers and partners rapidly design, test, automate and manage predictive analytics solutions in the cloud. For example, search engines, online product recommendations, credit card fraud prevention systems, GPS traffic directions and mobile phone personal assistants all use the power of machine learning to provide people with valuable insight.

On October 15th, Microsoft introduced new machine learning capabilities in the Azure Marketplace, enabling customers and partners to access machine learning capabilities as Web services. These include a recommendation engine for adding product recommendations to a website, an anomaly detection service for predictive maintenance or fraud detection, and a set of packages for R, a popular programming language among data scientists. These new capabilities will be available as finished examples for anyone to try.
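To make the "machine learning capabilities as Web services" idea concrete: a published predictive model of this kind is typically invoked as an HTTP POST carrying a JSON payload. The sketch below only builds such a request; the endpoint URL, API key and column names are illustrative assumptions, not a documented contract.

```python
import json

# Hypothetical endpoint and key -- placeholders, not real Azure values.
API_URL = "https://example-workspace.example.net/score"
API_KEY = "YOUR-API-KEY"

def build_scoring_request(rows, columns):
    """Package input rows as a JSON scoring request of the general shape
    such web services accepted at the time (Inputs / GlobalParameters)."""
    body = {
        "Inputs": {
            "input1": {
                "ColumnNames": columns,
                "Values": [[str(v) for v in row] for row in rows],
            }
        },
        "GlobalParameters": {},
    }
    headers = {
        "Content-Type": "application/json",
        "Authorization": "Bearer " + API_KEY,
    }
    return headers, json.dumps(body)

headers, payload = build_scoring_request(
    rows=[[34, 52000], [61, 18000]],
    columns=["age", "income"],
)
print(json.loads(payload)["Inputs"]["input1"]["ColumnNames"])
```

Sending `payload` to the service with any HTTP client and reading back the returned predictions is then the whole integration, which is what makes these components consumable from a website or mobile app.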

Oct 17, 2014:
Joseph Sirosh keynote: “A New Data Science Economy” — Strata + Hadoop 2014

Software and cloud services have given rise to revolutionary new economies – creating new markets for everything from self-published books, music and videos to mobile apps. Only a few years ago, it would have been hard to imagine developers authoring a million apps for smartphones. But that’s history. Cloud-centric economies are permanently changing the way people author and create in the knowledge economy – whether they are authors or developers – and soon, even data scientists. Joseph Sirosh will share his conviction that the next big software economy will be the Data Science Economy – one where data scientists build predictive models and intelligent services that can be published and monetized as easily as apps for the mobile phone.

About Joseph Sirosh:
I am a Corporate Vice President at Microsoft, and head of the Information Management and Machine Learning group. Our talented team of scientists and engineers are developing Cloud ML services and tools to transform data at scale into intelligence. We are taking the wealth of ML capabilities in Microsoft Research and Product Groups and making it available commercially on Azure. Our first-class ML algorithms, services and tooling will help developers build amazing next-generation ML apps in the cloud and help ML become pervasive across a wide range of future scenarios. Prior to Microsoft I worked at Amazon as VP for Global Inventory Platform and CTO of the core retail business and I was VP of R&D at Fair Isaac Corporation before that. I am very passionate about ML and its applications and have been active in the field since 1990.

Real-time analytics for Azure HDInsight service
Enabling real-time predictive analytics in HDInsight with support for Apache Storm clusters

Azure HDInsight cloud analytics service combines the best of Hadoop open source technology with the elasticity and manageability enterprises require. On October 15th, Microsoft announced Azure HDInsight will support Apache Storm clusters in public preview. Storm is an open source project in the Hadoop ecosystem which gives users access to an event-processing analytics platform that can reliably process millions of events. Now, users of Hadoop can gain insights as events happen, in addition to insights from past events. By bringing real-time analytics capabilities to HDInsight, Microsoft is opening up new customer scenarios such as the ability to analyze operational data in real time for predictive maintenance. Storm can also be used with a machine learning solution that has previously been trained by batch processing, such as a solution based on Mahout. However, its generic, distributed computation model also opens the door for stream-based machine learning solutions. For information on real-world scenarios, read how companies are using Storm.

Introducing Apache Storm by Microsoft -- 15-Oct-2014

Apache Storm is a distributed, fault-tolerant, open source computation system that allows you to process data in real time. Storm solutions can also provide guaranteed processing of data, with the ability to replay data that was not successfully processed the first time. HDInsight Storm is offered as a managed cluster integrated into the Azure environment, where it may be used as part of a larger Azure solution. For example, Storm might consume data from services such as Service Bus Queues or Event Hubs, and use Websites or Cloud Services to provide data visualization. HDInsight Storm clusters may also be configured on an Azure Virtual Network, which reduces latency when communicating with other resources on the same Virtual Network and can also allow secure communication with resources within a private datacenter.
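The "guaranteed processing with replay" behaviour described above (Storm acks successful tuples and replays failed ones) can be illustrated, independently of Storm's actual Java/Clojure APIs, as an at-least-once processing loop. This toy Python sketch is only a stand-in for the real topology mechanics:

```python
from collections import deque

def process_stream(events, handler, max_retries=3):
    """At-least-once processing sketch: events whose handler raises are
    re-queued and replayed later, instead of being dropped, until they
    succeed or exhaust their retries (loosely like Storm's ack/fail)."""
    queue = deque((event, 0) for event in events)
    processed, dead = [], []
    while queue:
        event, attempts = queue.popleft()
        try:
            processed.append(handler(event))
        except Exception:
            if attempts + 1 < max_retries:
                queue.append((event, attempts + 1))  # replay later
            else:
                dead.append(event)  # give up after max_retries
    return processed, dead

# A handler that fails the first time it sees the value 2, then succeeds.
seen = set()
def flaky(event):
    if event == 2 and event not in seen:
        seen.add(event)
        raise RuntimeError("transient failure")
    return event * 10

results, dead = process_stream([1, 2, 3], flaky)
print(results, dead)  # [10, 30, 20] [] -- event 2 is replayed after 1 and 3
```

Note the consequence visible in the output: replay changes ordering, which is why at-least-once systems like Storm ask downstream logic to tolerate reordering and duplicates.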

Additionally, Microsoft has teamed up with Hortonworks to deliver hybrid data connectors between on-premises and cloud deployments. On-premises Hadoop customers using Hortonworks Data Platform 2.2 can move data from on-premises Hadoop into Azure HDInsight. This gives every on-premises Hadoop customer elastic cloud access for backup, burst capacity, and test/dev.

All that is a follow-up to:
– Satya Nadella on “Digital Work and Life Experiences” supported by “Cloud OS” and “Device OS and Hardware” platforms–all from Microsoft [this same blog, July 23, 2014] for Azure Machine Learning, Big Data, Cortana, in-memory BI, in-memory data warehousing, Power BI and Power Q&A
– Microsoft BUILD 2014 Day 2: “rebranding” to Microsoft Azure and moving toward a comprehensive set of fully-integrated backend services [this same blog, April 27, 2014] for Azure, Hadoop 2.2, Hadoop infrastructure on Azure, HDInsight, Microsoft Azure, Office 365 and Windows Azure
– An upcoming new era: personalised, pro-active search and discovery experiences for Office 365 (Oslo) [this same blog, April 2, 2014] for machine learning
– The first “post-Ballmer” offering launched: with Power BI for Office 365 everyone can analyze, visualize and share data in the cloud [this same blog, Feb 10, 2014] for Big Data, Business Intelligence, business intelligence models, data insights, insights from data, Power BI as the lead business solution, Power BI for Office 365, Power BI Jumpstart, Power BI Mobile App, Power BI Sites, Q&A, Q&A of Power BI, self-service analytics, self-service BI, self-service business intelligence solution and visualization
– Satya Nadella’s (?the next Microsoft CEO?) next ten years’ vision of “digitizing everything”, Microsoft opportunities and challenges seen by him with that, and the case of Big Data [this same blog, Dec 13, 2013] for analysis of relational and non-relational data, Apache Hadoop, Big Data, Business Intelligence, Data Explorer, data warehousing, digitizing everything, Hadoop, Hadoop integration, HDInsight, join relational and Hadoop cluster tables, Massively Parallel Processing, Microsoft Parallel Data Warehouse, Microsoft PolyBase, Parallel Data Warehouse, PDW, PolyBase, Power BI for Office 365, Power Map, Power Pivot, Power Query, Power View, Q&A, self-service analytics, self-service BI, the next Big Data revolution, tipping point for Big Data and Windows Azure
– Microsoft partners empowered with ‘cloud first’, high-value and next-gen experiences for big data, enterprise social, and mobility on wide variety of Windows devices and Windows Server + Windows Azure + Visual Studio as the platform [this same blog, July 10, 2013] for Azure Data Marketplace, Big Data, Hadoop infrastructure on Azure, Office 365, Power BI for Office 365 Preview and Windows Azure
– BUILD 2012: Notes on Day 1 and 2 Keynotes [this same blog, Oct 31, 2012] for Hadoop

June 30, 2012: Career of the Future: Data Scientist Study Results Infographic by EMC. The explosion in digital data, bandwidth, and processing power – combined with new tools for analyzing the data – has sparked massive interest in the field of data science. Organizations of all sizes are turning to people who are capable of translating this trove of data – created by mobile sensors, social media, surveillance, medical imaging, smart grids, and the like – into predictive insights that lead to business value. Despite the growing opportunity, demand for data scientists is outpacing the supply of talent and will do so for the next five years. Who are data science practitioners, what skills do they need, and why are they so different?

July 8, 2013: Becoming a Data Scientist – Curriculum via Metromap by Swami Chandrasekaran. Data Science, Machine Learning, Big Data Analytics, Cognitive Computing … well, all of us have been avalanched with articles, skills-demand infographics and points of view on these topics (yawn!). One thing is for sure: you cannot become a data scientist overnight. It’s a journey, and for sure a challenging one. But how do you go about becoming one? Where do you start? When do you start seeing light at the end of the tunnel? What is the learning roadmap? What tools and techniques do you need to know? How will you know when you have achieved your goal? Given how critical visualization is for data science, ironically I was not able to find (except for a few) a pragmatic yet visual representation of what it takes to become a data scientist. So here is my modest attempt at creating a curriculum, a learning plan that one can use on this journey to becoming a data scientist. I took inspiration from metro maps and used them to depict the learning path. I organized the overall plan progressively into the following areas / domains: Fundamentals, Statistics, Programming, Machine Learning, Text Mining / Natural Language Processing, Data Visualization, Big Data, Data Ingestion, Data Munging and Toolbox. Each area / domain is represented as a “metro line”, with the stations depicting the topics you must learn / master / understand in a progressive fashion. The idea is that you pick a line, catch a train and go through all the stations (topics) till you reach the final destination (or) switch to the next line. I have progressively marked each station (line) 1 through 10 to indicate the order in which you travel. You can use this as an individual learning plan to identify the areas you most want to develop and then acquire those skills. By no means is this the end; but it is a solid start. Feel free to leave your comments and constructive feedback. PS: I did not want to impose the use of any commercial tools in this plan. I have based this plan on tools/libraries available as open source for the most part. If you have access to commercial software such as IBM SPSS or SAS Enterprise Miner, by all means go for it. The plan still holds good. PS: I originally wanted to create an interactive visualization using D3.js or InfoVis, but wanted to get this out quickly. Maybe I will do an interactive map in the next iteration.

Web Services and Marketplaces Create a New Data Science Economy [Machine Learning Blog from Microsoft, Oct 16, 2014]
This blog post is authored by Joseph Sirosh, Corporate Vice President of Machine Learning at Microsoft.
Yesterday, at Strata + Hadoop World, we announced the expansion of our data services with support of real-time analytics for Apache Hadoop in Azure HDInsight and new machine learning (ML) capabilities in the Azure Marketplace. Today, I would like to expand on the new ML capabilities that we announced and share how this is an important step in our journey to jump-start the new data science economy. I’ll also be speaking more about this in my keynote presentation tomorrow at Strata.
Data scientists and their management are often frustrated by just how little of their work makes it into production deployments. Consider this hypothetical, though not uncommon, scenario. A data scientist and his team are asked to create a new sales prediction model that can be run whenever needed. The data scientists perfect the sales model using the popular statistical modeling language R. The new model is presented to management, who want to get the model up and running right away as a web app and as a mobile client. Unfortunately, engineering is unable to deploy the model as they don’t have R, and the only option is to convert it all to Java – something that will take months to get up and running. So the data scientists end up preparing a batch job to run R code and mail reports on a daily basis, leaving everyone unsatisfied.
Well, now there’s a better way, thanks to Azure Machine Learning.
We built Azure ML to empower data science with all the benefits of the cloud. Data scientists can bring R code and use Microsoft’s world class ML algorithms in our web-based ML Studio. No software installs required for analysis or production – our browser UI works on any machine and operating system. Teams can collaborate in the cloud, share projects, experiment with world-class algorithms and include data from databases or blob storage. They can use enormous storage and compute resources in the cloud to develop the best models from their data, unrestrained by server or storage capacity.
Perhaps best of all, with just one-click, users can publish a web service with their data science code embedded in it. Data transformations and models can now run in a web service in the cloud – fully managed, secure, reliable, available, and callable from anywhere in the world.
These web service APIs can be invoked from Excel, as shown in this video, by using this simple plug-in. Now, instead of emailing reports, users can surprise management with cloud-hosted apps that are built in hours. Engineering can hook up APIs to any application easily and even create custom mobile apps. Users can publish as many web services as they like, test multiple models in production and update models with new data. The data science team just became several times more productive and engineering is happy because integration is so easy.
But wait, there’s still more.
Imagine a data scientist hits upon that perfect idea for an intelligent web service that everyone else in the world should be building into their apps. Maybe it is a great forecasting method, or a new churn prediction technique, or a novel approach to pattern recognition. Data scientists can now build that web service in Azure ML, publish the ML web service on the Azure Marketplace and start charging for it in over one hundred currencies. Published APIs can be found via search engines. Anyone in the world can pay and subscribe to them and use them in their apps.
For the first time, data scientists can monetize their know-how and creativity just as app developers do. When this happens, we start changing the dynamics of the industry – essentially, data scientists are able to “self-publish” their domain expertise as cloud services which can then be made accessible to billions of users via smartphone apps that tap into those services.
The Azure Marketplace already has an emerging selection of such services. In just a couple of weeks, four of our data scientists published over 15 analytics APIs into the marketplace by wrapping functions from CRAN. Among others, these include APIs for forecasting, survival analysis and sentiment analysis.
Our marketplace has much more than basic analytics APIs. For example, we went and built a set of finished end-to-end ML applications, all using Azure ML, to solve specific business needs. These ML apps do not require a data scientist or ML expertise to use – the science is already baked into our solution. Users can just bring their own data and start using them. These include APIs for recommendations, items that are frequently bought together as well as anomaly detection to spot anomalous events in time-series data such as server telemetry.
A similar anomaly detection API is used by Sumo Logic, a cloud-based machine data analytics company. They have collaborated with Microsoft to bring metric-based anomaly detection capability to their customers. Our metric-based anomaly detection perfectly complements Sumo Logic’s structure-based anomaly detection capabilities. Any Sumo Logic query which results in a numerical time-series now has a special “metric anomaly detection” button which sends the pre-aggregated time series data to Azure ML for analysis. The data is then annotated with labels provided by the Azure ML service indicating unusual spikes or level shifts. Sumo Logic is now offering this optional integration in a limited beta release.
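The metric-based anomaly detection described here – flagging unusual spikes in a pre-aggregated numerical time series – can be illustrated with a simple rolling z-score. The actual algorithm behind the Azure ML service is not public, so this is only a stand-in sketch; the window size and threshold are invented for illustration:

```python
from statistics import mean, stdev

def flag_spikes(series, window=10, threshold=3.0):
    """Label each point anomalous if it deviates from the trailing
    window's mean by more than `threshold` standard deviations."""
    labels = []
    for i, value in enumerate(series):
        history = series[max(0, i - window):i]
        if len(history) < 3:
            labels.append(False)  # not enough history to judge
            continue
        mu, sigma = mean(history), stdev(history)
        labels.append(sigma > 0 and abs(value - mu) > threshold * sigma)
    return labels

# Server telemetry with one obvious spike at index 6.
telemetry = [10, 11, 10, 12, 11, 10, 95, 11, 10, 12]
flags = flag_spikes(telemetry)
print(flags)  # only index 6 is flagged
```

A production service would also need to detect level shifts (a sustained change in the mean, which a pure spike detector like this deliberately ignores once the shift enters the trailing window), which is exactly the distinction the Sumo Logic integration draws.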
Third parties too are starting to publish APIs into our marketplace. For instance, Versium, a predictive analytics startup, has published these three sophisticated customer scores, all based on public marketing data – Giving Score (which predicts customer propensity to donate), Green Score (predicts customer propensity to make environmentally conscious purchase decisions) and Wealth Score (helps companies estimate the net worth of customers and prospects). Versium offers these scores by analyzing and associating billions of LifeData® attributes and building predictive models using Azure ML.
Our marketplace also hosts a number of other exciting APIs that use ML, including the Bing Speech Recognition Control, Microsoft Translator, Bing Synonyms API and Bing Search API.
By bringing ML capabilities to the Azure Marketplace and making it easy for anyone to access, we are liberating data science from its confines. This two-minute video recaps how:
This video illustrates how the power of the cloud removes many of the barriers to advanced analytics today. Learn how Microsoft Azure Machine Learning enables deployment of R code [R is a popular programming language used by data scientists] as a web service in minutes and at global scale with the Machine Learning Marketplace. Learn more: http://azure.microsoft.com/en-us/services/machine-learning/
Get going today – sign up for Azure ML and try out some of our easy to use samples.
A new future for machine learning is being born in the cloud.
Joseph
Follow me on Twitter.

Oct 20, 2014: Joseph Sirosh – BigDataNYC 2014 – theCUBE

Microsoft Targets IBM Watson with Azure Machine Learning in Big Data Race [Redmond Magazine, Oct 17, 2014]

Nearly a year after launching its Hadoop-based Azure HDInsight cloud analytics service, Microsoft believes it’s a better and broader solution for real-time analytics and predictive analysis than IBM’s widely touted Watson. Big Blue this year has begun commercializing its Watson technology, made famous in 2011 when it came out of the research labs to appear and win on the television game show Jeopardy.

Both companies had a large presence at this year’s Strata + Hadoop World Conference in New York, attended by 5,000 Big Data geeks. At the Microsoft booth, Eron Kelly, general manager for SQL Server product marketing, highlighted some key improvements to Microsoft’s overall Big Data portfolio since last year’s release of Azure HDInsight including SQL Server 2014 with support for in-memory processing, PowerBI and the launch in June of Azure Machine Learning.

In addition to bolstering the offering, Microsoft showcased Azure ML’s ability to perform real-time predictive analytics for the retail chain Pier One.

“I think it’s very similar,” in terms of the machine learning capabilities of Watson and Azure ML, Kelly said. “We look at our offering as a self-service on the Web solution where you grab a couple of predictive model clips and you’re in production. With Watson, you call in the consultants. It’s just a fundamental difference [in how each] goes to market versus IBM. I think we have a good advantage of getting scale and broad reach.”

Not surprisingly, Anjul Bhambhri, vice president of Big Data for IBM’s software group, disagreed. “There are certain applications which could be very complicated which require consulting to get it right,” she said. “There’s also a lot of innovation that IBM has brought to market around exploration, visualization and discovery of Big Data which doesn’t require any consulting.” In addition to Watson, IBM offers its InfoSphere BigInsights for Hadoop and Big SQL offerings.

As it broadens its approach with a new “data culture,” Microsoft has come on strong with Azure ML, noting it shares many of the real-time predictive analytics of the new personal assistant in Windows Phone called Cortana. Now Microsoft is looking to further broaden the reach of Azure ML with the launch of a new app store-type marketplace where Microsoft and its partners will offer APIs consisting of predictive models that can plug into Azure Machine Learning.

Kicking off the new marketplace, Joseph Sirosh, Microsoft’s corporate VP for information management and machine learning, gave a talk at the Strata + Hadoop conference this morning. “Now’s the time for us to try to build the new data science economy,” he said in his presentation. “Let’s see how we might be able to build that. What do data science and machine learning people do typically? They build analytical models. But can you buy them?”

Sirosh said with Microsoft’s new data section of the Azure Marketplace, marketplace developers and IT pros can search for predictive analytics components. It consists of APIs developed both by Microsoft and partners. Among those APIs from Microsoft are Frequently Bought Together, Anomaly Detection, Cluster Manager and Lexicon Sentiment Analysis. Third parties selling their APIs and models include Datafinder, MapMechanics and Versium Analytics.

Microsoft’s goal is to build up the marketplace for these data models. “As more of you data scientists publish APIs into that marketplace, that marketplace will become just like other online app stores — an enormous selection of intelligent APIs. And we all know as data scientists that selection is important,” Sirosh said. “Imagine a million APIs appearing in a marketplace and a virtuous cycle like this that us data scientists can tap into.”

Also enabling real-time predictive analytics is support for Apache Storm clusters, announced today. Though it’s in preview, Kelly said Microsoft is adhering to its SLAs with the Apache Storm capability, which enables complex event processing and stream analytics, providing much faster responses to queries.

Microsoft also said it would support the forthcoming Hortonworks Data Platform, which has automatic backup to Azure BLOB storage, Kelly said. “Any Hortonworks customer can back up all their data to an Azure Blob in a real low cost way of storing their data, and similarly once that data is in Azure, it makes it real easy for them to apply some of these machine learning models to it for analysis with Power BI [or other tools].”

Hortonworks is also bringing HDP to Azure Virtual Machines as an Azure certified partner. This will bring Azure HDInsight to customers who want more control over it in an infrastructure-as-a-service model, Kelly said. Azure HDInsight is currently a platform as a service that is managed by Microsoft.

Sept 18, 2014: Insider’s Introduction to Microsoft Azure Machine Learning (AzureML)

Microsoft has introduced a new technology for developing analytics applications in the cloud. The presenter has an insider’s perspective, having actively provided feedback to the Microsoft team which has been developing this technology over the past 2 years. This session will 1) provide an introduction to the Azure technology including licensing, 2) provide demos of using R version 3 with AzureML, and 3) provide best practices for developing applications with Azure Machine Learning.


Mark Tabladillo is a Microsoft MVP and SAS expert. He helps teams become more confident in making actionable business decisions through the use of data mining and analytics. Mark provides training and consulting for companies in the US and around the world. He also teaches part-time with the University of Phoenix. He tweets @marktabnet and blogs at http://marktab.net.


Sam Guckenheimer on Microsoft Developer Division’s Journey to Cloud Cadence

Why Cloud Cadence?

  • Customers value regular improvements
  • Maximize investments in what customers want
  • React to changes quickly: market/competitive/regulatory
  • Change engineering dynamic from “validation” to “flow”

OPENING KEYNOTE: Journey to Cloud Cadence by Sam Guckenheimer, July 28, 2014  [Agile2014 by Agile Alliance, July 28 – Aug 1, 2014] ⇒Browse 184 photos, videos and tweets about @samguckenheimer, microsoft, adopting cloud cadence at Agile 2014 (#agile2014), on Seen

ABSTRACT:  Sam describes a ten-year transformation at Microsoft Developer Division from a waterfallian box product delivery cycle of four years to Agile practices enabling a hybrid SaaS and on-prem business, with a single code base, triweekly delivery of new features in the service, and quarterly delivery for on-premise customers. He presents three waves of improvement and learning: first, the reduction of technical debt and other waste to gain trustworthy transparency, second, the increase in the flow of customer value, and third the shortening of cycle time to allow continuous feedback and continuous business improvement.

The current scale of the business is that there are millions of customer accounts each on-premises and in the cloud. This hybrid situation will exist for many years, and is a necessary part of the business.

Sam will discuss both the organizational issues of transformation and give examples from monthly service reviews of key practices and metrics, such as hypothesis-driven development, funnel analysis, performance monitoring, MTTD and MTTR improvement, log analysis, root cause remediation, scale unit replication and canarying, common code base, testing cycles, georeplication, feature flags, compatibility and compliance testing. He will share his thoughts on the lessons learned in moving from a traditional software delivery team to a modern DevOps team.
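Of the service-review metrics Sam mentions, MTTD (mean time to detect) and MTTR (mean time to restore) are simple averages over incident records. A minimal sketch of how such a review might compute them – the incident record fields and the convention of measuring MTTR from detection to restoration are invented here for illustration:

```python
from datetime import datetime

def mean_minutes(incidents, start_key, end_key):
    """Average the (end - start) gap across incident records, in minutes."""
    gaps = [
        (inc[end_key] - inc[start_key]).total_seconds() / 60
        for inc in incidents
    ]
    return sum(gaps) / len(gaps)

# Hypothetical incident records with start / detection / restoration times.
incidents = [
    {
        "started":  datetime(2014, 10, 1, 3, 0),
        "detected": datetime(2014, 10, 1, 3, 10),
        "restored": datetime(2014, 10, 1, 4, 0),
    },
    {
        "started":  datetime(2014, 10, 5, 14, 0),
        "detected": datetime(2014, 10, 5, 14, 30),
        "restored": datetime(2014, 10, 5, 15, 0),
    },
]

mttd = mean_minutes(incidents, "started", "detected")   # 20.0 minutes
mttr = mean_minutes(incidents, "detected", "restored")  # 40.0 minutes
print(mttd, mttr)
```

The point of tracking both is the one the talk makes: driving each number down monthly is what turns "validation" into "flow", since detection and restoration, not development, become the bottlenecks at a triweekly cadence.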

CLICK TO GO TO THE VIDEO LIBRARY OF THE CONFERENCE


A few excerpts from a recent [Oct 17, 2014] InfoQ interview:

… the big news is that – the new realization is that the old view that there were two life cycles, one for development, one for operations has been replaced by the realization that there is one life cycle for both and that has been forced to some extent by the improvement in development which leads to faster delivery of working software to deploy, in part by the public cloud that removes the impediments to deployment, in part by the visibility of the consumer facing web, in part by mobile, in part by the “build – measure – learn” affinity practices which connect business learning to tactical learning and in part by the broader shift to the realization that we can now have hypothesis-driven development instead of a notion of requirements or user-story (or whatever you like) product owner driven development. The idea that you can actually use data to drive what to do next.

… as the hyper-scale cloud vendors like us do go literally worldwide on datacenter location, we become worldwide utilities. I mean, I remember for example – was it 4 years ago that tsunami in Japan? In 2009? – I remember the tsunami hit Japan on a Friday, I think. There were three fiber links to Honshu at that time and because of the aftershocks they kept going up and down and no one could get reliable information about Fukushima. The center of the tsunami was in Sendai and our data center was between Tokyo and Sendai, as was Fukushima. So we made the decision on the weekend to evacuate our data center and run it “lights out” because Fukushima was too dangerous. So, by Monday we were holding hourly Scrums between Redmond and Hyderabad, which were the two NOCs (network operations centers), basically 12 hours apart in time zone. We were moving all of the network services off of the Japanese data center remotely, including Hotmail which was considered quite critical by the Japanese government because they were telling everyone to stay indoors and use email and we have 10 million Japanese subscribers. That was the first point for me where I really truly understood this need to run like a utility, or better than a utility given the way Fukushima was unfolding.

I think in five years’ time the default question for any new project of scale will be “Why not public Cloud?” I think the “buy” versus “build” question will be different. So you will buy anything that is context as SAAS or rented and you will be very conscious of where your differentiators are that you want to build, what Forrester calls “systems of engagement” or Gartner calls “systems of innovation”. I think that there will be more reliance on those things being distinct.

Additional information from the earlier Transforming Software Development in a World of Services — Keynote by Sam Guckenheimer, April 3, 2014 [ALM Forum 2014, Seattle WA, April 1-3]: The new world of services enables a much tighter “build-measure-learn” loop so that you can deliver value in small increments and adjust rapidly to feedback. In order to do this, you need to make significant changes to your customer data collection and engineering processes to avoid any debt accumulation and to streamline delivery into production. This talk will look at lessons learned from the transformation of a traditional development organization to a cloud cadence and the practices for continual delivery of customer value.

… Presenters and Biographies — Sam Guckenheimer Product Owner Microsoft Visual Studio Microsoft

Sam Guckenheimer is the Product Owner for Microsoft Visual Studio. Sam is also the author of Software Engineering with Microsoft Visual Studio Team System [there is a newer edition of this: Visual Studio Team Foundation Server 2012: Adopting Agile Software Practices: From Backlog to Continuous Feedback (3rd Edition) (Microsoft Windows Development Series)]. He has 25 years experience as architect, developer, tester, product manager, project manager and general manager in the software industry in the US and Europe. Currently, Sam is the Group Product Planner for Microsoft Visual Studio Team System. In this capacity, he acts as chief customer advocate, responsible for the end-to-end external design of the next releases of these products. Prior to joining Microsoft in 2003, Sam was Director of Product Line Strategy at Rational Software Corporation, now the Rational Division of IBM. He holds five patents on software lifecycle tools. A frequent speaker at industry conferences, Sam is a Phi Beta Kappa graduate of Harvard University.

Now in China and coming to India: 4G LTE True Octa-core™ premium superphones based on 32-bit MediaTek MT6595 SoC with up to 20% more performance, and up to 20% less power consumption via its CorePilot™ technology

As the follow-up to: MediaTek is repositioning itself with the new MT6732 and MT6752 SoCs for the “super-mid market” just being born, plus new wearable technologies for wPANs and IoT are added for the new premium MT6595 SoC [this same blog, March 4-13, 2014] and ARM Cortex-A17, MediaTek MT6595 (devices: H2’CY14), 50 billion ARM powered chips [this same blog, Feb 19 – March 13, 2014]. Also the high-end strategic response to 32-bit Qualcomm Snapdragon 805 Processor (ARM TechCon 2014, Oct 1-3): “our latest and greatest”  [this same blog, Oct 11, 2014].

I. In China:

  • On sale since Sept 22 (on the Beijing market): Lenovo VIBE X2 [datasheet on Lenovo’s local site, Sept 24] is the world’s first layered smartphone. Uniquely crafted into three distinct layers, the VIBE X2 delivers MediaTek’s cutting-edge 4G LTE True Octa-core technology with its MT6595M* SoC, 5″ Retina IPS screen with 1920*1080 Full HD, brilliant dual cameras with 5MP front and 13MP rear cameras, and innovative click-on accessories. Weighing just 120g (without accessories). The first two layers are made up of its colorful cases and click-on accessories respectively, and the layer at the heart of the phone refers to the MT6595 encased within. Generally ¥ 1995 to ¥ 2499 [$326-$408], while the cheapest offer is at ¥ 1795 [$293]. It was announced on Sept 4 at the IFA 2014 show (see also the MTK press release) with a suggested retail price of $399, first to be available in China, and then in Asia Pacific, Eastern Europe and the Middle East (in countries where Lenovo smartphones are currently sold). It got a “Best of IFA 2014” award from Android Authority in the “Best Flagship” category, along with the Samsung Galaxy Note 4 phablet (with a 5.7″ screen and to be available first from Oct 17 in the U.S.).

MediaTek MT6595 and MT6595M*: the capability differences between the MT6595M and the MT6595 [i.e. Standard].
A T [i.e. Turbo] version will also be available later, with the big cores at up to 2.5GHz. There is no information yet on the speed increases of the LITTLE cores, the GPU or the ISP.
Note that instead of the earlier Cortex-A15, the 25-30% more efficient Cortex-A17 is used in the MT6595 versions, as it has no less than a 50-60% performance uplift over the Cortex-A9 (cycle for cycle):
– Cortex-A15: 15-stage integer / 17–25-stage floating-point pipeline, with an out-of-order, speculative-issue, 3-way superscalar execution pipeline
– Cortex-A9: designed around a high-efficiency, dual-issue superscalar, out-of-order, speculating dynamic-length pipeline (8–11 stages)
– Cortex-A17: … a fully out-of-order [speculative-issue superscalar] 11+ stage pipeline …

  • Oct 14 (orders made before October on www.meizu.com/en start shipping this week): The Meizu MX4 is a flagship featuring a metal body, packed with some of the latest hardware available on the market, including a large 5.4-inch display, a speedy processor (MT6595 [i.e. Standard]) and a 20-megapixel camera, for $449 (starting at ¥1799 [$298] suggested retail price in China). See also the Oct 2 Meizu MX4 Review on PhoneArena.

  • Much more is to come, e.g. the Nov 19 arrival of the ZOPO ZP999 Lion Heart (model name 小黑3X, i.e. Little Black 3X, in China), expected at a suggested retail price of ¥1999 [$326], with the MT6595M*

ZOPO ZP999 with MT6595

As with all other MediaTek True Octa-core SoCs, the MT6595 is capable of running on all eight cores concurrently and comes with MediaTek’s proprietary CorePilot™ technology, which delivers exceptional performance as well as battery and thermal efficiency. Its multimedia capabilities include H.265 Ultra HD (4K2K) video recording and playback, 480fps Super Slow Motion, ClearMotion™ and industry-leading face beautification technology.

MediaTek CorePilot™ Technology: MediaTek CorePilot™ technology is designed to deliver the maximum computing performance from big.LITTLE mobile SoC platforms with low power consumption, offering the ultimate combination of performance and power-efficiency.

MediaTek’s big.LITTLE approach (i.e. CorePilot) involves GPU compute as well, including the GPU alongside the CPU clusters.
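CorePilot itself is proprietary and its scheduler internals are not public. As a rough, hypothetical sketch of the heterogeneous multi-processing (HMP) idea it implements — all eight cores usable at once, with heavy tasks steered to the big cluster — the toy below uses the MT6595's 4+4 core topology, while the load model and threshold are illustrative assumptions, not MediaTek parameters:

```python
# Hypothetical sketch of heterogeneous multi-processing (HMP) scheduling,
# the general idea behind big.LITTLE schedulers such as MediaTek's
# proprietary CorePilot. Core counts match the MT6595 (4x Cortex-A17 +
# 4x Cortex-A7); the load threshold is an illustrative assumption.

BIG_CORES = 4         # Cortex-A17 cluster in the MT6595
LITTLE_CORES = 4      # Cortex-A7 cluster (overflow target in this toy)
LOAD_THRESHOLD = 0.6  # assumed cutoff: heavier tasks deserve a big core

def assign_tasks(task_loads):
    """Map normalized task loads (0.0-1.0) to the big or LITTLE cluster.

    Unlike simple cluster migration, HMP can keep all eight cores busy
    at once: heavy tasks land on big cores while light ones run on
    LITTLE cores, and any overflow spills onto the LITTLE cluster.
    """
    big, little = [], []
    # Heaviest tasks first, so the big cores host the hottest work.
    for load in sorted(task_loads, reverse=True):
        if load > LOAD_THRESHOLD and len(big) < BIG_CORES:
            big.append(load)
        else:
            little.append(load)
    return big, little

big, little = assign_tasks([0.9, 0.8, 0.7, 0.3, 0.2, 0.1])
print("big cores:   ", big)     # → [0.9, 0.8, 0.7]
print("LITTLE cores:", little)  # → [0.3, 0.2, 0.1]
```

A real scheduler additionally migrates tasks as their load changes and folds in thermal and GPU-compute decisions, which is where CorePilot claims its battery and thermal efficiency.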

The MT6595 is a natural stepping stone to the 64-bit MT6795 premium SoC on the MediaTek 4G smartphone product roadmap of July 7 (source: MediaTek MT6795 eight-core 64-bit LTE chip exposure):

MediaTek 4G smartphone product roadmap -- 7-July-2014

MediaTek 4G smartphone product roadmap of July 7. An August 19 update to that extends the Entry4G category.

II. In India:

What follows has just been added to the post Micromax is in a strategic alliance with operator Aircel and SoC vendor MediaTek for delivery of bundled complete solution offers almost equivalent to the cost of the device and providing innovative user experience [this same blog, March 21, 2014], as this strategic alliance is also aimed at the development of a local version of MediaTek’s new “super-mid market” approach:

MediaTek’s True Octa-core™ 4G LTE SoC MT6595 Now Commercialized in India [company press release, Oct 14, 2014]: Delivering a premium user experience for consumers in the Indian market by end of the year

MediaTek Inc., one of the largest fabless semiconductor companies in the world, today announced that its MT6595, the world’s first True Octa-core™ 4G LTE System on Chip (SoC), has reached commercialization in India.  Devices embedded with MT6595 are expected to arrive in the Indian market by the end of the year.

First announced in February 2014, the introduction of this premium chipset in the Indian market will accelerate the transition to 4G LTE in the country and create limitless opportunities for Indian device makers to manufacture high-performance handsets in the segment. MediaTek’s 4G LTE products allow consumers to embrace the improved speed of 4G LTE and parallel computing capability. MediaTek is looking forward to associating with Indian smartphone makers and leading the industry in delivering premium mobile user experiences.

“MediaTek is focused on delivering a full range of 4G LTE platforms, and the MT6595 is the first of many to arrive in India. Our 4G LTE solutions support all modes including TDD LTE, which is important for India because infrastructure providers in the country are currently focusing on the TDD LTE network,” said Dr. Finbarr Moynihan, General Manager – Corporate Sales International, of MediaTek.

“With a deep understanding of the Indian smartphone market, we continue to bring a premium mobile experience to consumers, and thus enable them to become, what we like to call, an Everyday Genius,” said Finbarr.

The MT6595 SoC promises a premium user experience with 4G connectivity, superior multi-tasking performance and excellent sustained performance-per-watt.

Underscoring its commitment to India, MediaTek recently announced its robust expansion plans for the Indian market with an investment of US$200 million. To boost its R&D capabilities in India, MediaTek has also launched a new facility in Bengaluru, which focuses on developing innovative solutions for wireless communications. The new Bengaluru facility, together with MediaTek’s Noida facilities, are instrumental to bolstering MediaTek’s R&D capabilities, allowing the company to establish presence in other core segments beyond mobile.

MediaTek says India its largest market on mobile after China [The Economic Times [of Times Group of India], Oct 14, 2014]:

NEW DELHI: MediaTek, the Taiwanese chipset maker, expects to ship about 350 million smartphone chipsets to vendors worldwide in 2014, with India one of the top five markets, a top executive said, adding that the company expects affordable 4G smartphones to be launched in the South Asian nation later next year.

“Very clearly, India is the largest market for MediaTek on mobile, outside of China,” said Finbarr Moynihan, general manager for international corporate sales at the chipset maker, which has a 30% share globally and shipped 220 million smartphone chipsets through December 2013.

MediaTek ranks India as its largest market for 3G growth in smartphones and 2G feature phones. As the 4G ecosystem develops in India, the company expects affordable, dual-band LTE smartphones in the second half next year, by when Reliance Jio Infocomm is expected to have launched its high-speed wireless Internet services.

“We will certainly see entry-level LTE smartphones at affordable price points with reduced feature sets, display, camera quality, less memory, the usual by the second half of next year,” Moynihan said.

LTE, or long-term evolution, is a technology that allows telecom companies to provide higher Internet speeds to consumers on their mobile phones. MediaTek, which competes with Qualcomm globally, on Monday launched its 4G LTE chipset compatible with both TDD and FDD modes, using 2300 MHz and 1800 MHz, respectively.

The company expects some manufacturers to launch devices using its new 4G LTE octa-core, multi-mode chipset by December, though on the premium end.

Moynihan added that India will benefit a lot from China, where the band and frequency structures are almost similar and sales of 4G LTE phones are only rising. This can drive economies of scale for India as carriers and handset makers move towards offering 4G smartphones in the mid- and low-price brackets, making the two countries the biggest markets for LTE growth for chipset makers, including Qualcomm and Intel.

“In terms of criticality, India will be the largest 3G market for us over the next two years as China progresses towards LTE,” Moynihan said.

At present, Bharti Airtel and Aircel offer 4G services in select areas. Reliance Jio Infocomm, which has 4G airwaves countrywide, is widely speculated to launch low-priced yet high-speed data services on mobile phones that support LTE.

“We are in discussions with them. They will likely be the major drivers for LTE in India,” Moynihan said when asked about partnering with Reliance Jio. “When they become mainstream, we hope to be a big part of their portfolio going forward,” he added.

MediaTek will work with Reliance Jio to pre-certify its chipsets and modem solutions to meet network or frequency requirements, a practice the company follows with global carriers, more so in markets driven by them. This allows handset makers to create smartphones that will work on the service providers’ frequencies and bands, including compatibility on both TDD and FDD LTE.

“We hope we can accomplish the appropriate testing, validation and certifications with Reliance and other carriers in India over the next couple of months or quarters. They (RJio) have some special requirements,” he added, without elaborating.

MediaTek’s 64-bit ARM Cortex-A53 octa-core SoC MT8752 is launched with 4G/LTE tablets in China

CUBE [Cool Rubik’s Cube] MT8752/MT8732-based T7/T8/T9 range of tablets -- 11-Oct-2014
Oct 9, 2014 (reports on several Chinese websites about the launch): 首款MT8752八核平板！[First MT8752 octa-core tablet!] 999元酷比魔方T7发布 [999-yuan Cool Rubik’s Cube T7 released]
Oct 11, 2014 on JD.com (Jingdong Mall): 酷比魔方 (CUBE) T7 7英寸平板电脑 [7-inch tablet computer] with MT8752 八核 [octa-core], JDI 视网膜屏 [Retina 1920*1200 screen], 64位 [64-bit], 联通/移动双4G [China Unicom / China Mobile dual 4G], 2.0GHz, 2G/16G, at ¥999.00 [$163]
Oct 11, 2014 in the ProductShow section of the 酷比魔方 (CUBE) 品牌网站 [brand website] (网站首页 [site home]): T7 – 酷比魔方 (CUBE) 品牌网站

8″ and 9″ tablets (T8 and T9) are to come later, as well as ones with the quad-core SoC variety, the MT8732. Their lead partner for that is Shenzhen Alldo Cube Technology and Science Co., Ltd., releasing its products under the 酷比魔方 (CUBE) [Cool Rubik’s Cube] brand. More information on this blog: MediaTek is repositioning itself with the new MT6732 and MT6752 SoCs for the “super-mid market” just being born, plus new wearable technologies for wPANs and IoT are added for the new premium MT6595 SoC [March 4-13, 2014]

This is MediaTek’s very first response to the 32-bit Qualcomm Snapdragon 805 Processor (ARM TechCon 2014, Oct 1-3): “our latest and greatest”. Regarding the MediaTek competitive edge over Qualcomm before that you can read on this blog:
– Qualcomm’s SoC business future is questioned first time [May 1, 2013]
– Eight-core MT6592 for superphones and big.LITTLE MT8135 for tablets implemented in 28nm HKMG are coming from MediaTek to further disrupt the operations of Qualcomm and Samsung [July 20, 2013 – March 15, 2014]
– MediaTek MT6592-based True Octa-core superphones are on the market to beat Qualcomm Snapdragon 800-based ones UPDATE: from $147+ in Q1 and $132+ in Q2 [Dec 22, 2013 – Jan 27, 2014]
– ARM Cortex-A17, MediaTek MT6595 (devices: H2’CY14), 50 billion ARM powered chips [Feb 18 – March 13, 2014]

 

Centaur Technology: Do the same job that an Intel processor can do, but doing it less expensively, with a much smaller group and Glenn Henry in charge

An October 11, 2014 teaser video (what might be behind see: Can VIA Technologies save the mobile computing future of the x86 (x64) legacy platform?). Glenn Henry on Wikipedia.

Their previous teaser was Coming very soon from Centaur Technology: A Leap Ahead in Chip Design [this same blog, Oct 9, 2014]

HP split into two–HP Enterprise and HP Inc. (devices and printers)–for the growth phase of its turnaround

HP share price -- Sept 2011 - Oct 2014

HP share price — Sept 2011 – Oct 2014. Meg Whitman was named CEO on September 22, 2011. As well as renewing focus on HP’s Research & Development division, Whitman’s major decision during her first year as CEO was to retain and recommit the firm to the PC business that her predecessor had announced he was considering discarding (see the August 2011 post on this blog). After such a “stabilization and foundation year”, on October 3, 2012 she announced an ambitious 5-year turnaround strategy that promised new products by FY14 and finally growth by 2015. This plan promised changes in HP’s four primary businesses: Enterprise Services got an entirely different operating model; likewise, the Enterprise Group planned to further utilize the cloud; the operating model of the Printing and Personal Systems Group was simplified by reducing its product line; and a new cloud-based consumption model was implemented for the Software Group. With the split now announced, Meg Whitman writes that “Hewlett-Packard Enterprise … will define the next generation of infrastructure, software, and services for the New Style of IT” while “HP Inc. will be extremely well-positioned to leverage its impressive portfolio and strong innovation pipeline across areas such as multi-function printing, Ink in the office, notebooks, mobile workstations, tablets and phablets, as well as 3-D printing and new computing experiences”. By separating into two they will “be able to accelerate the progress” they’ve made to date, “unlock additional value”, and “more aggressively go after the opportunities in front” of them.

The split also comes amid a total of 55,000 job cuts this year, with 45,000-50,000 cuts already done in Q2. CEO Meg Whitman (age 58) is enjoying huge bonus payments via those job cuts; she will then lead HP Enterprise as CEO, as well as become the non-executive Chairman of HP Inc.’s Board of Directors.

Detailed information on this blog about the new direction set for the Personal Systems Group part of HP Inc. (very little so far):

Latest news from HP Personal Systems Group:
– Revamped Z desktop and ZBook mobile workstations [Sept 10, 2014]
– HP Stream series of skinny Windows 8.1 laptops and tablets targeted for the holidays [Sept 29, 2014]
– HP 10 Plus 10.1-Inch 16 GB Android Full HD IPS Tablet with Allwinner A31 quadcore 1.0 GHz on Amazon and elsewhere for $280  [July 13, 2014]
– HP Slate 21-k100 – 21.5″ All-in-One Full HD IPS Android PC with NVIDIA Tegra 4 for $400 [Sept 28, 2014]; a 17″ version of it, the HP Slate 17, will be hitting stores by New Year

Note that such a large-screen All-in-One Full HD IPS strategy (for both desktop replacement and great home devices, with a completely flat tabletop mode for applications that may be multi-orientational) was started with the Windows 8-based HP ENVY Rove [June 23, 2013], using an Intel® Core™ i3-4010U and now selling for $980.

Detailed information on this blog about the new direction set up for HP Enterprise (quite extensive and deep):


* Note here that, as of now, Microsoft Windows Server is not available (even the upcoming Windows Server 10 for “the Future of the datacenter from Microsoft“) on the emerging 64-bit ARM platform. See: Intel: ARM Server Competition ‘Imminent,’ But Not Yet There, Says MKM [Barrons.com, Oct 2, 2014], in which the current state is characterized as:

ARM highlighted progress in servers by citing two data center end-customers (sharing the stage with Sandia Labs but not Paypal) that use HP blades for their Moonshot server chassis based on 64-bit Applied Micro (AMCC, NR, $6.90) and 32-bit Texas Instruments silicon.

HP Moonshot program and the 1st 64-bit ARM server (ARM TechCon 2014, Oct 1-3)

HP’s ARM-powered ProLiant m400 (Moonshot) is ready for DDR4 [ARM Connected Community, Oct 8, 2014]

AppliedMicro and Hewlett-Packard recently introduced the first commercially-available 64-bit ARMv8  server. Dubbed the ProLiant m400, the cartridge is specifically designed to fit HP’s Moonshot server framework. The new server – targeted at web caching workloads  – is based on AppliedMicro’s X-Gene System-on-a-Chip  (SoC) and runs Canonical’s versatile Ubuntu operating system.

… One of the key advantages of the X-Gene based m400? The doubling of addressable memory to 64GB per cartridge. … “You put 10 of these enclosures in a rack and you have 3,600 cores and 28 TB of memory to hook together to run a distributed application,” … “The m400 node burns about 55 watts with all of its components on the board, so a rack is in the neighborhood of 25 kilowatts across 450 nodes.” …

Loren Shalinsky, a Strategic Development Director at Rambus, points out that each ProLiant m400 cartridge is actually a fully contained server with its own dedicated memory, which, in the default launch version, carries a payload of DDR3L DIMMs.

“However, future generations of the cartridges can be upgraded from DDR3 to DDR4, without affecting the other cartridges in the rack. This should allow for even higher memory bandwidth and lower power consumption,” he added. “Our expectation is that DDR4 will ramp on the server side – both in terms of x86  and ARM – before finding its way into desktop PCs, laptops and consumer applications like digital TVs and set-top boxes.”

As we’ve previously discussed on Rambus Press , DDR4 memory delivers a 40-50 percent increase in bandwidth, along with a 35 percent reduction in power consumption compared to DDR3 memory, currently in servers. In addition, internal data transfers are faster with DDR4 , while in-memory applications such as databases – where a significant amount of processing takes place in DRAM – are expected to benefit as well.
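The rack-level figures quoted above are easy to sanity-check. The sketch below assumes 45 cartridges per Moonshot chassis and 10 chassis per rack (giving the quoted 450 nodes), with one 8-core X-Gene SoC and 64GB per cartridge, as the quotes state or imply:

```python
# Sanity check of the ProLiant m400 rack figures quoted above. Assumes
# 45 cartridges per Moonshot chassis and 10 chassis per rack (giving the
# quoted 450 nodes), with one 8-core X-Gene SoC and 64 GB per cartridge.

CARTRIDGES_PER_CHASSIS = 45
CHASSIS_PER_RACK = 10
CORES_PER_CARTRIDGE = 8       # one 8-core X-Gene SoC per m400 cartridge
GB_PER_CARTRIDGE = 64         # the doubled addressable memory per node
WATTS_PER_CARTRIDGE = 55      # "about 55 watts with all of its components"

nodes = CARTRIDGES_PER_CHASSIS * CHASSIS_PER_RACK    # 450 nodes
cores = nodes * CORES_PER_CARTRIDGE                  # 3,600 cores
memory_tb = nodes * GB_PER_CARTRIDGE / 1024          # ~28.1 TB
rack_kw = nodes * WATTS_PER_CARTRIDGE / 1000         # ~24.75 kW

print(f"{nodes} nodes, {cores} cores, {memory_tb:.1f} TB, {rack_kw:.2f} kW")
# → 450 nodes, 3600 cores, 28.1 TB, 24.75 kW
```

All four results match the "3,600 cores and 28 TB of memory … in the neighborhood of 25 kilowatts across 450 nodes" claim.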

Compare the above to what was written in Choosing chips for next-generation datacentres [ComputerWeekly.com, Sept 22, 2014]:

HP CEO Meg Whitman has high hopes for the company’s Moonshot low-energy server family as a differentiator in the commodity server market. Moonshot is based on Intel Atom and AMD Opteron system-on-a-chip (SoC) processors, optimised for desktop virtualisation and web content delivery applications. These servers can run Windows Server 2012 R2 or Red Hat, Canonical or Suse Linux distributions.

Semiconductor companies Cavium and Applied Micro are taking two different approaches to the ARM microserver market. Cavium is specialising in low-powered cores, while Applied Micro is taking a high-performance computing (HPC) approach.

AMD is building its chips based on the ARM Cortex-A57 core. … Servers with AMD’s Seattle [Opteron A-Series] ARM-based chip are not expected to ship until mid-2015.

Note here as well that AMD’s Seattle, i.e. Opteron A-Series, strategy also serves the company’s own dense server infrastructure strategy (going against HP’s Moonshot fabric solution), as described here earlier in the AMD’s dense server strategy of mixing next-gen x86 Opterons with 64-bit ARM Cortex-A57 based Opterons on the SeaMicro Freedom™ fabric to disrupt the 2014 datacenter market using open source software (so far) [Dec 31, 2013 – Jan 28, 2014] post.

“HP has supported ARM’s standardization effort since its inception, recognizing the benefits of an extensible platform with value-added features,” said Dong Wei, HP fellow. “With the new SBSA specification [Server Base System Architecture from ARM], we are able to establish a simplified baseline for deploying ARM-based solutions and look forward to future HP [server] products based on the ARM architecture.”

 

32-bit Qualcomm Snapdragon 805 Processor (ARM TechCon 2014, Oct 1-3): “our latest and greatest”

Only 3 devices based on the Snapdragon 805 SoC have been announced so far (since September): the Amazon Fire HDX 8.9 (to be released on Oct 21), the Samsung GALAXY Note Edge and the Samsung GALAXY Note 4 (release for both–17 model varieties–is scheduled to take place in 140 countries throughout the rest of October, after the 17th in the U.S., and into the month of November). In addition, Google Hopes Whale of a Phone Will Make Splash in Phablet Market with the 5.9″ Motorola (soon Lenovo) Nexus 6 (Shamu) sporting the Snapdragon 805 SoC and competing with Apple’s iPhone 6 Plus. It might be released in mid-October (October 15 or 16), quite probably together with the groundbreaking Android L.

New Krait 450 CPU cores (of Qualcomm’s own Krait microarchitecture for the ARMv7-A CPU instruction set architecture–ISA) and a new Adreno 420 GPU (of Qualcomm’s brand new Adreno 4xx GPU architecture), as well as the features explained below:
– Ultra HD 4K display with integrated Hollywood Quality Video (HQV) technology [0:50⇒]
– HEVC codec feature [2:02⇒]
More information: Snapdragon 805 Processor Product Brief of Sept 20, as well as AnandTech | Qualcomm Snapdragon 805 Performance Preview of May 21 (also giving details about the Adreno 4xx GPU architecture) plus The first wave of computational photography capabilities from Qualcomm for its new Snapdragon 805 SoCs [this same blog, Jan 4-12, 2014].

Note that the Snapdragon 805 SoC was announced 11 months ago (Nov 20, 2013) with:

sampling now and expected to be available in commercial devices by the first half of 2014.

The 4-month delay is quite explainable by the onslaught of high-end SoCs from aspiring competitors such as MediaTek (first and foremost, see more below), Rockchip, Allwinner etc., all using 3rd-party semiconductor IP for CPU and GPU cores. While pushing for the maximum attainable performance with Krait 450 and Adreno 420 by adding more time to development, Qualcomm itself was meanwhile forced to move to high-end 64-bit ARMv8-A ISA cores from ARM Holdings (Cortex-A57/A53 big.LITTLE) for its upcoming 2015 SoCs (Snapdragon 810 and Snapdragon 808) in order to remain competitive, as even with Krait 450 the DMIPS/MHz gain over the first Krait 200 core is marginal: 3.51 vs 3.3. So the Krait 450 is the end-of-the-road implementation of the original Krait microarchitecture (but Qualcomm might come out with a brand new microarchitecture of its own for ARMv8-A ISA cores in order to remain competitive from 2016 onward).
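That "marginal" per-clock gain can be quantified. The snippet below uses the DMIPS/MHz figures from the text; the clock speeds (1.5GHz for an early Krait 200 part, 2.7GHz for the Snapdragon 805's Krait 450) are assumed representative shipping values, so the total-throughput number is an estimate:

```python
# Quantifying the "marginal" per-clock gain of Krait 450 over Krait 200.
# DMIPS/MHz figures are from the text; the clock speeds (1.5 GHz for an
# early Krait 200 part, 2.7 GHz for the Snapdragon 805's Krait 450) are
# assumed representative values, not official per-SKU specifications.

krait_200 = {"dmips_per_mhz": 3.3,  "mhz": 1500}
krait_450 = {"dmips_per_mhz": 3.51, "mhz": 2700}

# Per-clock (microarchitectural) uplift: DMIPS/MHz ratio minus one.
ipc_gain = krait_450["dmips_per_mhz"] / krait_200["dmips_per_mhz"] - 1

# Total single-core throughput uplift: DMIPS/MHz times clock speed.
total_gain = (krait_450["dmips_per_mhz"] * krait_450["mhz"]) / (
    krait_200["dmips_per_mhz"] * krait_200["mhz"]) - 1

print(f"per-clock (DMIPS/MHz) gain: {ipc_gain:.1%}")   # → 6.4%
print(f"total throughput gain:      {total_gain:.1%}") # → 91.5%
```

In other words, nearly all of Krait's generational speedup came from clock and process scaling rather than from the microarchitecture itself, which supports the end-of-the-road reading above.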

Note as well that MediaTek will pose a direct challenge to Qualcomm in high-end 32/64-bit smartphone SoC space as per MediaTek May Narrow Qualcomm’s Lead in China’s 4G Market [EE|Times, Oct 1, 2014]:

MediaTek, Taiwan’s largest chip designer, has a chance to narrow Qualcomm’s lead in China’s 4G smartphone market with the launch of a new octo-core processor in the first quarter of 2015. MediaTek is sampling now the MT6795, a new 64-bit LTE True Octa-core SoC and will start selling the chip early next year, according to Joey Lee, a company spokesperson.

“The chip will provide Samsung Galaxy Notes-like performance at half the price,” Abrams [Randy Abrams, a Taipei-based analyst with investment bank Credit Suisse] said in a phone interview. “It’s for Chinese brands that want performance comparable to Galaxy Notes or the Apple iPhone at the equivalent of $300 to $400 retail for a handset.”

By the first quarter of next year, MediaTek’s MT6795 shipments are expected to reach 30 million units, giving MediaTek a chance to take the lead from Qualcomm, the Commercial Times report said, without citing the source of its information. Qualcomm has a 68% share of the global baseband chip business that was worth $5.2 billion in the second quarter of this year, according to Strategy Analytics.

The current MediaTek challenge for Qualcomm is MediaTek’s 64-bit ARM Cortex-A53 octa-core SoC MT8752 is launched with 4G/LTE tablets in China [this same blog, Oct 14, 2014]

Regarding the MediaTek competitive edge over Qualcomm before that you can read on this blog:
– Qualcomm’s SoC business future is questioned first time [May 1, 2013]
– Eight-core MT6592 for superphones and big.LITTLE MT8135 for tablets implemented in 28nm HKMG are coming from MediaTek to further disrupt the operations of Qualcomm and Samsung [July 20, 2013 – March 15, 2014]
– MediaTek MT6592-based True Octa-core superphones are on the market to beat Qualcomm Snapdragon 800-based ones UPDATE: from $147+ in Q1 and $132+ in Q2 [Dec 22, 2013 – Jan 27, 2014]
– ARM Cortex-A17, MediaTek MT6595 (devices: H2’CY14), 50 billion ARM powered chips [Feb 18 – March 13, 2014]
– MediaTek is repositioning itself with the new MT6732 and MT6752 SoCs for the “super-mid market” just being born, plus new wearable technologies for wPANs and IoT are added for the new premium MT6595 SoC [March 4-13, 2014]

The New HTC EYE Experience for HTC phones and the New HTC Desire EYE phone already embodying that

auto selfie [0:25⇒]; voice prompts, Voice Selfie [0:53⇒]; Split Capture [1:10⇒]; Crop-Me-In [1:55⇒]; face fusion [2:25⇒]; face tracking [3:00⇒]; screen sharing [4:17⇒]; the new HTC Desire EYE phone [5:12⇒] (for more see: HTC – Innovations | HTC Eye Experience | Selfie and Video Smartphone Software and HTC Desire EYE specs – Phone Arena). All that is just the start of the “Double Exposure” launch event of Oct 8. HTC EYE Experience will roll out to the following models (full feature list to be confirmed at roll-out): HTC One (M7), HTC One (M8), HTC One (E8), HTC One mini, HTC One mini 2, HTC One Remix, HTC One Max, HTC Desire 612, and HTC Desire 816 in the coming months.

HTC RE camera hands-on! (Android Central)

They took a look, on the announcement day (Oct 8, 2014), at the HTC RE camera, the first foray into the post-mobile world from the Taiwanese manufacturer. It’s a small handheld camera with a 16MP sensor that can shoot on its own, or connect to an Android device or iPhone for tethered shooting.

Coming very soon from Centaur Technology: A Leap Ahead in Chip Design

An October 8, 2014 teaser video (what might be behind see: Can VIA Technologies save the mobile computing future of the x86 (x64) legacy platform?)
