Azure Red Hat OpenShift, a Kubernetes service jointly managed by Microsoft and Red Hat, is now available

3:22pm, 7th May, 2019
(Red Hat Photo) Companies interested in using Kubernetes to help manage their containerized applications have a new option from Microsoft and Red Hat that should ease them into the notoriously complex world of Kubernetes. Azure Red Hat OpenShift is now generally available, the two companies plan to announce Tuesday at Red Hat’s annual conference in Boston. Fresh off his keynote appearance on Monday, Microsoft CEO Satya Nadella is expected to join Red Hat CEO Jim Whitehurst on stage to show off the latest installment in the partnership between the two companies.

OpenShift is Red Hat’s container-management software product, built around the open-source Docker and Kubernetes projects and designed for mainstream to late-adopter enterprise computing customers that want the benefits of containers without the hassle of managing them. It’s available both as a cloud service and for on-premises servers, but the launch of Azure Red Hat OpenShift represents a new direction for Red Hat, said Satish Balakrishnan, vice president of product management, in an interview with GeekWire.

“We’re actually creating more choice but also creating a new model, where we’re offering this managed offering via Microsoft Azure. It’s the first of its kind in the world, in terms of a jointly engineered, supported and operated OpenShift platform,” Balakrishnan said.

OpenShift is available on other public clouds, but Microsoft and Red Hat will jointly manage and support this service on Azure, and customers will be able to pay for it through a single unified bill from Azure under a revenue-sharing agreement.

“A lot of the customers who we’re talking to that are interested in containers and Kubernetes are coming from a Red Hat background,” said Gabe Monroy, partner program manager at Microsoft. “They’re looking for OpenShift to help them in their journey to a more cloud-native world.”

In general, many of those customers built applications around Red Hat Enterprise Linux (RHEL) a few years back and are looking for ways to modernize their infrastructure. They are one of the biggest driving forces behind the embrace of hybrid cloud strategies by cloud computing providers, who have recognized that lots of companies have applications they can’t or won’t move to the cloud and need products that bridge the gap between their data centers and new applications built on cloud services.

Red Hat engineers were given access to some of Microsoft Azure’s internal customer-support technology in order to make sure OpenShift would work as a jointly managed product, Monroy said. Red Hat and Microsoft also plan to jointly manage and support other Red Hat products on Azure, including RHEL, Ansible, and a combination of RHEL and SQL Server.

The announcement comes at what is expected to be the last conference Red Hat puts on as an independent company while it awaits regulatory approval of its acquisition by IBM. That approval is expected to come through in the second half of the year.
Microsoft may be all-in on cloud computing, but Azure reliability is lagging the competition

12:48pm, 7th May, 2019
A Microsoft data center in Cheyenne, Wyo. (Microsoft Photo) In an increasingly competitive market for cloud computing, reliability matters, and Microsoft has some work to do. Data compiled by the research firm Gartner shows a noticeable gap between Microsoft Azure and the other two big cloud providers when looking at cloud uptime in North America during 2018. According to Gartner, last year Amazon Web Services and Google had nearly identical uptime statistics for the virtual machines at the heart of cloud services — 99.9987 percent and 99.9982 percent, respectively — while Azure trailed by a small but significant amount, at 99.9792 percent.

“Azure has had significant downtime, not just in 2018, but even the first three months of 2019 have been not good for Microsoft,” said Raj Bala, an analyst with Gartner who compiled the data.

As Microsoft courts developers this week at Build with an array of new services, it has also been making changes behind the scenes to improve Azure reliability, said Mark Russinovich, Microsoft Azure CTO, in an interview this week with GeekWire. He plans to showcase a few of those improvements during his annual Azure architecture keynote on Wednesday, but he also defended the company’s track record when dealing with planned and unplanned disruptions to cloud service. “We’ve invested a ton in capabilities that allow us to do maintenance with little to zero impact on customers,” Russinovich said.

However, that didn’t help last week when a networking problem disconnected Azure services from customers and took out essential Microsoft services like Office 365 and Xbox Live, as well as websites such as the one you’re currently visiting.

Microsoft Azure CTO Mark Russinovich at the GeekWire Cloud Tech Summit. (GeekWire Photo / Kevin Lisota)

That problem was caused by two separate errors, and had either one of those errors happened by itself, we wouldn’t be having this discussion. As a result, Microsoft is putting additional procedures and safeguards into place in hopes of preventing this from happening again, Russinovich said. “When you do thousands of these and everything goes off fine, you’re like, the process works,” he said. “Obviously something like this shows us that there’s a gap, and we’re closing that gap.”

There were two major unplanned events that rocked Microsoft’s cloud services in North America during 2018. The disclosure of the Spectre and Meltdown processor vulnerabilities in 2017 forced all cloud providers to update their services in January 2018 with software mitigations that isolated cloud customers from those bugs, but Microsoft had to reboot everyone’s servers to put those changes into effect, and that takes time. And in September 2018, an incident at one of its data centers caused some cooling systems to fail, damaging servers that Microsoft then had to replace.

In the months following the Spectre reboot cycle, Microsoft began rolling out new live-migration capabilities that allow it to update servers running customer workloads with little to no disruption. Earlier this year it began rolling those features out across its network of data centers, and they’re now operating nearly everywhere, Russinovich said. But AWS and Google also needed to update their servers to add the patches for Spectre and Meltdown, and it didn’t appear to have as much of an impact on their service uptime. Google has long touted live-migration technology that can update servers with no disruption to customer workloads, while AWS talks far less about the technologies it uses to run its cloud service, which is very on brand for the market-share leader.

A Microsoft Azure data center. (Microsoft Photo)
Microsoft is also using machine-learning technology to do predictive analytics on its data center hardware, Russinovich said, in hopes of flagging components that are about to fail or underperform based on historical performance data. On Wednesday Russinovich plans to show off Project Tardigrade, a new Azure service named after the microscopic animals famous for surviving extreme conditions. The service detects hardware failures or memory leaks that are about to cause operating system crashes and freezes the affected virtual machines for a few seconds so their workloads can be moved to a fresh server.

The company is also continuing to roll out availability zones in its cloud computing regions around the world. Microsoft cloud executives rarely miss an opportunity to point out that they have the most regions of any cloud provider, but only within the last year has Microsoft started building availability zones — separate facilities within a region with independent power and cooling supplies — that help ensure availability in the event of a problem at one building in a region. Microsoft launched its first availability zones in its Iowa and Paris data centers, and has since rolled them out to several other regions in the U.S., Europe, and Asia. Cloud providers refer to regions and zones a little differently, but AWS and Google Cloud have had far more availability zones up and running for several years.

Operating cloud computing services at scale is one of the more amazing things human beings have accomplished; the complexity involved is hard to appreciate without a fair amount of knowledge about how these systems work. And even if Microsoft lags AWS and Google in reliability scoring, unless your company is blessed with world-class operations talent, Microsoft is likely still better at operating data centers than most companies managing their own servers. But turning over control of your most critical business applications to a third-party provider still requires a leap of faith. As cloud companies fight tooth and nail for the next generation of large enterprise customers considering a move to the cloud, uptime numbers will become more and more important.
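To put the Gartner figures cited above in context, it helps to convert the uptime percentages into rough downtime per year. The quick sketch below uses only the percentages from this story; the arithmetic and the 365-day year are an illustration, not Gartner's methodology.

```python
# Rough back-of-the-envelope conversion of the cited uptime percentages
# into approximate annual downtime. Only the percentages come from the
# article; everything else here is illustrative.

MINUTES_PER_YEAR = 365 * 24 * 60

uptime = {
    "AWS": 99.9987,
    "Google Cloud": 99.9982,
    "Microsoft Azure": 99.9792,
}

for provider, pct in uptime.items():
    downtime_minutes = MINUTES_PER_YEAR * (1 - pct / 100)
    print(f"{provider}: ~{downtime_minutes:.0f} minutes of downtime per year")
```

By that rough math, 99.9987 percent works out to about seven minutes of downtime over a year, while 99.9792 percent is closer to an hour and 50 minutes.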
Microsoft rolls out new performance-scaling features for Azure databases, including serverless Azure SQL

3:39pm, 6th May, 2019
Microsoft’s Scott Guthrie, executive vice president of its Cloud and AI Group, speaks at Microsoft Build 2019. (GeekWire Photo / Kevin Lisota) Cloud customers are getting used to the idea of paying only for what they use while enjoying on-demand performance scaling from their cloud services, and Microsoft added those capabilities to a few of its database products Monday at Microsoft Build. Both Azure SQL Database and Azure Database for PostgreSQL can now use a technology called Hyperscale, which allows customers to build applications knowing they’ll be able to scale database resources in response to a surge in demand. Azure SQL Database customers will also be able to take advantage of a new serverless pricing tier.

Azure Database for PostgreSQL Hyperscale (say that 10 times fast) is one of the first dividends from Microsoft’s acquisition of Citus Data. Citus’ technology will be implemented in the Hyperscale version of the database to allow it to scale horizontally across nodes, which is ideal for “workload scenarios that require ingesting and querying data in real-time, with sub-second response times, at any scale – even with billions of rows,” according to Rohan Kumar, corporate vice president for Azure Data.

(Left to right:) Sudhakar Sannakkayala, General Manager, Open Source Relational Databases, Microsoft; Ozgun Erdogan, CTO and Co-Founder, Citus Data; Umur Cubukcu, CEO and Co-Founder, Citus Data; Sumedh Pathak, VP of Engineering and Co-Founder, Citus Data; Rohan Kumar, Corporate Vice President, Microsoft Azure Data (Microsoft Photo)

The PostgreSQL Hyperscale version will be available in preview, while Azure SQL Database customers can get started with their Hyperscale version right away. The Azure SQL version allows developers to use a database that scales compute and storage resources as needed, and it also improves how quickly customers can restore an Azure SQL database. And for Azure SQL Database customers that need to scale compute and memory on demand, Microsoft rolled out a new serverless pricing option that lets developers pay by the second for their usage. Serverless approaches have caught on as customers look to shed the additional burden of configuring cloud infrastructure after already getting rid of the need to manage hardware by moving to the cloud.

Database technology is an extremely competitive aspect of both cloud and on-premises enterprise technology, and for good reason, given the importance of data and the complexity of storing data under the right conditions for an application. Last week, ahead of Build, Microsoft announced a new database product called Azure SQL Database Edge.
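For readers curious what the Citus approach looks like in practice, here is a minimal sketch of sharding a table across worker nodes by a distribution column, the core idea behind the horizontal scaling described above. The table name, columns, and connection string are hypothetical placeholders, and `create_distributed_table()` is the function provided by the open-source Citus extension, not a new Azure API.

```python
# Minimal sketch of Citus-style horizontal scaling (the technology behind
# Hyperscale for Azure Database for PostgreSQL). All names and connection
# details below are hypothetical placeholders for illustration.
import psycopg2

conn = psycopg2.connect(
    "host=<coordinator-host> dbname=citus user=citus password=<password> sslmode=require"
)
conn.autocommit = True

with conn.cursor() as cur:
    # An ordinary PostgreSQL table to start with.
    cur.execute("""
        CREATE TABLE IF NOT EXISTS events (
            tenant_id  bigint NOT NULL,
            event_id   bigserial,
            payload    jsonb,
            created_at timestamptz DEFAULT now(),
            PRIMARY KEY (tenant_id, event_id)
        )
    """)
    # Shard the table across worker nodes by tenant_id, so ingest and
    # queries over billions of rows can be spread across many machines.
    cur.execute("SELECT create_distributed_table('events', 'tenant_id')")

conn.close()
```

Queries that filter on the distribution column can then be routed to individual shards, which is the mechanism behind the sub-second responses over billions of rows that Kumar describes.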
Microsoft Azure recovering from major networking-related outage that took out Office 365, Xbox Live, and other services

5:20pm, 2nd May, 2019
A look inside a Microsoft data center in Cheyenne, Wyo. (Microsoft Photo) An outage that lasted more than an hour took out a host of Microsoft cloud services Thursday afternoon, as networking connectivity errors in Microsoft Azure also took down third-party apps and sites running on Microsoft’s cloud. Beginning around 1:20pm and stretching for more than an hour, the outage appeared to span the breadth and depth of Microsoft’s cloud services, including Office 365, Microsoft Teams, Xbox Live, and several others used by Microsoft’s commercial customers. The services began to recover around 2:30pm, and Microsoft warned that it could take some time to get everyone back up and running.

⚠️ Engineers are currently investigating DNS resolution issues affecting network connectivity to Azure services. More information will be provided as it becomes available. — Azure Support (@AzureSupport)

Microsoft representatives did not immediately respond to a request for comment on what happened. We’ll update this post as more information becomes available.
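Outages like this one tend to surface to applications as failed name lookups rather than slow responses. As a rough illustration only (this is not Microsoft's tooling, and the hostname is an arbitrary example), a simple client-side probe can distinguish DNS resolution failures from ordinary connection errors:

```python
# Tiny illustrative probe that separates DNS-resolution failures (the kind
# of issue described in the Azure status tweet above) from other
# connectivity problems. The hostname is an arbitrary example.
import socket

def check_endpoint(hostname: str, port: int = 443, timeout: float = 5.0) -> str:
    try:
        addrinfo = socket.getaddrinfo(hostname, port, type=socket.SOCK_STREAM)
    except socket.gaierror as exc:
        return f"DNS resolution failed for {hostname}: {exc}"
    address = addrinfo[0][4]
    try:
        with socket.create_connection(address[:2], timeout=timeout):
            return f"{hostname} resolved to {address[0]} and accepted a TCP connection"
    except OSError as exc:
        return f"{hostname} resolved but the connection failed: {exc}"

if __name__ == "__main__":
    print(check_endpoint("example.com"))
```

A probe like this, run from a few locations, is one way operations teams confirm whether an incident is a DNS problem or something further down the stack.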
Microsoft and VMware unveil new cloud service that lets VMware customers run their apps on Azure

2:04pm, 29th April, 2019
A look inside a Microsoft data center in Cheyenne, Wyo. (Microsoft Photo) Years after VMware cut a ground-breaking cloud deal with Amazon Web Services that helped make the hybrid cloud a reality, it has struck a similar partnership with Microsoft. Companies running applications built for VMware’s virtualized data center software will be able to run those same apps on Microsoft Azure, the two companies announced Monday.

The new service is generally available and promises to “enable customers to extend and redeploy their VMware workloads natively on Azure dedicated hardware without having to refactor their applications,” wrote Microsoft’s Scott Guthrie, executive vice president of its Cloud and AI Group.

The initial rush to public cloud computing implied that all workloads would swiftly move into the data centers of cloud providers, but that hasn’t exactly worked out in practice. Cloud sales teams have accepted that lots of on-premises applications developed years ago around VMware’s technology are unsuitable for cloud environments, and they have begun working closely with traditional providers of data center hardware and software to create hybrid cloud products that bridge the gap between the two worlds.

So while VMware once considered cloud providers an existential threat, the fruits of its partnership with AWS have changed its tune. Over the last several years, that partnership has produced several products that let mutual customers of the two companies manage their applications across their own data centers and AWS, and the two companies regularly appear together at industry events to showcase their work. Now Microsoft has a product for those customers that should help blunt AWS’s advantage with hybrid cloud customers that use VMware’s software to manage critical applications.

Microsoft also offers hybrid cloud customers a product and service called Azure Stack, which allows companies that are updating their on-premises infrastructure to purchase hardware with software support for Azure services.
Azure revenue remains a mystery, but cloud services continue to drive Microsoft forward

3:58pm, 24th April, 2019
Microsoft CEO Satya Nadella. (GeekWire Photo / Nat Levy) One of these days, Microsoft will decide that it needs to break out the amount of revenue it is recording from sales of its Azure cloud computing services. Today was not that day.

For the quarter, which ended on March 31st, Microsoft’s Intelligent Cloud division recorded revenue of $9.7 billion, a 22 percent improvement compared to the same quarter last year. That division, which includes sales of Azure, Windows Server, and Enterprise Services, remains the fastest-growing part of Microsoft overall. Azure revenue increased 73 percent, according to Microsoft, but the actual numbers that would allow analysts to make better estimates of its performance against Amazon Web Services remain elusive. Still, there’s no dispute that Azure is the second-most-used cloud infrastructure service, ahead of Google, Alibaba, IBM, and a few others.

Another closely watched cloud metric, commercial cloud revenue, increased 41 percent compared to the previous year to $9.6 billion. Commercial cloud revenue includes Azure and several segments of Microsoft’s Productivity and Business Processes division, such as commercial Office 365 and Dynamics 365.

Microsoft will have a conference call later on Wednesday to discuss its quarter, and we’ll update this post with additional information.
Microsoft’s South African data centers are now open for Azure business

4:18am, 6th March, 2019
A Microsoft data center in Cheyenne, Wyo. (Microsoft Photo) Microsoft is ready to start serving Africa with local data centers, bringing its previously announced South African data centers online Wednesday. Microsoft had planned to bring the facilities online in 2018, but it is still the first of the major cloud providers to provide local service to Africa. IBM operates data centers in South Africa but doesn’t provide nearly the breadth and depth of cloud computing services that the other companies do, while Amazon Web Services plans to bring a South African data center online in 2020.

Location matters on the modern internet, as more and more real-time applications arrive and the demands cloud customers place on infrastructure continue to grow. Processing power and networking capacity have never been greater, but the speed of light still governs internet communications, which means that end-user proximity is extremely important when it comes to serving local cloud customers.

Microsoft said it expects demand for cloud computing services in Africa to triple over the next few years, and local cloud availability could spur a startup boom given the experimental possibilities afforded by cloud computing services. Cloud computing allowed enormous U.S. tech startups like Pinterest and Airbnb to grow over the past decade without having to spend capital on acquiring and maintaining servers to power fledgling applications. Office 365 and Dynamics 365 will also be available out of the South African data centers later this year, Microsoft said.
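The speed-of-light point is easy to quantify. The sketch below estimates best-case round-trip propagation delay over fiber; the distances are rough, illustrative figures, and real-world latency is higher once routing, switching, and protocol overhead are added.

```python
# Rough illustration of why data center proximity matters: best-case
# round-trip propagation delay over fiber, where light travels at roughly
# two-thirds of its vacuum speed (~200,000 km/s). Distances are
# approximations chosen for illustration only.

SPEED_IN_FIBER_KM_PER_MS = 200.0  # ~200,000 km/s expressed per millisecond

routes_km = {
    "Johannesburg <-> local South African region": 60,
    "Johannesburg <-> Western Europe": 9_000,
}

for route, distance_km in routes_km.items():
    round_trip_ms = 2 * distance_km / SPEED_IN_FIBER_KM_PER_MS
    print(f"{route}: best-case round trip ~{round_trip_ms:.1f} ms")
```

Even in the best case, a round trip between South Africa and Europe costs tens of milliseconds before a server does any work at all, which is the gap local data centers are meant to close.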
Microsoft unveils new Azure Sentinel cloud security service to help manage threat detection and analysis

9:07am, 28th February, 2019
A look inside Microsoft’s Cyber Defense Operations Center. (Microsoft Photo) Ahead of next week’s big RSA security conference, Microsoft plans to introduce a new cloud service Thursday that will help customers manage their security efforts and also give them a way to tap into its world-class security talent. Azure Sentinel is a cloud-based SIEM, or security information and event management (security folks like acronyms too), service that allows customers to view and respond to security alerts and threats across their corporate networks. Microsoft appears to be targeting this service at companies that are running SIEM software on their own servers and looking to modernize their approach, but it will also cover applications running across multiple public clouds and hybrid clouds, company executives said.

Security is hard. Threats grow every day as more and more applications flood the web, and software designed to flag potential malware or suspicious activity can overwhelm users with alerts that need to be investigated and dealt with. There is also a persistent shortage of skilled security professionals, which has led to demand for artificial-intelligence-based security services that reduce manual labor, and to a concentration of security talent at big companies like Microsoft, Amazon, and Google.

With Azure Sentinel, “customers are able to automate 80 percent of the most common tasks defenders spend their time on today,” said Ann Johnson, corporate vice president for cybersecurity solutions at Microsoft, in a briefing ahead of the announcement. Companies are also embracing more complex infrastructure strategies, with applications running on public clouds like Azure and on their own self-managed servers. Older products designed for the data center era can’t necessarily handle that complexity, said Microsoft’s Steve Dispensa, partner director for product management, security.

All the major cloud vendors pay very close attention to their own security efforts while also introducing products for customers to help them manage the parts of cloud security for which the customer is responsible. One primary argument for cloud computing is that these companies are much better at security than your company probably is, and Microsoft will also introduce a new feature Thursday that lets users submit detailed questions to Microsoft security staff. It’s called Microsoft Threat Experts, and it allows Windows Defender ATP customers to hit an “ask the expert” button on their dashboards to send a question to Microsoft alongside application or network data.
Microsoft unveils next-generation HoloLens headset and $399 ‘Azure Kinect’ camera for developers

12:47pm, 24th February, 2019
Microsoft Technical Fellow Alex Kipman. (Screenshot via YouTube) Microsoft today unveiled the second generation of its HoloLens mixed reality headset, along with a surprise cloud-powered camera built on technology created originally for Xbox.

The new HoloLens improves in three areas: more immersion, more comfort and greater simplicity right out of the box. One of the big knocks on the first HoloLens was a narrow field of view, but the new version has doubled that. Alex Kipman, Microsoft technical fellow and HoloLens creator, said the new device has tripled the comfort of the original, making it easier for people in areas like construction and manufacturing to wear the headset for hours at a time. HoloLens 2 supports more physical commands than the original: the device recognizes the user’s hands, and it lets users touch virtual images and interact with them in several different ways. HoloLens 2 launches with several kits to make it easier for large enterprises to build out mixed reality applications.

The new $399 Azure Kinect camera is available for pre-order today. Microsoft showed off a variety of partners using the device in areas like retail and medicine. Microsoft’s presentation today at MWC, one of the most important gatherings for the mobile technology industry, is streaming on YouTube now. We’ll update this post as more details emerge.

The announcement of a new HoloLens was not exactly a surprise. Kipman had teased an announcement timed to Microsoft’s presentation in Barcelona that numerous reports linked to HoloLens 2. Then last night, images believed to show the device contrasted with the original model leaked online.

Reports of a second-generation HoloLens have persisted since mid-2018. The original device shipped in 2016 and was targeted at developers and large companies. Microsoft has yet to release a consumer version of the headset — which costs $3,000 for a developer edition and $5,000 for a commercial suite. Microsoft has targeted a number of areas for the device, from medicine to construction to retail and service workers.

One of the biggest HoloLens customers, if not the biggest, has created some internal strife at Microsoft. A group of employees this week called on the company to cancel a $480 million contract to outfit the U.S. Army with 100,000 HoloLens headsets, saying they don’t want to be “implicated as war profiteers.”
Azure CTO Mark Russinovich touts security during deep dive into the tech behind Microsoft’s cloud

3:30pm, 9th May, 2018
A Microsoft data center in Amsterdam, where land is being cleared for additional facilities. (Microsoft Photo) Build 2018 attendees got a peek behind the Azure curtain Wednesday from Azure Chief Technology Officer Mark Russinovich, who also announced that new security technologies from Intel are now available for customers as part of the Azure Confidential Computing program.

Microsoft introduced the program as a way to assure customers that critical cloud data would be protected at all times by hardware-level technologies that Microsoft’s servers can’t access. Russinovich announced that processors running Intel’s SGX technology are now available in the East US region, and a new group of virtual machines running on those processors is also available. The Intel processors create what’s known as “trusted execution environments,” a place on the chip where data can be processed without being exposed to the broader network. It’s a smart concept, and it’s one of the reasons why so many computing professionals were so freaked out by the Spectre and Meltdown vulnerabilities; those types of attacks threatened to expose data that was supposed to be isolated. Thanks to the mitigations put in place by Intel and cloud vendors, most cloud customers should be fine, and your data is definitely safer inside a trusted execution environment than outside one.

Russinovich provided an example of a potential use case for this technology: health care providers have a lot of patient data that could be used by machine-learning algorithms to unlock new treatments or discover causes of certain diseases, but they are either prohibited by law or internal policy from sharing that data outside their organizations. Sharing that data through Azure Confidential Computing could satisfy those policies by preventing other organizations from seeing one group’s data.

Russinovich, who you’ll be able to see at our June 27th event in Bellevue, covered a lot of other topics during a 75-minute presentation to developers at Build. He walked attendees through some basic characteristics of Microsoft’s data centers, now present in 50 regions around the world.

A look inside a Microsoft data center in Cheyenne, Wyo. (Microsoft Photo)

Microsoft has reached the point where 50 percent of the energy it uses to power these data centers comes from renewable sources like wind and solar, he said. The company hopes to bump that up to 60 percent by the end of the decade, and of course the eventual goal is to source 100 percent of its energy from renewable sources. Amazon Web Services said last year it was shooting for the 50 percent mark by the end of 2017, while Google says it has offset 100 percent of its energy usage through purchases of clean power. The company is also working to make its data centers more efficient on the demand side of the energy world, researching new types of fuel cells that can help improve power consumption and overall reliability. “One of the things we realized as we looked at (data center designs) is that utility power is not that reliable,” he said.

Like almost all cloud vendors, Microsoft builds its own servers to run these data centers, and Russinovich shared a little more information on the progress it has made with its server designs. The company recently created a new server architecture, dubbed “Beast” inside the company, designed to handle memory-intensive workloads like SAP’s HANA database with a whopping four terabytes of memory available.
Most Azure users don’t need that type of performance, but Russinovich observed something interesting about modern data center design: after years of “scale-out,” adding vast quantities of relatively cheap servers to a network to improve performance, Microsoft is finding increasing uses for more traditional “scale-up” systems, where it makes more sense to add more powerful components to servers to increase performance.

Russinovich closed his talk by recapping Microsoft’s quantum computing strategy, which, like those of most other companies researching the topic, is still pretty far out in the future. But the company is “investing a huge amount of money in this; it’s kind of a moonshot project for us,” he said. “This is on the verge of being real and practical.”