
Azure.Source – Volume 75


Preview | Generally available | News & updates | GTC 2019 | Technical content | Azure shows | Events | Customers, partners, and industries

Now in preview

Windows Virtual Desktop now in public preview on Azure

The public preview of the Windows Virtual Desktop service is now available on Azure. Customers can now access the only service that delivers simplified management, multi-session Windows 10, optimizations for Office 365 ProPlus, and support for Windows Server Remote Desktop Services (RDS) desktops and apps. With Windows Virtual Desktop, you can deploy and scale your Windows desktops and apps on Azure in minutes, while enjoying built-in security and compliance. Access to Windows Virtual Desktop is available through applicable RDS and Windows Enterprise licenses.

Thumbnail from What is Windows Virtual Desktop? by Microsoft Mechanics

Azure Data Studio: An Open Source GUI Editor for Postgres

Support for PostgreSQL in Azure Data Studio is now available in preview. Azure Data Studio is a cross-platform modern editor focused on data development, available for Linux, macOS, and Windows. We're also introducing a corresponding preview PostgreSQL extension in Visual Studio Code (VS Code). Both Azure Data Studio and Visual Studio Code are open source and extensible - two qualities that PostgreSQL itself is built on. If your primary use case is data, choose Azure Data Studio to manage multiple database connections, explore database object hierarchy, set up dashboards, and more.

Azure Container Registry virtual network and Firewall rules preview support

Azure Container Registry (ACR) now supports limiting public endpoint access. Customers can now limit registry access to within an Azure Virtual Network (VNet), as well as whitelist IP addresses and ranges for on-premises services. VNet and firewall rules are supported with virtual machines (VMs) and Azure Kubernetes Service (AKS), and are available for public preview in all 25 public cloud regions. General availability (GA) will be scheduled based on usage and feedback.
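As a rough sketch of how such rules are configured, the CLI flow looks something like the following. The registry name and IP range are placeholders, the exact command surface of the preview may differ, and the commands are echoed as a dry run so the sketch is self-contained offline:

```shell
# Dry-run sketch of restricting a registry's public endpoint (hypothetical
# registry name and IP range; the preview CLI surface may differ).
# Deny public access by default, then allow a specific on-premises range.
echo az acr update --name myregistry --default-action Deny
echo az acr network-rule add --name myregistry --ip-address 203.0.113.0/24
```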

Also available in preview

Now generally available

Azure Backup for SQL Server in Azure Virtual Machines now generally available

Azure Backup for SQL Server Virtual Machines (VMs) is now generally available. This enterprise-scale, zero-infrastructure solution eliminates the need to deploy and manage backup infrastructure while providing a simple and consistent experience for centrally managing and monitoring backups on standalone SQL instances and Always On Availability Groups. Built into Azure, the solution combines the core cloud promises of simplicity, scalability, security, and cost effectiveness with inherent SQL backup capabilities that are leveraged by using native APIs to yield high-fidelity backups and restores.

Thumbnail from How to back up SQL Server running in Azure VMs with Azure Backup

Also generally available

News and updates

Microsoft’s Azure Cosmos DB is named a leader in the Forrester Wave: Big Data NoSQL

Forrester has named Microsoft a Leader in The Forrester Wave™: Big Data NoSQL, Q1 2019, based on its evaluation of Azure Cosmos DB, validating the service's exceptional market momentum and customer satisfaction. According to Forrester, “half of global data and analytics technology decision makers have either implemented or are implementing NoSQL platforms, taking advantage of the benefits of a flexible database that serves a broad range of use cases.” We are committed to making Azure Cosmos DB the best globally distributed database for all businesses and modern applications. With Azure Cosmos DB, we believe that you will be able to write amazingly powerful, intelligent, modern apps and transform the world.

Marketing graphic for Azure Cosmos DB

March 2019 changes to Azure Monitor Availability Testing

Azure Monitor Availability Testing allows you to monitor the availability and responsiveness of any HTTP or HTTPS endpoint that is accessible from the public internet. At the end of this month we are deploying some major changes to improve performance and reliability, as well as to allow us to make more improvements for the future. This post highlights and describes some of the changes needed to ensure that your tests continue running without any interruption.

Data integration with ADLS Gen2 and Azure Data Explorer using Data Factory

Introducing the latest integration in Azure Data Factory. Azure Data Lake Storage Gen2 is a data lake platform that combines advanced data lake capabilities with the economics, global scale, and enterprise-grade security of Azure Blob Storage. Azure Data Explorer is a fast, fully managed data analytics service for real-time analysis of large volumes of data. Azure Data Factory is the fully managed data integration service used to operationalize and manage ETL/ELT flows with flexible control flow, rich monitoring, and continuous integration and continuous delivery (CI/CD) capabilities. Together, they let you meet the advanced needs of your analytics workloads with unmatched price performance and the security of one of the best clouds for analytics.

Additional news and updates

News from NVIDIA GPU Technology Conference

Over the years, Microsoft and NVIDIA have helped customers run demanding applications on GPUs in the cloud. Last week at NVIDIA GPU Technology Conference 2019 (GTC 2019) in San Jose, Microsoft made several announcements on our collaboration with NVIDIA to help developers and data scientists deliver innovation faster.

Microsoft and NVIDIA extend video analytics to the intelligent edge

Microsoft and NVIDIA are partnering on a new approach for intelligent video analytics at the edge to transform raw, high-bandwidth videos into lightweight telemetry. This delivers real-time performance and reduces compute costs for users. With this latest collaboration, NVIDIA DeepStream and Azure IoT Edge extend the AI-enhanced video analytics pipeline to where footage is captured, securely and at scale. Now, our customers can get the best of both worlds—accelerated video analytics at the edge with NVIDIA GPUs and secure connectivity and powerful device management with Azure IoT Edge and Azure IoT Hub.

Edge appliance and Azure cloud diagram

Microsoft and NVIDIA bring GPU-accelerated machine learning to more developers

GPUs have become an indispensable tool for doing machine learning (ML) at scale. Our collaboration with NVIDIA marks another milestone in our venture to help developers and data scientists deliver innovation faster. We are committed to accelerating the productivity of all machine learning practitioners regardless of their choice of framework, tool, and application. Two of the integrations that Microsoft and NVIDIA have built together to unlock industry-leading GPU acceleration for more developers and data scientists are covered in the next two posts.

Azure Machine Learning service now supports NVIDIA’s RAPIDS

Azure Machine Learning service is the first major cloud ML service to support NVIDIA’s RAPIDS, a suite of software libraries for accelerating traditional machine learning pipelines with NVIDIA GPUs. With RAPIDS on Azure Machine Learning service, users can accelerate the entire machine learning pipeline, including data processing, training and inferencing, with GPUs from the NC_v3, NC_v2, ND or ND_v2 families. Azure Machine Learning service users are able to use RAPIDS in the same way they currently use other machine learning frameworks, and can use RAPIDS in conjunction with Pandas, Scikit-learn, PyTorch, TensorFlow, etc.

ONNX Runtime integration with NVIDIA TensorRT in preview

Announcing the open-source preview of the NVIDIA TensorRT execution provider in ONNX Runtime. This takes another step towards open and interoperable AI by enabling developers to easily leverage industry-leading GPU acceleration regardless of their choice of framework. Developers can now tap into the power of TensorRT through ONNX Runtime to accelerate inferencing of ONNX models, which can be exported or converted from PyTorch, TensorFlow, and many other popular frameworks.

Technical content

Reducing security alert fatigue using machine learning in Azure Sentinel

Alert fatigue is real. Security analysts face a huge burden of triage as they not only have to sift through a sea of alerts, but to also correlate alerts from different products manually. Machine learning (ML) in Azure Sentinel is built-in right from the beginning and focuses on reducing alert fatigue while offering ML toolkits tailored to the security community; including ML innovations aimed at making security analysts, security data scientists, and engineers productive.

Screenshot of Fusion and two composite alerts

Breaking the wall between data scientists and app developers with Azure DevOps

Data scientists are used to developing and training machine learning models for their favorite Python notebook or an integrated development environment (IDE). The app developer is focused on the application lifecycle – building, maintaining, and continuously updating the larger business application. As AI is infused into more business-critical applications, it is increasingly clear that we need to collaborate closely to build and deploy AI-powered applications more efficiently. Together, Azure Machine Learning and Azure DevOps enable data scientists and app developers to collaborate more efficiently while continuing to use the tools and languages that are already familiar and comfortable.

Step-By-Step: Getting Started with Azure Machine Learning

In this comprehensive guide, Anthony Bartolo explains how to set up a prediction model using Azure Machine Learning Studio. The example comes from a real-life hackfest with Toyota Canada and predicts the pricing of vehicles.

Intro to Azure Container Instances

Aaron Powell covers a really simple approach to running containers with Azure Container Instances in this handy introductory guide. It walks through a hello-world demo and then some advanced scenarios using ACR and connecting to Azure resources (such as a SQL server).

Getting started with Azure Monitor Dynamic Thresholds

This overview from Sonia Cuff discusses the new Dynamic Thresholds capability in Azure Monitor, where machine learning sets the alert threshold levels when you are monitoring metrics (e.g., CPU percentage use in a VM or HTTP request time in an application).

How to Deploy a Static Website Into Azure Blob Storage with Azure DevOps Pipeline

In this third video and blog post of Frank Boucher's CI/CD series, he creates a release pipeline and explains how to deploy an ARM template to create or update Azure resources and deploy a static website into blob storage. The series explains how to build a continuous integration and continuous deployment system using Azure DevOps Pipelines to deploy a static website into Azure Blob Storage.

Thumbnail from How to Deploy a Static Website Into Azure Blob Storage with Azure DevOps Pipeline - part 3

Fixing Azure Functions and Azure Storage Emulator 5.8 issue

If you happen to run into an error after updating Azure Functions to the latest version, Maxime Rouille's hotfix is for you. He not only explains what to do, but why this might happen and what his hot fix does.

Using Key Vault with Your Mobile App the Right Way

You know you need to keep your app’s secrets safe and follow best practices - but how? In this post, Matt Soucoup uses a Xamarin app to walk through how and why to use Azure Key Vault, Active Directory, and Azure Functions to keep application secrets off of your mobile app and in Key Vault - without tons of extra work for you.

How do You Structure Your Code When Moving Your API from Express to Serverless Functions?

There are a lot of articles showing how to use serverless functions for a variety of purposes. A lot of them cover how to get started, and they are very useful. But what do you do when you want to organize them a bit more as you do for your Node.js Express APIs? There's a lot to talk about on this topic, but in this post, John focuses specifically on one way you can organize your code.

Securely monitoring your Azure Database for PostgreSQL Query Store

Long-running queries may interfere with overall database performance and may be stuck on some background process, which means that from time to time you need to investigate whether any queries are running indefinitely on your databases. See how you can set up alerting on query performance-related metrics using Azure Functions and Azure Key Vault.

Expanded Jobs functionality in Azure IoT Central

We have improved device management workflow through additional jobs functionalities that make managing your devices at scale much easier. In this brief post, learn to copy an existing job, save a job to continue working on later, stop or resume a running job, and download a job details report once your job has completed running.

Screenshot showing an example of download details of Jobs in Azure IoT Central

Azure Stack IaaS – part five

Self-service is core to Infrastructure-as-a-Service (IaaS). Azure's IaaS gives the owner of the subscription everything they need to create virtual machines (VMs) and other resources on their own, without involving an administrator. This post shows a few examples of Azure and Azure Stack self-service management of VMs.

Additional technical content

Azure shows

Episode 271 - Azure Stack - Tales from the field | The Azure Podcast

Azure Stack experts from Microsoft Services, Heyko Oelrichs and Rathish Ravikumar, give us an update on Azure Stack and some valuable tips and tricks based on their real-world experiences deploying it for customers.


Read the transcript

One Dev Question: What new HoloLens and Azure products were released in Barcelona? | One Dev Minute

In this episode of One Dev Question, Alex Kipman discusses Microsoft Mixed Reality, featuring announcements from Mobile World Congress.

Data Driven Styling with Azure Maps | Internet of Things Show

Ricky Brundritt, PM in the Azure Maps team, walks us through data driven styling with Azure Maps. Data driven styling allows you to dynamically style layers at render time on the GPU using properties on your data. This provides huge performance benefits and allows large datasets to be rendered on the map. Data driven style expressions can greatly reduce the amount of code you would normally need to write and define this type of business logic using if-statements and monitoring map events.

New Microsoft Azure HPC Goodness | Tuesdays with Corey

Corey Sanders and Evan Burness (Principal Program Manager on the Azure Compute team) sat down to talk about new things in the High Performance Computing space in Azure.

Get ready for Global Azure Bootcamp 2019 | Azure Friday

Global Azure Bootcamp is a free, one-day, local event that takes place globally. It's an annual event run by the Azure community. This year, Global Azure Bootcamp is on Saturday, April 27, 2019. Event locations range from New Zealand to Hawaii and chances are good that you can find a location near you. You may even organize your own local event and receive sponsorship so long as you register by Friday, March 29, 2019. Join in to receive sponsorship, Azure passes, Azure for Students, lunch, and a set of content to use for the day.

How to use Azure Monitor Application Insights to record custom events | Azure Tips and Tricks

Learn how to use Azure Monitor Application Insights to make your application logging smarter with Custom Event Tracking.

Thumbnail from How to use Azure Monitor Application Insights to record custom events by Azure Tips and Tricks

How to create an Azure Kubernetes Service cluster in the Azure Portal | Azure Portal Series

The Azure Portal enables you to get started quickly with Kubernetes and containers. In this video, learn how to easily create an Azure Kubernetes Service cluster.

Thumbnail from How to create an Azure Kubernetes Service cluster in the Azure Portal by Azure Portal Series

Phil Haack on DevOps at GitHub | Azure DevOps Podcast

Jeffrey Palermo and Phil Haack take a dive deep into DevOps at GitHub. They talk about Phil's role as Director of Engineering; how GitHub, as a company, grew while Phil worked there; the inner workings of how the GitHub website ran; and details about how various protocols, continuous integration, automated testing, and deployment worked at GitHub.

Episode 3 - ExpressRoute, Networking & Hybridity, Oh My! | AzureABILITY

AzureABILITY host Louis Berman discusses ExpressRoute, networking and hybridity with Microsoft's Bryan Woodworth, who specializes in networking, connectivity, high availability, and routing for hybrid workloads in Azure.


Read the transcript

Events

Microsoft Azure for the Gaming Industry

Cloud computing is increasingly important for today’s global gaming ecosystem, empowering developers of any size to reach gamers in any part of the world. In this wrap-up post from GDC 2019, learn how Azure and PlayFab are a powerful combination for game developers. Azure brings reliability, global scale, and enterprise-level security, while PlayFab provides Game Stack with managed game services, real-time analytics, and comprehensive LiveOps capabilities.

The Value of IoT-Enabled Intelligent Manufacturing

Learn how you can apply insights from real-world use cases of IoT-enabled intelligent manufacturing when you attend the Manufacturing IoT webinar on March 28th: Go from Reaction to Prediction – IoT in Manufacturing. In addition, you'll learn how you can use IoT solutions to move from a reactive to predictive model. For additional hands-on, actionable insights around intelligent edge and intelligent cloud IoT solutions, join us on April 19th for the Houston Solution Builder Conference.

Customers, partners, and industries

Power IoT and time-series workloads with TimescaleDB for Azure Database for PostgreSQL

Announcing a partnership with Timescale that introduces support for TimescaleDB on Azure Database for PostgreSQL for customers building IoT and time-series workloads. TimescaleDB allows you to scale for fast ingest and complex queries while natively supporting full SQL and has a proven track record of being deployed in production in a variety of industries including oil & gas, financial services, and manufacturing. The partnership reinforces our commitment to supporting the open-source community to provide our users with the most innovative technologies PostgreSQL has to offer.


This Week in Azure - 22 March 2019 | A Cloud Guru - Azure This Week

Lars is away this week and so Alex Mackey covers Azure Portal improvements, the Azure mobile app, Azure SQL Data Warehouse Workload Importance and Microsoft Game Stack.

Thumbnail from This Week in Azure - 22 March 2019 by A Cloud Guru - Azure This Week


Incrementally copy new files by LastModifiedDate with Azure Data Factory


Azure Data Factory (ADF) is the fully managed data integration service for analytics workloads in Azure. Using ADF, users can load the lake from more than 80 data sources, on-premises and in the cloud, use a rich set of transform activities to prep, cleanse, and process the data using Azure analytics engines, and land the curated data into a data warehouse for innovative analytics and insights.

When you start to build an end-to-end data integration flow, the first challenge is to extract data from different data stores, where incrementally (or delta) loading data after an initial full load is the widely used pattern at this stage. ADF now provides a new capability for you to incrementally copy new or changed files only, selected by LastModifiedDate, from a file-based store. With this new feature, you do not need to partition the data by time-based folder or file names. New or changed files are automatically selected by their LastModifiedDate metadata and copied to the destination store.

The feature is available when loading data from Azure Blob Storage, Azure Data Lake Storage Gen1, Azure Data Lake Storage Gen2, Amazon S3, File System, SFTP, and HDFS.
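The selection logic behind the feature can be sketched with plain shell against a local folder. This only illustrates the LastModifiedDate filter idea, not ADF itself; the paths, file names, and dates below are hypothetical:

```shell
# Illustration of LastModifiedDate-based incremental copy using local
# files (not ADF itself; paths and dates are hypothetical).
set -e
mkdir -p ./source ./sink
touch -d "2019-03-01 00:00" ./source/old.csv   # modified before the last run
touch -d "2019-03-20 12:00" ./source/new.csv   # modified since the last run

WATERMARK="2019-03-10 00:00"                   # time of the last successful copy

# Select and copy only files modified after the watermark.
find ./source -type f -newermt "$WATERMARK" -exec cp {} ./sink/ \;

ls ./sink   # only new.csv is copied
```

ADF stores and advances this kind of watermark for you; the sketch just shows why no time-based folder partitioning is needed.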

The resources for this feature are as follows:

1. Visit our tutorial, “Incrementally copy new and changed files based on LastModifiedDate by using the Copy Data tool,” to build your first pipeline that incrementally copies only new and changed files, based on their LastModifiedDate, from Azure Blob storage to Azure Blob storage using the Copy Data tool.

2. You can also leverage our template from the template gallery, “Copy new and changed files by LastModifiedDate with Azure Data Factory,” to reduce your time to solution while keeping enough flexibility to build a pipeline that incrementally copies only new and changed files based on their LastModifiedDate.

3. You can also build the same behavior from scratch in the ADF UI.

4. You can write code against the ADF SDK to use the feature programmatically.

You are encouraged to give these additions a try and provide us with feedback. We hope you find them helpful in your scenarios. Please post your questions on the Azure Data Factory forum or share your thoughts with us on the Data Factory feedback site.

Azure Storage support for Azure Active Directory based access control generally available


We are pleased to share the general availability of Azure Active Directory (AD) based access control for Azure Storage Blobs and Queues. Enterprises can now grant specific data access permissions to users and service identities from their Azure AD tenant using Azure’s Role-based access control (RBAC). Administrators can then track individual user and service access to data using Storage Analytics logs. Storage accounts can be configured to be more secure by removing the need for most users to have access to powerful storage account access keys.

By leveraging Azure AD to authenticate users and services, enterprises gain access to the full array of capabilities that Azure AD provides, including features like two-factor authentication, conditional access, identity protection, and more. Azure AD Privileged Identity Management (PIM) can also be used to assign roles “just-in-time” and reduce the security risk of standing administrative access.

In addition, developers can use Managed identities for Azure resources to deploy secure Azure Storage applications without having to manage application secrets.

When Azure AD authentication is combined with the new Azure Data Lake Storage Gen2 capabilities, users can also take advantage of granular file and folder access control using POSIX-style access permissions and access control lists (ACLs).

RBAC for Azure Resources can be used to grant access to broad sets of resources across a subscription, a resource group, or to individual resources like a storage account and blob container. Role assignments can be made through the Azure portal or through tools like Azure PowerShell, Azure CLI, or Azure Resource Manager templates.

Assign a role to a user
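A role assignment from the Azure CLI looks roughly like the following. The user, resource group, and storage account names are placeholders, and the `az role assignment create` command is echoed as a dry run so the sketch is self-contained offline:

```shell
# Dry-run sketch: grant a user read access to blob data in one storage
# account (all names and IDs below are placeholders).
SCOPE="/subscriptions/<subscription-id>/resourceGroups/myrg/providers/Microsoft.Storage/storageAccounts/mysalesdata"
echo az role assignment create \
  --assignee "user@contoso.com" \
  --role "Storage Blob Data Reader" \
  --scope "$SCOPE"
```

Scoping the assignment to the storage account (or even a single container) keeps the grant as narrow as the workload needs.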

Azure AD authentication is available from the standard Azure Storage tools including the Azure portal, Azure CLI, Azure PowerShell, Azure Storage Explorer, and AzCopy.

Browse Azure Storage blobs using the Azure portal

$ az login
Note, we have launched a browser for you to login. For old experience with device code, use "az login --use-device-code"
You have logged in. Now let us find all the subscriptions to which you have access...
[
  {
    "cloudName": "AzureCloud",
    "id": "XXXXXXXX-YYYY-ZZZZ-AAAA-BBBBBBBBBBBB",
    "isDefault": true,
    "name": "My Subscription",
    "state": "Enabled",
    "tenantId": "00000000-0000-0000-0000-000000000000",
    "user": {
      "name": "cbrooks@microsoft.com",
      "type": "user"
    }
  }
]
$ export AZURE_STORAGE_AUTH_MODE="login"
$ az storage blob list --account-name mysalesdata --container-name mycontainer --query [].name
[
  "salesdata.csv"
]

We encourage you to use Azure AD to grant users access to data and to limit user access to the storage account access keys. A typical pattern is to grant users the "Reader" role, to make the storage account visible to them in the portal, along with the "Storage Blob Data Reader" role, to grant read access to blob data. Users who need to create or modify blobs can be granted the "Storage Blob Data Contributor" role instead.

Developers are encouraged to evaluate Managed Identities for Azure resources to authenticate applications in Azure or Azure AD service principals for apps running outside Azure.

Azure AD access control for Azure Storage is available now for production use in all Azure cloud environments.

Azure Premium Block Blob Storage is now generally available


As enterprises accelerate cloud adoption and increasingly deploy performance-sensitive cloud-native applications, we are excited to announce the general availability of Azure Premium Blob Storage. Premium Blob Storage is a new performance tier in Azure Blob Storage for block blobs and append blobs, complementing the existing Hot, Cool, and Archive access tiers. Premium Blob Storage is ideal for workloads that require very fast response times and/or high transaction rates, such as IoT, telemetry, AI, and scenarios with humans in the loop such as interactive video editing, web content, online transactions, and more.

Premium Blob Storage provides lower and more consistent latency for both read and write operations across a range of object sizes, and is especially good at handling smaller blob sizes. To realize low latency end-to-end, deploy your application to compute instances in the same Azure region as the storage account. For more details on performance, see “Premium Block Blob Storage - a new level of performance.”


Figure 1 - Latency comparison of Premium and Standard Blob Storage

Premium Blob Storage is available with Locally-Redundant Storage (LRS) and comes with High-Throughput Block Blobs (HTBB), which provides very high and instantaneous write throughput when ingesting block blobs larger than 256KB.

Pricing and region availability

Premium Blob Storage has a higher data storage cost but a lower transaction cost compared to data stored in the regular Hot tier, so it can be the less expensive option for workloads with high transaction rates. Check out the pricing page for more details.
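To see why, compare monthly cost using hypothetical unit prices. The numbers below are illustrative only, not actual Azure pricing: at a high enough transaction rate, the lower per-transaction price outweighs the higher per-GB price.

```shell
# Back-of-the-envelope comparison with HYPOTHETICAL unit prices
# (illustrative only; see the Azure pricing page for real numbers).
GB=100          # data stored, in GB
TXN_10K=20000   # transaction units of 10,000 (i.e. 200M transactions/month)

# Hypothetical $/GB-month and $/10k-transactions for each tier:
hot=$(awk -v g="$GB" -v t="$TXN_10K" 'BEGIN{printf "%.2f", g*0.018 + t*0.004}')
prem=$(awk -v g="$GB" -v t="$TXN_10K" 'BEGIN{printf "%.2f", g*0.15 + t*0.002}')
echo "hot=\$$hot premium=\$$prem"   # at this volume, premium comes out cheaper
```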

Premium Blob Storage is initially available in US East, US East 2, US Central, US West, US West 2, North Europe, West Europe, Japan East, Australia East, Korea Central, and Southeast Asia regions with more regions to come. Stay up to date on region availability through the Azure global infrastructure page.

Platform interoperability

At present, data stored in Premium cannot be tiered to Hot, Cool, or Archive access tiers. We are working on supporting object tiering in the future. To move data, you can synchronously copy blobs using the new PutBlockFromURL API (sample code) or AzCopy v10, which supports this API. PutBlockFromURL synchronously copies data server side, which means all data movement happens inside Azure Storage.

In addition, Storage Analytics Logging, Static website, and Lifecycle Management (preview) are not currently available with Premium Blob Storage.

Next steps

To get started with Premium Blob Storage, provision a Block Blob storage account in your subscription and start creating containers and blobs using the existing Blob Service REST API and/or any existing tools such as AzCopy or Azure Storage Explorer.

We are very excited about being able to deliver Azure Premium Blob Storage with low and consistent latency and look forward to hearing your feedback at premiumblobfeedback@microsoft.com. To learn more about Blob Storage please visit our product page.

Why IoT is not a technology solution—it’s a business play


IoT Hero

Enterprise leaders understand how important the Internet of Things (IoT) will be to their companies—in fact, according to a report by McKinsey & Company, 92 percent of them believe IoT will help them innovate products and improve operations by 2020. However, like many business-enabling systems, IoT is not without its growing pains. Even early adopters have concerns about the cost, complexity, and security implications of applying IoT to their businesses.

These growing pains can make it daunting for organizations to pick an entry point for applying IoT to their business.

Many companies start by shifting their thinking. It’s easy to get lost in technology, opting for platforms with the newest bells and whistles, and then leaning on those capabilities to drive projects. But sustainable change doesn’t happen that way—it happens when you consider business needs first, and then look at how the technology can fulfill those needs better than current processes can.

You might find it helpful to start by thinking about how IoT can transform your business. You know connected products will be an important part of your business model, but before you start building them, you need to make sure you understand where the market is headed so you can align innovation with emerging needs. After all, the biggest wins come when you can use emerging technology to create a “platform” undergirding products and services that can be extended into new opportunity areas.

To help you plan your IoT journey, we’re rolling out a four-part blog series. In the upcoming posts, we’ll cover how to create an IoT business case, overcome capability gaps, and simplify execution; all advice to help you maximize your gains with IoT.

Let’s get started by exploring the mindset it takes to build IoT into your business model.

Make sure business sponsors are invested

With any business-enabling system, organizations instinctively engage in focused exploration before they leap in. The business and IT can work together to develop a business case, identifying and prioritizing areas that can be optimized and provide real value. Involving the right business decision-makers will ensure you have sponsorship, budget, and commitment when it comes to applying IoT to new processes and systems and make the necessary course corrections as implementation grows and scales. Put your business leaders at the center of the discussion and keep them there.

Seize early-mover advantage

Organizations that get in early and commit to developing mastery in game-changing technologies may see only incremental gains in the beginning. But their leadership position often becomes propulsive, eventually creating platform business advantages that allow them to outdistance competitors for the long term. Don’t believe it? Just look at the history of business process automation, operational optimization, ecommerce, cloud services, digital business, and other tech-fueled trends to see how this has played out.

Consider manufacturing. The industry was a leader in operational optimization, using Six Sigma and other methodologies to strip cost and waste out of processes. After years of these efforts, process improvements became a game of inches with ever-smaller benefits.

Enter IoT. Companies have used IoT to achieve significant gains with improving production and streamlining operations in both discrete and process manufacturing. IoT can help companies predict changing market demand, aligning output to real needs. In addition, many manufacturing companies use IoT to help drive throughput of costly production equipment. Sensors and advanced analytics predict when equipment needs preventive maintenance, eliminating costly downtime.

How companies are using IoT today

Johnson Controls makes building-automation solutions, enabling customers to fine-tune energy use, lowering costs and achieving sustainability goals. The company built out its connected platform in the cloud with the Microsoft Azure IoT Suite to be able to aggregate, manage, and make sense of the torrent of facility data it receives. Through its Smart Connected Chillers initiative, Johnson Controls was able to identify a problem with a customer’s chiller plant, take corrective action, and prevent unplanned downtime that would have cost the customer $300,000 an hour.

IoT can also enable new business models. Adobe used to run its business with one-time software sales, but the company pivoted to online subscription sales as part of its drive to create a digital business, according to Forbes. While the move seemed risky at the time (and initially hurt revenue), Adobe’s prescient move has enabled it to dominate digital marketing, creative services, and document management. Now, of course, the software industry is predominantly Software as a Service (SaaS). Adobe is building on its success by pushing even deeper into analytics. The Adobe Experience Cloud uses Microsoft Azure and Microsoft Dynamics 365 to provide businesses with a 360-degree customer view and the tools to create, deliver, and manage digital experiences on a globally scalable cloud platform.

When connections with customers are constant and always adding value—everyone wins.

Think about IoT as a business enabler

It’s helpful to constantly stress to your team that IoT is a new way to enable a business strategically. IoT isn’t a bolt-on series of technology applications to incrementally optimize what you have. That may seem at odds with the prevailing wisdom to optimize current services. Yes, organizations should seek efficiencies, but only after they have considered where the business is headed, how things need to change to support targeted growth, and whether current processes can be improved or need to be totally transformed. In the case of Johnson Controls, service optimization is a core part of the company’s value proposition to its customers.

Yes, IoT can reinvent your business. And yes, constant, restless innovation can be expected until IoT is fully embedded in your business in a way that’s organic and self-sustaining. Adobe has used its digital platform to extend further and further into its customers’ businesses, providing value they can measure.

If you haven’t started with IoT, now is the right time, because your competitors are grappling with these very issues and creating smart strategies to play the IoT long game. Disruption is here and will only get more pronounced as platform leaders rocket ahead.

As you plan your journey with IoT, there’s help. In forthcoming blogs, we’ll be looking at:

  • Building a business case for IoT—Why this is the moment to be thinking about IoT and developing a solid business case to capture market opportunity and future-proof your organization.
  • Paving the way for IoT—Understanding and addressing what gaps need to be overcome to achieve IoT’s promise.
  • Becoming an IoT leader now—It’s simpler than you think to start with IoT. You can use what you have and SaaS-ify your approach with technology that makes it easy to connect, monitor, and manage your IoT assets at scale.

Need inspiration? Watch this short video to hear insights from IoT leaders Henrik Fløe of Grundfos, Doug Weber from Rockwell Automation, Michael MacKenzie from Schneider Electric, and Alasdair Monk of The Weir Group. 

Clean up files by built-in delete activity in Azure Data Factory


Azure Data Factory (ADF) is a fully-managed data integration service in Azure that allows you to iteratively build, orchestrate, and monitor your Extract Transform Load (ETL) workflows. As part of the data integration process, you will periodically need to clean up files from on-premises or cloud storage servers when the files become out of date. For example, you may have a staging area or landing zone, an intermediate storage area used for data processing during your ETL process. The data staging area sits between the data source stores and the data destination store. Given that the data in staging areas is transient by nature, you need to periodically clean it up after the ETL process has been completed.

We are excited to share the ADF built-in delete activity, which can be part of your ETL workflow to delete undesired files without writing code. You can use ADF to delete folders or files from Azure Blob Storage, Azure Data Lake Storage Gen1, Azure Data Lake Storage Gen2, File System, FTP Server, sFTP Server, and Amazon S3.

You can find ADF delete activity under the “Move & Transform” section from the ADF UI to get started.

1. You can either choose to delete files or delete the entire folder. The names of the deleted files and folders can be logged in a CSV file.

2. The file or folder name to be deleted can be parameterized, so that you have the flexibility to control the behavior of delete activity in your data integration flow.

3. You can delete expired files only rather than deleting all the files in one folder. For example, you may want to only delete the files which were last modified more than 30 days ago.

4. You can start from ADF template gallery to quickly deploy common use cases involving delete activity.
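To make the options above concrete, here is a minimal sketch of what a Delete activity definition can look like in a pipeline's JSON, built as a Python dictionary. The property names follow the ADF pipeline schema, but the dataset name, linked service name, and 30-day retention window are illustrative placeholders; check the Delete activity documentation for the authoritative schema.

```python
import json

# Illustrative Delete activity definition for an ADF pipeline.
# "StagingDataset" and "LogStorage" are hypothetical names.
delete_activity = {
    "name": "CleanUpStagingFiles",
    "type": "Delete",
    "typeProperties": {
        "dataset": {"referenceName": "StagingDataset", "type": "DatasetReference"},
        "recursive": True,
        # Only delete files last modified more than 30 days ago.
        "storeSettings": {
            "type": "AzureBlobStorageReadSettings",
            "modifiedDatetimeEnd": "@addDays(utcNow(), -30)"
        },
        # Log the names of deleted files and folders to a CSV file.
        "enableLogging": True,
        "logStorageSettings": {
            "linkedServiceName": {
                "referenceName": "LogStorage",
                "type": "LinkedServiceReference"
            }
        }
    }
}

print(json.dumps(delete_activity, indent=2))
```

Because the dataset reference and the `modifiedDatetimeEnd` expression are both just properties, they can be parameterized from the pipeline, which is how the flexible behavior described above is achieved.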

You are encouraged to give these additions a try and provide us with feedback. We hope you find them helpful in your scenarios. Please post your questions on Azure Data Factory forum or share your thoughts with us on Data Factory feedback site.

Enabling customers’ hybrid strategy with new Microsoft innovation


Customers who are taking a hybrid cloud approach are seeing real business value – I see this in organizations across the globe. The ability for customers to embrace both public cloud and local datacenter, plus edge capability, is enabling customers to improve their IT agility and maximize efficiency. The benefit of a hybrid approach is also what continues to bring customers to Azure, the one cloud that has been uniquely built for hybrid. We haven’t slowed our investment in enabling a hybrid strategy, particularly as this evolves into the new application pattern of using intelligent cloud and intelligent edge.

Before I dive into what’s new, I want to take a moment to share why Microsoft is so passionate about enabling a hybrid approach. It stems from a deep understanding of our customers and their businesses over the past several decades. We want every organization on the planet to benefit from cloud innovation. Fundamentally, hybrid enables every organization to participate in this technology transformation. Beyond this, we see the leading experiences enabled by tapping into both the intelligent cloud and intelligent edge, creating optimized experiences for literally every use case. 

Today, I’m pleased to share some new products and updates to our Azure hybrid portfolio that ultimately help address a wider range of customer needs.

Expanding the Azure Stack family

Since Azure Stack became globally available in 2017, it has captivated customers as a unique offering to build and run cloud-native applications with consistent Azure services on-premises, including disconnected locations. Today, Azure Stack is available in 92 countries with 8 announced hardware partners, and customers like Airbus Defense & Space, KPMG Norway, and iMOKO are using it to enable hybrid and edge scenarios. We continue to expand Azure Stack offerings to meet a broader set of customer needs, so they can run virtualized applications in their own datacenter.

Introducing Azure Stack HCI Solutions. When considering their full application portfolio, customers want to upgrade a set of existing applications to run on modern hyperconverged infrastructure (HCI) for efficiency and performance gains. And, while this is often an approach for “legacy” applications, in moving to HCI, customers can now also benefit from a hybrid cloud HCI solution. We are bringing our existing HCI technology into the Azure Stack family for customers to run virtualized applications on-premises with direct access to Azure management services such as backup and disaster recovery. Azure Stack HCI solutions feature the same software-defined compute, storage, and networking technology as the existing Azure Stack, and include simplified cloud access via the Azure hybrid services in Windows Admin Center. Azure Stack HCI is the newest solution to join the broad set of hybrid capabilities from Azure. You can learn more about Azure Stack HCI in Arpan Shah’s blog.

More customer options for Azure Stack. The Azure Stack ecosystem is growing in more ways than one today. Dell EMC Tactical Microsoft Azure Stack is now generally available, bringing Azure-consistent cloud to operating environments where network connectivity is limited and/or mobility and high portability are required – in remote and rugged circumstances. This new offering enables Azure Stack to support an even wider range of use-cases, while still delivering a consistent hybrid cloud approach.

Delivering more innovation at the edge

Hybrid cloud is evolving from being only the integration of a datacenter with the public cloud, to becoming units of computing available at the edge, including even the world’s most remote destinations, working in concert with public cloud. You can read more about this new era we call intelligent cloud and intelligent edge in an earlier blog. What’s compelling about the intelligent edge is many of the same patterns and principles for hybrid applications apply to edge applications. That means investments our customers make today in a hybrid cloud, and the skillsets developed from running these hybrid applications, position them to take advantage of edge computing and tools that can yield even bigger benefits in the future. Toward the goal of helping customers tap into this potential, I am very excited to announce some new edge capabilities.

Azure Data Box Edge is now available. Today is the general availability of Azure Data Box Edge. We previewed the Data Box Edge appliance with edge compute and network data transfer capabilities last September. Data Box Edge provides a cloud managed compute platform for containers at the edge, enabling customers to process data at the edge and accelerate machine learning workloads, powered by Azure Machine Learning and Intel Arria 10 FPGA. Data Box Edge also enables customers to transfer data over the internet to Azure in real-time for deeper analytics or model retraining at cloud scale or for long term storage, as does the Azure Data Box Gateway virtual appliance that is also available today. You can read more about both and how customers like Cree and Esri are already using Data Box Edge via Dean Paron’s blog.

This week: Join me for a Hybrid Cloud Virtual Event

You can learn more about Azure hybrid cloud, and in the meantime, I hope you’ll join me later this week as I host a virtual event focused on hybrid cloud for enterprises. I’m looking forward to talking with customers, analysts and technical leaders about how they’re approaching hybrid cloud – including what’s working and what they’ve learned. We’ll cover topics like application innovation, security and governance, migration, edge computing – and more. The event is Thursday, March 28, 2019 at 8:00 AM Pacific Time, and you can register here.

Announcing Azure Stack HCI: A new member of the Azure Stack family


It has been inspiring to watch how customers use Azure Stack to innovate and drive digital transformation across cloud boundaries. In her blog post today, Julia White shares examples of how customers are using Azure Stack to innovate on-premises using Azure services. Azure Stack shipped in 2017, and it is the only solution in the market today for customers to run cloud applications using consistent IaaS and PaaS services across public cloud, on-premises, and in disconnected environments. While customers love the fact that they can run cloud applications on-premises with Azure Stack, we understand that most customers also run important parts of their organization on traditional virtualized applications. Now we have a new option to deliver cloud efficiency and innovation for these workloads as well.

Today, I am pleased to announce Azure Stack HCI solutions are available for customers who want to run virtualized applications on modern hyperconverged infrastructure (HCI) to lower costs and improve performance. Azure Stack HCI solutions feature the same software-defined compute, storage, and networking software as Azure Stack, and can integrate with Azure for hybrid capabilities such as cloud-based backup, site recovery, monitoring, and more.

Adopting hybrid cloud is a journey and it is important to have a strategy that takes into account different workloads, skillsets, and tools. Microsoft is the only leading cloud vendor that delivers a comprehensive set of hybrid cloud solutions, so customers can use the right tool for the job without compromise. 

Choose the right option for each workload

Azure Stack HCI solutions

Azure Stack HCI: Use existing skills, gain hyperconverged efficiency, and connect to Azure

Azure Stack HCI solutions are designed to run virtualized applications on-premises in a familiar way, with simplified access to Azure for hybrid cloud scenarios. This is a perfect solution for IT to leverage existing skills to run virtualized applications on new hyperconverged infrastructure while taking advantage of cloud services and building cloud skills.

Customers that deploy Azure Stack HCI solutions get amazing price/performance with Hyper-V and Storage Spaces Direct running on the most current industry-standard x86 hardware. Azure Stack HCI solutions include support for the latest hardware technologies like NVMe drives, persistent memory, and remote-direct memory access (RDMA) networking.

IT admins can also use Windows Admin Center for simplified integration with Azure hybrid services to seamlessly connect to Azure for:

  • Azure Site Recovery for high availability and disaster recovery as a service (DRaaS).
  • Azure Monitor, a centralized hub to track what’s happening across your applications, network, and infrastructure – with advanced analytics powered by AI.
  • Cloud Witness, to use Azure as the lightweight tie breaker for cluster quorum.
  • Azure Backup for offsite data protection and to protect against ransomware.
  • Azure Update Management for update assessment and update deployments for Windows VMs running in Azure and on-premises.
  • Azure Network Adapter to connect resources on-premises with your VMs in Azure via a point-to-site VPN.
  • Azure Security Center for threat detection and monitoring for VMs running in Azure and on-premises (coming soon).

Buy from your choice of hardware partners

Azure Stack HCI solutions are available today from 15 partners offering Microsoft-validated hardware systems to ensure optimal performance and reliability. Your preferred Microsoft partner gets you up and running without lengthy design and build time and offers a single point of contact for implementation and support services. 

How to buy Azure Stack HCI solutions

Visit our website to find more than 70 Azure Stack HCI solutions currently available from these Microsoft partners: ASUS, Axellio, bluechip, DataON, Dell EMC, Fujitsu, HPE, Hitachi, Huawei, Lenovo, NEC, primeLine Solutions, QCT, SecureGUARD, and Supermicro.

Learn more

We know that a great hybrid cloud strategy is one that meets you where you are, delivering cloud benefits to all workloads wherever they reside. Check out these resources to learn more about Azure Stack HCI and our other Microsoft hybrid offerings:

FAQ

What do Azure Stack and Azure Stack HCI solutions have in common?

Azure Stack HCI solutions feature the same Hyper-V based software-defined compute, storage, and networking technologies as Azure Stack. Both offerings meet rigorous testing and validation criteria to ensure reliability and compatibility with the underlying hardware platform.

How are they different?

With Azure Stack, you can run Azure IaaS and PaaS services on-premises to consistently build and run cloud applications anywhere.

Azure Stack HCI is a better solution to run virtualized workloads in a familiar way – but with hyperconverged efficiency – and connect to Azure for hybrid scenarios such as cloud backup, cloud-based monitoring, etc.

Why is Microsoft bringing its HCI offering to the Azure Stack family?

Microsoft’s hyperconverged technology is already the foundation of Azure Stack.

Many Microsoft customers have complex IT environments and our goal is to provide solutions that meet them where they are with the right technology for the right business need. Azure Stack HCI is an evolution of Windows Server Software-Defined (WSSD) solutions previously available from our hardware partners. We brought it into the Azure Stack family because we have started to offer new options to connect seamlessly with Azure for infrastructure management services.

Will I be able to upgrade from Azure Stack HCI to Azure Stack?

No, but customers can migrate their workloads from Azure Stack HCI to Azure Stack or Azure.

How do I buy Azure Stack HCI solutions?

Follow these steps:

  1. Buy a Microsoft-validated hardware system from your preferred hardware partner.
  2. Install Windows Server 2019 Datacenter edition and Windows Admin Center for management and the ability to connect to Azure for cloud services.
  3. Optionally, use your Azure account to attach management and security services to your workloads.

How does the cost of Azure Stack HCI compare to Azure Stack?

This depends on many factors.

Azure Stack is sold as a fully integrated system including services and support. It can be purchased as a system you manage, or as a fully managed service from our partners. In addition to the base system, the Azure services that run on Azure Stack or Azure are sold on a pay-as-you-use basis.

Azure Stack HCI solutions follow the traditional model. Validated hardware can be purchased from Azure Stack HCI partners and software (Windows Server 2019 Datacenter edition with software-defined datacenter capabilities and Windows Admin Center) can be purchased from various existing channels. For Azure services that you can use with Windows Admin Center, you pay with an Azure subscription.

We recommend working with your Microsoft partner or account team for pricing details.

What is the future roadmap for Azure Stack HCI solutions?

We’re excited to hear customer feedback and will take that into account as we prioritize future investments.


Blob storage interface on Data Box is now generally available


The blob storage interface on the Data Box has been in preview since September 2018 and we are happy to announce that it's now generally available. This is in addition to the server message block (SMB) and network file system (NFS) interface already generally available on the Data Box.

The blob storage interface allows you to copy data into the Data Box via REST. In essence, this interface makes the Data Box appear like an Azure storage account. Applications that write to Azure blob storage can be configured to work with the Azure Data Box in exactly the same way. 

This enables very interesting scenarios, especially for big data workloads. Migrating large HDFS stores to Azure as part of an Apache Hadoop® migration is a popular ask. Using the blob storage interface of the Data Box, you can now easily use common copy tools like DistCp to point directly at the Data Box and access it as though it were another HDFS file system! Since most Hadoop installations come pre-loaded with the Azure Storage driver, you most likely will not have to make changes to your existing infrastructure to use this capability. Another key benefit of migrating via the blob storage interface is that you can choose to preserve metadata. For more details on migrating HDFS workloads, please review the Using Azure Data Box to migrate from an on premises HDFS store documentation.
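The DistCp invocation itself is essentially a one-liner once the destination URI points at the device. Here is a sketch of composing that command; the account name, device endpoint, container, and NameNode address are all hypothetical placeholders, and the exact endpoint format for your device is shown in the Data Box documentation.

```python
import shlex

# Hypothetical values: replace with your Data Box blob endpoint, storage
# account name, and container as documented for your device.
account = "mydbaccount"
endpoint = f"{account}.blob.mydevice.microsoftdatabox.com"
container = "hdfs-migration"

# A wasb:// URI pointing at the Data Box instead of an Azure storage account;
# the Azure Storage driver shipped with most Hadoop distributions handles it.
dest = f"wasb://{container}@{endpoint}/landing/"

# -px preserves permissions and extended attributes (metadata) during copy.
cmd = ["hadoop", "distcp", "-px", "hdfs://namenode:8020/data/", dest]
print(shlex.join(cmd))
```

The point of the sketch: because the Data Box presents a blob-storage-compatible endpoint, only the destination URI changes relative to a copy straight to Azure.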

Blob storage on the Data Box enables partner solutions using native Azure blob storage to write directly to the Data Box. With this capability, partners like Veeam, Rubrik, and DefendX were able to utilize the Data Box to assist customers moving data to Azure.

For a full list of supported partners please visit the Data Box partner page.

For more details on using blob storage with Data Box, please see our official documentation for Azure Data Box Blob Storage requirements and a tutorial on copying data via Azure Data Box Blob Storage REST APIs.

Accelerated AI with Azure Machine Learning service on Azure Data Box Edge


FPGA acceleration at the edge

Along with the general availability of Azure Data Box Edge that was announced today, we are announcing the preview of Azure Machine Learning hardware accelerated models on Data Box Edge. The majority of the world’s data in real-world applications is used at the edge. For example, images and videos collected from factories, retail stores, or hospitals are used for manufacturing defect analysis, inventory out-of-stock detection, and diagnostics. Applying machine learning models to the data on Data Box Edge provides lower latency and savings on bandwidth costs, while enabling real-time insights and speed to action for critical business decisions.

Azure Machine Learning service is already a generally available, end-to-end, enterprise-grade, and compliant data science platform. Azure Machine Learning service enables data scientists to simplify and accelerate the building, training, and deployment of machine learning models. All these capabilities are accessed from your favorite Python environment using the latest open-source frameworks, such as PyTorch, TensorFlow, and scikit-learn. These models can run today on CPUs and GPUs, but this preview expands that to field programmable gate arrays (FPGA) on Data Box Edge.

What is in this preview?

This preview enhances Azure Machine Learning service by enabling you to train a TensorFlow model for image classification scenarios, containerize the model in a Docker container, and then deploy the container to a Data Box Edge device with Azure IoT Hub. Today we support ResNet 50, ResNet 152, DenseNet-121, and VGG-16. The model is accelerated by the ONNX runtime on an Intel Arria 10 FPGA that is included with every Data Box Edge.

Why does this matter?

Over the years, AI has been infused in our everyday lives and in industry. Smart home assistants understand what we say, and social media services can tag who’s in the picture we uploaded. Most, if not all, of this is powered by deep neural networks (DNNs), which are sophisticated algorithms that process unstructured data such as images, speech, and text. DNNs are also computationally expensive. For example, it takes almost 8 billion calculations to analyze one image using ResNet 50, a popular DNN.
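The "almost 8 billion calculations" figure is easy to sanity-check with back-of-the-envelope arithmetic. A single 3×3 convolution stage alone accounts for hundreds of millions of operations; the layer dimensions below are one representative ResNet-50 stage (a simplification of the real architecture), counting a multiply and an add as two operations.

```python
# One representative 3x3 conv stage in ResNet-50: 14x14 spatial output,
# 256 input channels, 256 output channels.
h = w = 14
c_in = c_out = 256
k = 3  # kernel size

# Each output element needs c_in * k * k multiply-accumulates;
# count a multiply and an add as two operations.
layer_ops = h * w * c_out * (c_in * k * k) * 2
print(f"{layer_ops:,} ops per image, for this one layer")  # ~231 million
```

Summed over the network's roughly 50 layers, the total lands in the billions of operations per image, consistent with the figure above, and explains why dedicated hardware like FPGAs pays off for inference.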

There are many hardware options to run DNNs today, most commonly on CPUs and GPUs. Azure Machine Learning service brings customers the cutting-edge innovation that originated in Microsoft Research (featured in this recent Fast Company article), to run DNNs on reconfigurable hardware called FPGAs. By integrating this capability and the ONNX runtime in Azure Machine Learning service, we see vast improvements in the latencies of models.

Bringing it together

Azure Machine Learning service now brings the power of accelerated AI models directly to Data Box Edge. Let’s take the example of a manufacturing assembly line scenario, where cameras are photographing products at various stages of development.

The pictures are sent from the manufacturing line to Data Box Edge inside your factory, where AI models that were trained, containerized, and deployed to the FPGA using Azure Machine Learning service are available. Data Box Edge is registered with Azure IoT Hub, so you can control which models you want deployed. Now you have everything you need to process incoming pictures in near real-time to detect manufacturing defects. This enables machines and assembly-line managers to make time-sensitive decisions about the products, improving product quality and decreasing downstream production costs.

Join the preview

Azure Machine Learning service is already generally available today. To join the preview for containerization of hardware accelerated AI models, fill out the request form and get support on our forum.

Azure Data Box Family meets customers at the edge


Today I am pleased to announce the general availability of Azure Data Box Edge and the Azure Data Box Gateway. You can get these products today in the Azure portal.

Compute at the edge

We’ve heard your need to bring Azure compute power closer to you – a trend increasingly referred to as edge computing. Data Box Edge answers that call and is an on-premises anchor point for Azure. Data Box Edge can be racked alongside your existing enterprise hardware or live in non-traditional environments from factory floors to retail aisles. With Data Box Edge, there's no hardware to buy; you sign up and pay-as-you-go just like any other Azure service and the hardware is included.

This 1U rack-mountable appliance from Microsoft brings you the following:

  • Local Compute – Run containerized applications at your location. Use these to interact with your local systems or to pre-process your data before it transfers to Azure.
  • Network Storage Gateway – Automatically transfer data between the local appliance and your Azure Storage account. Data Box Edge caches the hottest data locally and speaks file and object protocols to your on-premises applications.
  • Azure Machine Learning utilizing an Intel Arria 10 FPGA - Use the on-board Field Programmable Gate Array (FPGA) to accelerate inferencing of your data, then transfer it to the cloud to re-train and improve your models. Learn more about the Azure Machine Learning announcement.
  • Cloud managed – Easily order your device and manage these capabilities for your fleet from the cloud using the Azure Portal.

Since announcing Preview at Ignite 2018 just a few months ago, it has been amazing to see how our customers across different industries are using Data Box Edge to unlock some innovative scenarios:

 

Kroger

Sunrise Technology, a wholly owned division of The Kroger Co., plans to use Data Box Edge to enhance the Retail as a Service (RaaS) platform for Kroger and the retail industry to enable the features announced at NRF 2019: Retail's Big Show, including personalized, never-before-seen shopping experiences like at-shelf product recommendations, guided shopping and more. The live video analytics on Data Box Edge can help store employees identify and address out-of-stocks quickly and enhance their productivity. Such smart experiences will help retailers provide their customers with more personalized, rewarding experiences.

Esri

Esri, a leader in location intelligence, is exploring how Data Box Edge can help those responding to disasters in disconnected environments. Data Box Edge will allow teams in the field to collect imagery captured from the air or ground and turn it into actionable information that provides updated maps. The teams in the field can use updated maps to coordinate response efforts even when completely disconnected from the command center. This is critical in improving the response effectiveness in situations like wildfires and hurricanes.

Data Box Gateway – Hardware not required

Data Box Edge comes with a built-in storage gateway. If you don’t need the Data Box Edge hardware or edge compute, then the Data Box Gateway is also available as a standalone virtual appliance that can be deployed anywhere within your infrastructure.

You can provision it in your hypervisor, using either Hyper-V or VMware, and manage it through the Azure Portal. Server message block (SMB) or network file system (NFS) shares will be set up on your local network. Data landing on these shares will automatically upload to your Azure Storage account, supporting Block Blob, Page Blob, or Azure Files. We’ll handle the network retries and optimize network bandwidth for you. Multiple network interfaces mean the appliance can either sit on your local network or in a DMZ, giving your systems access to Azure Storage without having to open network connections to Azure.

Whether you use the storage gateway inside of Data Box Edge or deploy the Data Box Gateway virtual appliance, the storage gateway capabilities are the same.

 

More solutions from the Data Box family

In addition to Data Box Edge and Data Box Gateway, we also offer three sizes of Data Box for offline data transfer:

  • Data Box – a ruggedized 100 TB transport appliance
  • Data Box Disk – a smaller, more nimble transport option with individual 8 TB disks and up to 40 TB per order
  • Data Box Heavy Preview - a bigger version of Data Box that can scale to 1 PB.

All Data Box offline transport products are available to order through the Azure Portal. We ship them to you and then you fill them up and ship them back to our data center for upload and processing. To make Data Box useful for even more customers, we’re enabling partners to write directly to Data Box with little required change to their software via our new REST API feature which has just reached general availability – Blob Storage on Data Box!

Get started

Thank you for partnering with us on our journey to bring Azure to the edge. We are excited to see how you use these new products to harness the power of edge computing for your business. Here’s how you can get started:

  • Order Data Box Edge or the Data Box Gateway today via the Azure portal.
  • Review server hardware specs on the Data Box Edge datasheet.
  • Learn more about our family of Azure Data Box products.

New updates to Azure AI expand AI capabilities for developers


As companies increasingly look to transform their businesses with AI, we continue to add improvements to Azure AI to make it easy for developers and data scientists to deploy, manage, and secure AI functions directly into their applications with a focus on the following solution areas:

  1. Leveraging machine learning to build and train predictive models that improve business productivity with Azure Machine Learning.
  2. Applying an AI-powered search experience and indexing technologies that quickly find information and glean insights with Azure Search.
  3. Building applications that integrate pre-built and custom AI capabilities like vision, speech, language, search, and knowledge to deliver more engaging and personalized experiences with our Azure Cognitive Services and Azure Bot Service.

Today, we’re pleased to share several updates to Azure Cognitive Services that continue to make Azure the best place to build AI. We’re introducing a preview of the new Anomaly Detector Service which uses AI to identify problems so companies can minimize loss and customer impact. We are also announcing the general availability of Custom Vision to more accurately identify objects in images. 

From speech recognition, translation, and text-to-speech to image and object detection, Azure Cognitive Services makes it easy for developers to add intelligent capabilities to their applications in any scenario. To date, more than a million developers have discovered and tried Cognitive Services to accelerate breakthrough experiences in their applications.

Anomaly detection as an AI service

Anomaly Detector is a new Cognitive Service that lets you detect unusual patterns or rare events in your data that could translate to identifying problems like credit card fraud.

Today, over 200 teams across Azure and other core Microsoft products rely on Anomaly Detector to boost the reliability of their systems by detecting irregularities in real-time and accelerating troubleshooting. Through a single API, developers can easily embed anomaly detection capabilities into their applications to ensure high data accuracy, and automatically surface incidents as soon as they happen.

Common use case scenarios include identifying business incidents and text errors, monitoring IoT device traffic, detecting fraud, responding to changing markets, and more. For instance, content providers can use Anomaly Detector to automatically scan video performance data specific to a customer’s KPIs, helping to identify problems in an instant. Alternatively, video streaming platforms can apply Anomaly Detector across millions of video data sets to track metrics. A missed second in video performance can translate to significant revenue loss for content providers that monetize on their platform.
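The single-API shape is straightforward: you POST a time series plus a granularity and get back per-point anomaly flags. Below is a minimal, standard-library sketch of building such a batch-detection request. The endpoint path and field names follow the Anomaly Detector REST API at the time of this preview, but the regional endpoint, subscription key, and metric values are placeholders.

```python
import json
from datetime import datetime, timedelta

def build_detect_request(endpoint, key, values, start, granularity="daily"):
    """Build URL, headers, and JSON body for a batch anomaly-detection call."""
    url = f"{endpoint}/anomalydetector/v1.0/timeseries/entire/detect"
    headers = {
        "Ocp-Apim-Subscription-Key": key,  # placeholder credential
        "Content-Type": "application/json",
    }
    series = [
        {"timestamp": (start + timedelta(days=i)).isoformat() + "Z", "value": v}
        for i, v in enumerate(values)
    ]
    body = json.dumps({"series": series, "granularity": granularity})
    return url, headers, body

# Example: 14 days of a KPI with one obvious spike on day 11.
values = [12.0] * 10 + [95.0] + [12.0] * 3
url, headers, body = build_detect_request(
    "https://westus2.api.cognitive.microsoft.com", "<your-key>",
    values, datetime(2019, 3, 1))
# POST this url/headers/body (e.g. with urllib.request) to get per-point
# anomaly flags in the response.
```

The response marks each point as anomalous or not, which is what lets developers surface incidents as soon as they happen without building their own statistical models.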

Custom Vision: automated machine learning for images

With the general availability of Custom Vision, organizations can also transform their business operations by quickly and accurately identifying objects in images.

Powered by machine learning, Custom Vision makes it easy and fast for developers to build, deploy, and improve custom image classifiers to quickly recognize content in imagery. Developers can train their own classifier to recognize what matters most in their scenarios, or export these custom classifiers to run them offline and in real time on iOS (in CoreML), Android (in TensorFlow), and many other devices on the edge. The exported models are optimized for the constraints of a mobile device providing incredible throughput while still maintaining high accuracy.

Today, Custom Vision can be used for a variety of business scenarios. Minsur, the largest tin mine in the western hemisphere, located in Peru, applies Custom Vision to create a sustainable mining practice by ensuring that water used in the mineral extraction process is properly treated for reuse in agriculture and livestock, by detecting treatment foam levels. They used a combination of Cognitive Services Custom Vision and Azure video analytics to replace a highly manual process so that employees can focus on more strategic projects within the operation.

Screenshot of the Custom Vision platform

Screenshot of the Custom Vision platform, where you can train the model to detect unique objects in an image, such as your brand’s logo.

Starting today, Custom Vision delivers the following improvements:

  • High quality models – Custom Vision features advanced training with a new machine learning backend for improved performance, especially on challenging datasets and fine-grained classification. With advanced training, you can specify a compute time budget and Custom Vision will experimentally identify the best training and augmentation settings.
  • Iterate with ease – Custom Vision makes it simple for developers to integrate computer vision capabilities into applications with 3.0 REST APIs and SDKs. The end-to-end pipeline is designed to support the iterative improvement of models, so you can quickly train a model, prototype in real-world conditions, and use the resulting data to improve the model, which gets models to production quality faster.
  • Train in the cloud, run anywhere – The exported models are optimized for the constraints of a mobile device, providing incredible throughput while still maintaining high accuracy. Now, you can also export classifiers to support the ARM architecture for Raspberry Pi 3 and the Vision AI Dev Kit.
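The 3.0 prediction REST calls mentioned above are plain HTTP requests. A hedged Python sketch of assembling one is below; the endpoint, project ID, and iteration name are placeholders, and the URL path follows the published 3.0 route, so verify it against the release notes before relying on it:

```python
def build_prediction_request(endpoint, project_id, iteration_name, prediction_key):
    # Assemble the URL and headers for a Custom Vision 3.0 image
    # classification call; the caller POSTs raw image bytes to `url`.
    url = (
        f"{endpoint}/customvision/v3.0/Prediction/{project_id}"
        f"/classify/iterations/{iteration_name}/image"
    )
    headers = {
        "Prediction-Key": prediction_key,
        "Content-Type": "application/octet-stream",
    }
    return url, headers

url, headers = build_prediction_request(
    "https://westus2.api.cognitive.microsoft.com",  # placeholder endpoint
    "00000000-0000-0000-0000-000000000000",         # placeholder project ID
    "Iteration1",
    "<your-prediction-key>",                        # placeholder key
)
```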

For more information, visit the Custom Vision Service Release Notes.

Get started today

Today’s milestones illustrate our commitment to make the Azure AI platform suitable for every business scenario, with enterprise-grade tools that simplify application development, and industry leading security and compliance for protecting customers’ data.

To get started building vision and search intelligent apps, please visit the Cognitive Services site.

Windows 10 SDK Preview Build 18361 available now!


Today, we released a new Windows 10 Preview Build of the SDK to be used in conjunction with Windows 10 Insider Preview (Build 18361 or greater). The Preview SDK Build 18361 contains bug fixes and in-development changes to the API surface area.

The Preview SDK can be downloaded from the developer section on Windows Insider.

For feedback and updates to the known issues, please see the developer forum. For new developer feature requests, head over to our Windows Platform UserVoice.

Things to note:

  • This build works in conjunction with previously released SDKs and Visual Studio 2017. You can install this SDK and still also continue to submit your apps that target Windows 10 build 1809 or earlier to the Microsoft Store.
  • The Windows SDK will now formally only be supported by Visual Studio 2017 and greater. You can download Visual Studio 2017 here.
  • This build of the Windows SDK will install on Windows 10 Insider Preview builds and supported Windows operating systems.
  • To assist with script access to the SDK, the ISO can also be accessed through the following URL: https://go.microsoft.com/fwlink/?prd=11966&pver=1.0&plcid=0x409&clcid=0x409&ar=Flight&sar=Sdsurl&o1=18361 once the static URL is published.

Tools Updates

Message Compiler (mc.exe)

  • The “-mof” switch (to generate XP-compatible ETW helpers) is deprecated and will be removed in a future version of mc.exe. Removing this switch will cause the generated ETW helpers to expect Vista or later.
  • The “-A” switch (to generate .BIN files using ANSI encoding instead of Unicode) is deprecated and will be removed in a future version of mc.exe. Removing this switch will cause the generated .BIN files to use Unicode string encoding.
  • The behavior of the “-A” switch has changed. Prior to Windows 1607 Anniversary Update SDK, when using the -A switch, BIN files were encoded using the build system’s ANSI code page. In the Windows 1607 Anniversary Update SDK, mc.exe’s behavior was inadvertently changed to encode BIN files using the build system’s OEM code page. In the 19H1 SDK, mc.exe’s previous behavior has been restored and it now encodes BIN files using the build system’s ANSI code page. Note that the -A switch is deprecated, as ANSI-encoded BIN files do not provide a consistent user experience in multi-lingual systems.
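The ANSI-versus-OEM distinction above is easy to see in code: the same character maps to different bytes under a typical Windows ANSI code page and an OEM code page, which is why the accidental code-page swap changed the bytes mc.exe wrote into .BIN files. A small Python illustration (cp1252 and cp437 are chosen as common examples; your build system's code pages may differ):

```python
# "é" encodes to different bytes under an ANSI code page (cp1252)
# and an OEM code page (cp437), so a tool that silently swaps code
# pages emits different binary output for identical source text.
ansi_bytes = "é".encode("cp1252")      # ANSI (Windows-1252)
oem_bytes = "é".encode("cp437")        # OEM (original IBM PC)
unicode_bytes = "é".encode("utf-16-le")  # Windows "Unicode" is UTF-16LE
```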

Breaking Changes

IAppxPackageReader2 has been removed from appxpackaging.h

The interface IAppxPackageReader2 was removed from appxpackaging.h. Eliminate the use of IAppxPackageReader2, or use IAppxPackageReader instead.

Change to effect graph of the AcrylicBrush

In this Preview SDK, we’ll be adding a blend mode called Luminosity to the effect graph of the AcrylicBrush. This blend mode will ensure that shadows do not appear behind acrylic surfaces without a cutout. We will also expose a LuminosityBlendOpacity API for tweaking this blend, allowing for more AcrylicBrush customization.

By default, for those that have not specified any LuminosityBlendOpacity on their AcrylicBrushes, we have implemented some logic to ensure that the Acrylic will look as similar as it can to current 1809 acrylics. Please note that we will be updating our default brushes to account for this recipe change.

TraceLoggingProvider.h  / TraceLoggingWrite

Events generated by TraceLoggingProvider.h (e.g. via TraceLoggingWrite macros) will now always have Id and Version set to 0.

Previously, TraceLoggingProvider.h would assign IDs to events at link time. These IDs were unique within a DLL or EXE, but changed from build to build and from module to module.

API Updates, Additions and Removals

Additions:


namespace Windows.AI.MachineLearning {
  public sealed class LearningModelSession : IClosable {
    public LearningModelSession(LearningModel model, LearningModelDevice deviceToRunOn, LearningModelSessionOptions learningModelSessionOptions);
  }
  public sealed class LearningModelSessionOptions
  public sealed class TensorBoolean : IClosable, ILearningModelFeatureValue, IMemoryBuffer, ITensor {
    void Close();
    public static TensorBoolean CreateFromBuffer(long[] shape, IBuffer buffer);
    public static TensorBoolean CreateFromShapeArrayAndDataArray(long[] shape, bool[] data);
    IMemoryBufferReference CreateReference();
  }
  public sealed class TensorDouble : IClosable, ILearningModelFeatureValue, IMemoryBuffer, ITensor {
    void Close();
    public static TensorDouble CreateFromBuffer(long[] shape, IBuffer buffer);
    public static TensorDouble CreateFromShapeArrayAndDataArray(long[] shape, double[] data);
    IMemoryBufferReference CreateReference();
  }
  public sealed class TensorFloat : IClosable, ILearningModelFeatureValue, IMemoryBuffer, ITensor {
    void Close();
    public static TensorFloat CreateFromBuffer(long[] shape, IBuffer buffer);
    public static TensorFloat CreateFromShapeArrayAndDataArray(long[] shape, float[] data);
    IMemoryBufferReference CreateReference();
  }
  public sealed class TensorFloat16Bit : IClosable, ILearningModelFeatureValue, IMemoryBuffer, ITensor {
    void Close();
    public static TensorFloat16Bit CreateFromBuffer(long[] shape, IBuffer buffer);
    public static TensorFloat16Bit CreateFromShapeArrayAndDataArray(long[] shape, float[] data);
    IMemoryBufferReference CreateReference();
  }
  public sealed class TensorInt16Bit : IClosable, ILearningModelFeatureValue, IMemoryBuffer, ITensor {
    void Close();
    public static TensorInt16Bit CreateFromBuffer(long[] shape, IBuffer buffer);
    public static TensorInt16Bit CreateFromShapeArrayAndDataArray(long[] shape, short[] data);
    IMemoryBufferReference CreateReference();
  }
  public sealed class TensorInt32Bit : IClosable, ILearningModelFeatureValue, IMemoryBuffer, ITensor {
    void Close();
    public static TensorInt32Bit CreateFromBuffer(long[] shape, IBuffer buffer);
    public static TensorInt32Bit CreateFromShapeArrayAndDataArray(long[] shape, int[] data);
    IMemoryBufferReference CreateReference();
  }
  public sealed class TensorInt64Bit : IClosable, ILearningModelFeatureValue, IMemoryBuffer, ITensor {
    void Close();
    public static TensorInt64Bit CreateFromBuffer(long[] shape, IBuffer buffer);
    public static TensorInt64Bit CreateFromShapeArrayAndDataArray(long[] shape, long[] data);
    IMemoryBufferReference CreateReference();
  }
  public sealed class TensorInt8Bit : IClosable, ILearningModelFeatureValue, IMemoryBuffer, ITensor {
    void Close();
    public static TensorInt8Bit CreateFromBuffer(long[] shape, IBuffer buffer);
    public static TensorInt8Bit CreateFromShapeArrayAndDataArray(long[] shape, byte[] data);
    IMemoryBufferReference CreateReference();
  }
  public sealed class TensorString : IClosable, ILearningModelFeatureValue, IMemoryBuffer, ITensor {
    void Close();
    public static TensorString CreateFromShapeArrayAndDataArray(long[] shape, string[] data);
    IMemoryBufferReference CreateReference();
  }
  public sealed class TensorUInt16Bit : IClosable, ILearningModelFeatureValue, IMemoryBuffer, ITensor {
    void Close();
    public static TensorUInt16Bit CreateFromBuffer(long[] shape, IBuffer buffer);
    public static TensorUInt16Bit CreateFromShapeArrayAndDataArray(long[] shape, ushort[] data);
    IMemoryBufferReference CreateReference();
  }
  public sealed class TensorUInt32Bit : IClosable, ILearningModelFeatureValue, IMemoryBuffer, ITensor {
    void Close();
    public static TensorUInt32Bit CreateFromBuffer(long[] shape, IBuffer buffer);
    public static TensorUInt32Bit CreateFromShapeArrayAndDataArray(long[] shape, uint[] data);
    IMemoryBufferReference CreateReference();
  }
  public sealed class TensorUInt64Bit : IClosable, ILearningModelFeatureValue, IMemoryBuffer, ITensor {
    void Close();
    public static TensorUInt64Bit CreateFromBuffer(long[] shape, IBuffer buffer);
    public static TensorUInt64Bit CreateFromShapeArrayAndDataArray(long[] shape, ulong[] data);
    IMemoryBufferReference CreateReference();
  }
  public sealed class TensorUInt8Bit : IClosable, ILearningModelFeatureValue, IMemoryBuffer, ITensor {
    void Close();
    public static TensorUInt8Bit CreateFromBuffer(long[] shape, IBuffer buffer);
    public static TensorUInt8Bit CreateFromShapeArrayAndDataArray(long[] shape, byte[] data);
    IMemoryBufferReference CreateReference();
  }
}
namespace Windows.ApplicationModel {
  public sealed class Package {
    StorageFolder EffectiveLocation { get; }
    StorageFolder MutableLocation { get; }
  }
}
namespace Windows.ApplicationModel.AppService {
  public sealed class AppServiceConnection : IClosable {
    public static IAsyncOperation<StatelessAppServiceResponse> SendStatelessMessageAsync(AppServiceConnection connection, RemoteSystemConnectionRequest connectionRequest, ValueSet message);
  }
  public sealed class AppServiceTriggerDetails {
    string CallerRemoteConnectionToken { get; }
  }
  public sealed class StatelessAppServiceResponse
  public enum StatelessAppServiceResponseStatus
}
namespace Windows.ApplicationModel.Background {
  public sealed class ConversationalAgentTrigger : IBackgroundTrigger
}
namespace Windows.ApplicationModel.Calls {
  public sealed class PhoneLine {
    string TransportDeviceId { get; }
    void EnableTextReply(bool value);
  }
  public enum PhoneLineTransport {
    Bluetooth = 2,
  }
  public sealed class PhoneLineTransportDevice
}
namespace Windows.ApplicationModel.Calls.Background {
  public enum PhoneIncomingCallDismissedReason
  public sealed class PhoneIncomingCallDismissedTriggerDetails
  public enum PhoneTriggerType {
    IncomingCallDismissed = 6,
  }
}
namespace Windows.ApplicationModel.Calls.Provider {
  public static class PhoneCallOriginManager {
    public static bool IsSupported { get; }
  }
}
namespace Windows.ApplicationModel.ConversationalAgent {
  public sealed class ConversationalAgentSession : IClosable
  public sealed class ConversationalAgentSessionInterruptedEventArgs
  public enum ConversationalAgentSessionUpdateResponse
  public sealed class ConversationalAgentSignal
  public sealed class ConversationalAgentSignalDetectedEventArgs
  public enum ConversationalAgentState
  public sealed class ConversationalAgentSystemStateChangedEventArgs
  public enum ConversationalAgentSystemStateChangeType
}
namespace Windows.ApplicationModel.Preview.Holographic {
  public sealed class HolographicKeyboardPlacementOverridePreview
}
namespace Windows.ApplicationModel.Resources {
  public sealed class ResourceLoader {
    public static ResourceLoader GetForUIContext(UIContext context);
  }
}
namespace Windows.ApplicationModel.Resources.Core {
  public sealed class ResourceCandidate {
    ResourceCandidateKind Kind { get; }
  }
  public enum ResourceCandidateKind
  public sealed class ResourceContext {
    public static ResourceContext GetForUIContext(UIContext context);
  }
}
namespace Windows.ApplicationModel.UserActivities {
  public sealed class UserActivityChannel {
    public static UserActivityChannel GetForUser(User user);
  }
}
namespace Windows.Devices.Bluetooth.GenericAttributeProfile {
  public enum GattServiceProviderAdvertisementStatus {
    StartedWithoutAllAdvertisementData = 4,
  }
  public sealed class GattServiceProviderAdvertisingParameters {
    IBuffer ServiceData { get; set; }
  }
}
namespace Windows.Devices.Enumeration {
  public enum DevicePairingKinds : uint {
    ProvidePasswordCredential = (uint)16,
  }
  public sealed class DevicePairingRequestedEventArgs {
    void AcceptWithPasswordCredential(PasswordCredential passwordCredential);
  }
}
namespace Windows.Devices.Input {
  public sealed class PenDevice
}
namespace Windows.Devices.PointOfService {
  public sealed class JournalPrinterCapabilities : ICommonPosPrintStationCapabilities {
    bool IsReversePaperFeedByLineSupported { get; }
    bool IsReversePaperFeedByMapModeUnitSupported { get; }
    bool IsReverseVideoSupported { get; }
    bool IsStrikethroughSupported { get; }
    bool IsSubscriptSupported { get; }
    bool IsSuperscriptSupported { get; }
  }
  public sealed class JournalPrintJob : IPosPrinterJob {
    void FeedPaperByLine(int lineCount);
    void FeedPaperByMapModeUnit(int distance);
    void Print(string data, PosPrinterPrintOptions printOptions);
  }
  public sealed class PosPrinter : IClosable {
    IVectorView<uint> SupportedBarcodeSymbologies { get; }
    PosPrinterFontProperty GetFontProperty(string typeface);
  }
  public sealed class PosPrinterFontProperty
  public sealed class PosPrinterPrintOptions
  public sealed class ReceiptPrinterCapabilities : ICommonPosPrintStationCapabilities, ICommonReceiptSlipCapabilities {
    bool IsReversePaperFeedByLineSupported { get; }
    bool IsReversePaperFeedByMapModeUnitSupported { get; }
    bool IsReverseVideoSupported { get; }
    bool IsStrikethroughSupported { get; }
    bool IsSubscriptSupported { get; }
    bool IsSuperscriptSupported { get; }
  }
  public sealed class ReceiptPrintJob : IPosPrinterJob, IReceiptOrSlipJob {
    void FeedPaperByLine(int lineCount);
    void FeedPaperByMapModeUnit(int distance);
    void Print(string data, PosPrinterPrintOptions printOptions);
    void StampPaper();
  }
  public struct SizeUInt32
  public sealed class SlipPrinterCapabilities : ICommonPosPrintStationCapabilities, ICommonReceiptSlipCapabilities {
    bool IsReversePaperFeedByLineSupported { get; }
    bool IsReversePaperFeedByMapModeUnitSupported { get; }
    bool IsReverseVideoSupported { get; }
    bool IsStrikethroughSupported { get; }
    bool IsSubscriptSupported { get; }
    bool IsSuperscriptSupported { get; }
  }
  public sealed class SlipPrintJob : IPosPrinterJob, IReceiptOrSlipJob {
    void FeedPaperByLine(int lineCount);
    void FeedPaperByMapModeUnit(int distance);
    void Print(string data, PosPrinterPrintOptions printOptions);
  }
}
namespace Windows.Globalization {
  public sealed class CurrencyAmount
}
namespace Windows.Graphics.DirectX {
  public enum DirectXPrimitiveTopology
}
namespace Windows.Graphics.Holographic {
  public sealed class HolographicCamera {
    HolographicViewConfiguration ViewConfiguration { get; }
  }
  public sealed class HolographicDisplay {
    HolographicViewConfiguration TryGetViewConfiguration(HolographicViewConfigurationKind kind);
  }
  public sealed class HolographicViewConfiguration
  public enum HolographicViewConfigurationKind
}
namespace Windows.Management.Deployment {
  public enum AddPackageByAppInstallerOptions : uint {
    LimitToExistingPackages = (uint)512,
  }
  public enum DeploymentOptions : uint {
    RetainFilesOnFailure = (uint)2097152,
  }
}
namespace Windows.Media.Devices {
  public sealed class InfraredTorchControl
  public enum InfraredTorchMode
  public sealed class VideoDeviceController : IMediaDeviceController {
    InfraredTorchControl InfraredTorchControl { get; }
  }
}
namespace Windows.Media.Miracast {
  public sealed class MiracastReceiver
  public sealed class MiracastReceiverApplySettingsResult
  public enum MiracastReceiverApplySettingsStatus
  public enum MiracastReceiverAuthorizationMethod
  public sealed class MiracastReceiverConnection : IClosable
  public sealed class MiracastReceiverConnectionCreatedEventArgs
  public sealed class MiracastReceiverCursorImageChannel
  public sealed class MiracastReceiverCursorImageChannelSettings
  public sealed class MiracastReceiverDisconnectedEventArgs
  public enum MiracastReceiverDisconnectReason
  public sealed class MiracastReceiverGameControllerDevice
  public enum MiracastReceiverGameControllerDeviceUsageMode
  public sealed class MiracastReceiverInputDevices
  public sealed class MiracastReceiverKeyboardDevice
  public enum MiracastReceiverListeningStatus
  public sealed class MiracastReceiverMediaSourceCreatedEventArgs
  public sealed class MiracastReceiverSession : IClosable
  public sealed class MiracastReceiverSessionStartResult
  public enum MiracastReceiverSessionStartStatus
  public sealed class MiracastReceiverSettings
  public sealed class MiracastReceiverStatus
  public sealed class MiracastReceiverStreamControl
  public sealed class MiracastReceiverVideoStreamSettings
  public enum MiracastReceiverWiFiStatus
  public sealed class MiracastTransmitter
  public enum MiracastTransmitterAuthorizationStatus
}
namespace Windows.Networking.Connectivity {
  public enum NetworkAuthenticationType {
    Wpa3 = 10,
    Wpa3Sae = 11,
  }
}
namespace Windows.Networking.NetworkOperators {
  public sealed class ESim {
    ESimDiscoverResult Discover();
    ESimDiscoverResult Discover(string serverAddress, string matchingId);
    IAsyncOperation<ESimDiscoverResult> DiscoverAsync();
    IAsyncOperation<ESimDiscoverResult> DiscoverAsync(string serverAddress, string matchingId);
  }
  public sealed class ESimDiscoverEvent
  public sealed class ESimDiscoverResult
  public enum ESimDiscoverResultKind
}
namespace Windows.Perception.People {
  public sealed class EyesPose
  public enum HandJointKind
  public sealed class HandMeshObserver
  public struct HandMeshVertex
  public sealed class HandMeshVertexState
  public sealed class HandPose
  public struct JointPose
  public enum JointPoseAccuracy
}
namespace Windows.Perception.Spatial {
  public struct SpatialRay
}
namespace Windows.Perception.Spatial.Preview {
  public sealed class SpatialGraphInteropFrameOfReferencePreview
  public static class SpatialGraphInteropPreview {
    public static SpatialGraphInteropFrameOfReferencePreview TryCreateFrameOfReference(SpatialCoordinateSystem coordinateSystem);
    public static SpatialGraphInteropFrameOfReferencePreview TryCreateFrameOfReference(SpatialCoordinateSystem coordinateSystem, Vector3 relativePosition);
    public static SpatialGraphInteropFrameOfReferencePreview TryCreateFrameOfReference(SpatialCoordinateSystem coordinateSystem, Vector3 relativePosition, Quaternion relativeOrientation);
  }
}
namespace Windows.Security.Authorization.AppCapabilityAccess {
  public sealed class AppCapability
  public sealed class AppCapabilityAccessChangedEventArgs
  public enum AppCapabilityAccessStatus
}
namespace Windows.Security.DataProtection {
  public enum UserDataAvailability
  public sealed class UserDataAvailabilityStateChangedEventArgs
  public sealed class UserDataBufferUnprotectResult
  public enum UserDataBufferUnprotectStatus
  public sealed class UserDataProtectionManager
  public sealed class UserDataStorageItemProtectionInfo
  public enum UserDataStorageItemProtectionStatus
}
namespace Windows.Storage.AccessCache {
  public static class StorageApplicationPermissions {
    public static StorageItemAccessList GetFutureAccessListForUser(User user);
    public static StorageItemMostRecentlyUsedList GetMostRecentlyUsedListForUser(User user);
  }
}
namespace Windows.Storage.Pickers {
  public sealed class FileOpenPicker {
    User User { get; }
    public static FileOpenPicker CreateForUser(User user);
  }
  public sealed class FileSavePicker {
    User User { get; }
    public static FileSavePicker CreateForUser(User user);
  }
  public sealed class FolderPicker {
    User User { get; }
    public static FolderPicker CreateForUser(User user);
  }
}
namespace Windows.System {
  public sealed class DispatcherQueue {
    bool HasThreadAccess { get; }
  }
  public enum ProcessorArchitecture {
    Arm64 = 12,
    X86OnArm64 = 14,
  }
}
namespace Windows.System.Profile {
  public static class AppApplicability
  public sealed class UnsupportedAppRequirement
  public enum UnsupportedAppRequirementReasons : uint
}
namespace Windows.System.RemoteSystems {
  public sealed class RemoteSystem {
    User User { get; }
    public static RemoteSystemWatcher CreateWatcherForUser(User user);
    public static RemoteSystemWatcher CreateWatcherForUser(User user, IIterable<IRemoteSystemFilter> filters);
  }
  public sealed class RemoteSystemApp {
    string ConnectionToken { get; }
    User User { get; }
  }
  public sealed class RemoteSystemConnectionRequest {
    string ConnectionToken { get; }
    public static RemoteSystemConnectionRequest CreateFromConnectionToken(string connectionToken);
    public static RemoteSystemConnectionRequest CreateFromConnectionTokenForUser(User user, string connectionToken);
  }
  public sealed class RemoteSystemWatcher {
    User User { get; }
  }
}
namespace Windows.UI {
  public sealed class UIContentRoot
  public sealed class UIContext
}
namespace Windows.UI.Composition {
  public enum CompositionBitmapInterpolationMode {
    MagLinearMinLinearMipLinear = 2,
    MagLinearMinLinearMipNearest = 3,
    MagLinearMinNearestMipLinear = 4,
    MagLinearMinNearestMipNearest = 5,
    MagNearestMinLinearMipLinear = 6,
    MagNearestMinLinearMipNearest = 7,
    MagNearestMinNearestMipLinear = 8,
    MagNearestMinNearestMipNearest = 9,
  }
  public sealed class CompositionGraphicsDevice : CompositionObject {
    CompositionMipmapSurface CreateMipmapSurface(SizeInt32 sizePixels, DirectXPixelFormat pixelFormat, DirectXAlphaMode alphaMode);
    void Trim();
  }
  public sealed class CompositionMipmapSurface : CompositionObject, ICompositionSurface
  public sealed class CompositionProjectedShadow : CompositionObject
  public sealed class CompositionProjectedShadowCaster : CompositionObject
  public sealed class CompositionProjectedShadowCasterCollection : CompositionObject, IIterable<CompositionProjectedShadowCaster>
  public sealed class CompositionProjectedShadowReceiver : CompositionObject
  public sealed class CompositionProjectedShadowReceiverUnorderedCollection : CompositionObject, IIterable<CompositionProjectedShadowReceiver>
  public sealed class CompositionRadialGradientBrush : CompositionGradientBrush
  public sealed class CompositionSurfaceBrush : CompositionBrush {
    bool SnapToPixels { get; set; }
  }
  public class CompositionTransform : CompositionObject
  public sealed class CompositionVisualSurface : CompositionObject, ICompositionSurface
  public sealed class Compositor : IClosable {
    CompositionProjectedShadow CreateProjectedShadow();
    CompositionProjectedShadowCaster CreateProjectedShadowCaster();
    CompositionProjectedShadowReceiver CreateProjectedShadowReceiver();
    CompositionRadialGradientBrush CreateRadialGradientBrush();
    CompositionVisualSurface CreateVisualSurface();
  }
  public interface IVisualElement
}
namespace Windows.UI.Composition.Interactions {
  public enum InteractionBindingAxisModes : uint
  public sealed class InteractionTracker : CompositionObject {
    public static InteractionBindingAxisModes GetBindingMode(InteractionTracker boundTracker1, InteractionTracker boundTracker2);
    public static void SetBindingMode(InteractionTracker boundTracker1, InteractionTracker boundTracker2, InteractionBindingAxisModes axisMode);
  }
  public sealed class InteractionTrackerCustomAnimationStateEnteredArgs {
    bool IsFromBinding { get; }
  }
  public sealed class InteractionTrackerIdleStateEnteredArgs {
    bool IsFromBinding { get; }
  }
  public sealed class InteractionTrackerInertiaStateEnteredArgs {
    bool IsFromBinding { get; }
  }
  public sealed class InteractionTrackerInteractingStateEnteredArgs {
    bool IsFromBinding { get; }
  }
  public class VisualInteractionSource : CompositionObject, ICompositionInteractionSource {
    public static VisualInteractionSource CreateFromIVisualElement(IVisualElement source);
  }
}
namespace Windows.UI.Composition.Scenes {
  public enum SceneAlphaMode
  public enum SceneAttributeSemantic
  public sealed class SceneBoundingBox : SceneObject
  public class SceneComponent : SceneObject
  public sealed class SceneComponentCollection : SceneObject, IIterable<SceneComponent>, IVector<SceneComponent>
  public enum SceneComponentType
  public class SceneMaterial : SceneObject
  public class SceneMaterialInput : SceneObject
  public sealed class SceneMesh : SceneObject
  public sealed class SceneMeshMaterialAttributeMap : SceneObject, IIterable<IKeyValuePair<string, SceneAttributeSemantic>>, IMap<string, SceneAttributeSemantic>
  public sealed class SceneMeshRendererComponent : SceneRendererComponent
  public sealed class SceneMetallicRoughnessMaterial : ScenePbrMaterial
  public sealed class SceneModelTransform : CompositionTransform
  public sealed class SceneNode : SceneObject
  public sealed class SceneNodeCollection : SceneObject, IIterable<SceneNode>, IVector<SceneNode>
  public class SceneObject : CompositionObject
  public class ScenePbrMaterial : SceneMaterial
  public class SceneRendererComponent : SceneComponent
  public sealed class SceneSurfaceMaterialInput : SceneMaterialInput
  public sealed class SceneVisual : ContainerVisual
  public enum SceneWrappingMode
}
namespace Windows.UI.Core {
  public sealed class CoreWindow : ICorePointerRedirector, ICoreWindow {
    UIContext UIContext { get; }
  }
}
namespace Windows.UI.Core.Preview {
  public sealed class CoreAppWindowPreview
}
namespace Windows.UI.Input {
  public class AttachableInputObject : IClosable
  public enum GazeInputAccessStatus
  public sealed class InputActivationListener : AttachableInputObject
  public sealed class InputActivationListenerActivationChangedEventArgs
  public enum InputActivationState
}
namespace Windows.UI.Input.Preview {
  public static class InputActivationListenerPreview
}
namespace Windows.UI.Input.Spatial {
  public sealed class SpatialInteractionManager {
    public static bool IsSourceKindSupported(SpatialInteractionSourceKind kind);
  }
  public sealed class SpatialInteractionSource {
    HandMeshObserver TryCreateHandMeshObserver();
    IAsyncOperation<HandMeshObserver> TryCreateHandMeshObserverAsync();
  }
  public sealed class SpatialInteractionSourceState {
    HandPose TryGetHandPose();
  }
  public sealed class SpatialPointerPose {
    EyesPose Eyes { get; }
    bool IsHeadCapturedBySystem { get; }
  }
}
namespace Windows.UI.Notifications {
  public sealed class ToastActivatedEventArgs {
    ValueSet UserInput { get; }
  }
  public sealed class ToastNotification {
    bool ExpiresOnReboot { get; set; }
  }
}
namespace Windows.UI.ViewManagement {
  public sealed class ApplicationView {
    string PersistedStateId { get; set; }
    UIContext UIContext { get; }
    WindowingEnvironment WindowingEnvironment { get; }
    public static void ClearAllPersistedState();
    public static void ClearPersistedState(string key);
    IVectorView<DisplayRegion> GetDisplayRegions();
  }
  public sealed class InputPane {
    public static InputPane GetForUIContext(UIContext context);
  }
  public sealed class UISettings {
    bool AutoHideScrollBars { get; }
    event TypedEventHandler<UISettings, UISettingsAutoHideScrollBarsChangedEventArgs> AutoHideScrollBarsChanged;
  }
  public sealed class UISettingsAutoHideScrollBarsChangedEventArgs
}
namespace Windows.UI.ViewManagement.Core {
  public sealed class CoreInputView {
    public static CoreInputView GetForUIContext(UIContext context);
  }
}
namespace Windows.UI.WindowManagement {
  public sealed class AppWindow
  public sealed class AppWindowChangedEventArgs
  public sealed class AppWindowClosedEventArgs
  public enum AppWindowClosedReason
  public sealed class AppWindowCloseRequestedEventArgs
  public sealed class AppWindowFrame
  public enum AppWindowFrameStyle
  public sealed class AppWindowPlacement
  public class AppWindowPresentationConfiguration
  public enum AppWindowPresentationKind
  public sealed class AppWindowPresenter
  public sealed class AppWindowTitleBar
  public sealed class AppWindowTitleBarOcclusion
  public enum AppWindowTitleBarVisibility
  public sealed class CompactOverlayPresentationConfiguration : AppWindowPresentationConfiguration
  public sealed class DefaultPresentationConfiguration : AppWindowPresentationConfiguration
  public sealed class DisplayRegion
  public sealed class FullScreenPresentationConfiguration : AppWindowPresentationConfiguration
  public sealed class WindowingEnvironment
  public sealed class WindowingEnvironmentAddedEventArgs
  public sealed class WindowingEnvironmentChangedEventArgs
  public enum WindowingEnvironmentKind
  public sealed class WindowingEnvironmentRemovedEventArgs
}
namespace Windows.UI.WindowManagement.Preview {
  public sealed class WindowManagementPreview
}
namespace Windows.UI.Xaml {
  public class UIElement : DependencyObject, IAnimationObject, IVisualElement {
    Vector3 ActualOffset { get; }
    Vector2 ActualSize { get; }
    Shadow Shadow { get; set; }
    public static DependencyProperty ShadowProperty { get; }
    UIContext UIContext { get; }
    XamlRoot XamlRoot { get; set; }
  }
  public class UIElementWeakCollection : IIterable<UIElement>, IVector<UIElement>
  public sealed class Window {
    UIContext UIContext { get; }
  }
  public sealed class XamlRoot
  public sealed class XamlRootChangedEventArgs
}
namespace Windows.UI.Xaml.Controls {
  public sealed class DatePickerFlyoutPresenter : Control {
    bool IsDefaultShadowEnabled { get; set; }
    public static DependencyProperty IsDefaultShadowEnabledProperty { get; }
  }
  public class FlyoutPresenter : ContentControl {
    bool IsDefaultShadowEnabled { get; set; }
    public static DependencyProperty IsDefaultShadowEnabledProperty { get; }
  }
  public class InkToolbar : Control {
    InkPresenter TargetInkPresenter { get; set; }
    public static DependencyProperty TargetInkPresenterProperty { get; }
  }
  public class MenuFlyoutPresenter : ItemsControl {
    bool IsDefaultShadowEnabled { get; set; }
    public static DependencyProperty IsDefaultShadowEnabledProperty { get; }
  }
  public sealed class TimePickerFlyoutPresenter : Control {
    bool IsDefaultShadowEnabled { get; set; }
    public static DependencyProperty IsDefaultShadowEnabledProperty { get; }
  }
  public class TwoPaneView : Control
  public enum TwoPaneViewMode
  public enum TwoPaneViewPriority
  public enum TwoPaneViewTallModeConfiguration
  public enum TwoPaneViewWideModeConfiguration
}
namespace Windows.UI.Xaml.Controls.Maps {
  public sealed class MapControl : Control {
    bool CanTiltDown { get; }
    public static DependencyProperty CanTiltDownProperty { get; }
    bool CanTiltUp { get; }
    public static DependencyProperty CanTiltUpProperty { get; }
    bool CanZoomIn { get; }
    public static DependencyProperty CanZoomInProperty { get; }
    bool CanZoomOut { get; }
    public static DependencyProperty CanZoomOutProperty { get; }
  }
  public enum MapLoadingStatus {
    DownloadedMapsManagerUnavailable = 3,
  }
}
namespace Windows.UI.Xaml.Controls.Primitives {
  public sealed class AppBarTemplateSettings : DependencyObject {
    double NegativeCompactVerticalDelta { get; }
    double NegativeHiddenVerticalDelta { get; }
    double NegativeMinimalVerticalDelta { get; }
  }
  public sealed class CommandBarTemplateSettings : DependencyObject {
    double OverflowContentCompactYTranslation { get; }
    double OverflowContentHiddenYTranslation { get; }
    double OverflowContentMinimalYTranslation { get; }
  }
  public class FlyoutBase : DependencyObject {
    bool IsConstrainedToRootBounds { get; }
    bool ShouldConstrainToRootBounds { get; set; }
    public static DependencyProperty ShouldConstrainToRootBoundsProperty { get; }
    XamlRoot XamlRoot { get; set; }
  }
  public sealed class Popup : FrameworkElement {
    bool IsConstrainedToRootBounds { get; }
    bool ShouldConstrainToRootBounds { get; set; }
    public static DependencyProperty ShouldConstrainToRootBoundsProperty { get; }
  }
}
namespace Windows.UI.Xaml.Core.Direct {
  public enum XamlPropertyIndex {
    AppBarTemplateSettings_NegativeCompactVerticalDelta = 2367,
    AppBarTemplateSettings_NegativeHiddenVerticalDelta = 2368,
    AppBarTemplateSettings_NegativeMinimalVerticalDelta = 2369,
    CommandBarTemplateSettings_OverflowContentCompactYTranslation = 2384,
    CommandBarTemplateSettings_OverflowContentHiddenYTranslation = 2385,
    CommandBarTemplateSettings_OverflowContentMinimalYTranslation = 2386,
    FlyoutBase_ShouldConstrainToRootBounds = 2378,
    FlyoutPresenter_IsDefaultShadowEnabled = 2380,
    MenuFlyoutPresenter_IsDefaultShadowEnabled = 2381,
    Popup_ShouldConstrainToRootBounds = 2379,
    ThemeShadow_Receivers = 2279,
    UIElement_ActualOffset = 2382,
    UIElement_ActualSize = 2383,
    UIElement_Shadow = 2130,
  }
  public enum XamlTypeIndex {
    ThemeShadow = 964,
  }
}
namespace Windows.UI.Xaml.Documents {
  public class TextElement : DependencyObject {
    XamlRoot XamlRoot { get; set; }
  }
}
namespace Windows.UI.Xaml.Hosting {
  public sealed class ElementCompositionPreview {
    public static UIElement GetAppWindowContent(AppWindow appWindow);
    public static void SetAppWindowContent(AppWindow appWindow, UIElement xamlContent);
  }
}
namespace Windows.UI.Xaml.Input {
  public sealed class FocusManager {
    public static object GetFocusedElement(XamlRoot xamlRoot);
  }
  public class StandardUICommand : XamlUICommand {
    StandardUICommandKind Kind { get; set; }
  }
}
namespace Windows.UI.Xaml.Media {
  public class AcrylicBrush : XamlCompositionBrushBase {
    IReference<double> TintLuminosityOpacity { get; set; }
    public static DependencyProperty TintLuminosityOpacityProperty { get; }
  }
  public class Shadow : DependencyObject
  public class ThemeShadow : Shadow
  public sealed class VisualTreeHelper {
    public static IVectorView<Popup> GetOpenPopupsForXamlRoot(XamlRoot xamlRoot);
  }
}
namespace Windows.UI.Xaml.Media.Animation {
  public class GravityConnectedAnimationConfiguration : ConnectedAnimationConfiguration {
    bool IsShadowEnabled { get; set; }
  }
}
namespace Windows.Web.Http {
  public sealed class HttpClient : IClosable, IStringable {
    IAsyncOperationWithProgress<HttpRequestResult, HttpProgress> TryDeleteAsync(Uri uri);
    IAsyncOperationWithProgress<HttpRequestResult, HttpProgress> TryGetAsync(Uri uri);
    IAsyncOperationWithProgress<HttpRequestResult, HttpProgress> TryGetAsync(Uri uri, HttpCompletionOption completionOption);
    IAsyncOperationWithProgress<HttpGetBufferResult, HttpProgress> TryGetBufferAsync(Uri uri);
    IAsyncOperationWithProgress<HttpGetInputStreamResult, HttpProgress> TryGetInputStreamAsync(Uri uri);
    IAsyncOperationWithProgress<HttpGetStringResult, HttpProgress> TryGetStringAsync(Uri uri);
    IAsyncOperationWithProgress<HttpRequestResult, HttpProgress> TryPostAsync(Uri uri, IHttpContent content);
    IAsyncOperationWithProgress<HttpRequestResult, HttpProgress> TryPutAsync(Uri uri, IHttpContent content);
    IAsyncOperationWithProgress<HttpRequestResult, HttpProgress> TrySendRequestAsync(HttpRequestMessage request);
    IAsyncOperationWithProgress<HttpRequestResult, HttpProgress> TrySendRequestAsync(HttpRequestMessage request, HttpCompletionOption completionOption);
  }
  public sealed class HttpGetBufferResult : IClosable, IStringable
  public sealed class HttpGetInputStreamResult : IClosable, IStringable
  public sealed class HttpGetStringResult : IClosable, IStringable
  public sealed class HttpRequestResult : IClosable, IStringable
}
namespace Windows.Web.Http.Filters {
  public sealed class HttpBaseProtocolFilter : IClosable, IHttpFilter {
    User User { get; }
    public static HttpBaseProtocolFilter CreateForUser(User user);
  }
}

The post Windows 10 SDK Preview Build 18361 available now! appeared first on Windows Developer Blog.

Python in Visual Studio Code – March 2019 Release


We are pleased to announce that the March 2019 release of the Python Extension for Visual Studio Code is now available. You can download the Python extension from the Marketplace, or install it directly from the extension gallery in Visual Studio Code. You can learn more about Python support in Visual Studio Code in the documentation.  

In this release we made a series of improvements that are listed in our changelog, closing a total of 52 issues, including: 

  • Live Share support in the Python Interactive Window 
  • Support installing packages with Poetry 
  • Improvements to the Python Language Server 
  • Improvements to the Test Explorer 

 Keep on reading to learn more! 

Live Share for Python Interactive  

Real-time collaboration is made easy with Visual Studio Live Share – it provides you with the ability to co-edit and co-debug while sharing audio, servers, terminals, diffs, comments, and more.  

In this update, the Python Interactive window has been enhanced to participate in Live Share collaboration sessions, making it possible to collaboratively explore and visualize data. Whether you are conducting a code review, pair programming with a teammate, participating in a hack-a-thon, or even teaching an interactive lecture, Live Share can support you in the many ways you collaborate. 

Support installing packages with Poetry  

This release also adds support for using Poetry, a dependency manager that keeps a project's development dependencies separate from its production ones, in Visual Studio Code with the Python extension. Poetry support was a highly requested feature on our GitHub repository.

To try out this new feature, first make sure you have Poetry installed and the corresponding lock file generated. You can refer to the documentation to learn how to get started with Poetry. Then add the path to Poetry in your settings (through File > Preferences > Settings and searching for Poetry, or adding "python.poetryPath": "path/to/poetry" to your settings.json file). 

Now when you install a new package, the extension will use the provided Poetry path to install it:

Improvements to the Python Language Server

This release includes significant enhancements to the Python Language Server, which was largely rewritten and brings improvements in performance, memory usage, and information display; support for relative imports and implicit packages; and an understanding of typing, generics, and PEP type hints and annotations. It also now offers auto-completion for f-strings and type information when you hover over sub-expressions:

As a reminder, the Language Server was released as a preview in last July's release of the Python extension. To opt in to the Language Server, change the python.jediEnabled setting to false in File > Preferences > User Settings. Since large changes were made to code analysis, there's a list of known issues introduced that we are currently fixing. If you run into other problems, please file an issue on the Python Language Server GitHub page. We are working towards making the Language Server the default in a future release.

Improvements to the Test Explorer

In the February release of the Python extension we added a built-in Test Explorer, which can be accessed through the Test beaker icon on the Activity Bar when tests are discovered in the workspace.

In this release we made improvements to the Test Explorer, including support for multi-root workspaces, parametrized tests, and new status icons. The status icons allow you to quickly see which test files or suites have failed without needing to expand the tree.

As a reminder, you can try the Test Explorer out by running the command Python: Discover Unit Tests from the Command Palette (View > Command Palette). If the unit test feature is disabled or no test framework is configured in the settings.json file, you’ll be prompted to select a framework and configure it. Once tests are discovered, the Test Explorer icon will appear on the Activity Bar.

Other Changes and Enhancements

We have also added small enhancements and fixed issues requested by users that should improve your experience working with Python in Visual Studio Code. Some notable changes include:

  • Fixed stopOnEntry not stopping on user code (#1159)
  • Support multiline comments for markdown cells (#4215)
  • Update icons and tooltip in test explorer indicating status of test files/suites (#4583)
  • Added commands translation for polish locale. (thanks pypros) (#4435)

Be sure to download the Python extension for Visual Studio Code now to try out the above improvements. If you run into any problems, please file an issue on the Python VS Code GitHub page.

The post Python in Visual Studio Code – March 2019 Release appeared first on Python.

What’s new in Azure IoT Central – March 2019


In IoT Central, our aim is to simplify IoT. We want to make sure your IoT data drives meaningful actions and visualizations. In this post, I will share new features now available in Azure IoT Central including embedded Microsoft Flow, updates to the Azure IoT Central connector, Azure Monitor action groups, multiple dashboards, and localization support. We also recently expanded Jobs functionality in IoT Central, so you can check out the announcement blog post to learn more.

Microsoft Flow is now embedded in IoT Central

You can now build workflows using your favorite connectors directly within IoT Central. For example, you can build a temperature alert rule that triggers a workflow to send push notifications and SMS all in one place within IoT Central. You can also test and share the workflow, see the run history, and manage all workflows attached to that rule.

Try it out in your IoT Central app by visiting Device Templates in Rules, adding a new action, and picking the Microsoft Flow tile.

Embedded Microsoft Flow experience in IoT Central.

Updated Azure IoT Central connector: Send a command and get device actions

With the updated Azure IoT Central connector, you can now build workflows in Microsoft Flow and Azure Logic Apps that send commands to an IoT device and get device information such as its name, properties, and settings values. For example, you can now build a workflow that tells an IoT device to reboot from a mobile app, and that displays the device's temperature setting and location property in that app.

Try it out in Microsoft Flow or Azure Logic Apps by using the Send a command and Get device actions in your workflow.

Get a device and Run a command actions in Microsoft Flow.

Integration with Azure Monitor action groups

Azure Monitor action groups are reusable groups of actions that can be attached to multiple rules at once. Instead of creating separate actions for each rule and entering in the recipient’s email address, SMS number, and webhook URL for each, you can choose an action group that contains all three from a drop down and expect to receive notifications on all three channels. The same action group can be attached to multiple rules and are reusable across Azure Monitor alerts.

Try it out in your IoT Central app by visiting Device Templates in Rules, adding a new action, and then pick the Azure Monitor action groups tile.

Azure monitor action groups as an action for your rules.

Multiple dashboards

Users can now create multiple personal dashboards in their IoT Central app! You can now build customized dashboards to better organize your devices and data. The default application dashboard is still available for all users, but each user of the app can create personalized dashboards and switch between them.

Multiple dashboards: choosing between application dashboards and personal dashboards.

Localization support

As of today, IoT Central supports 17 languages! You can select your preferred language in the settings section in the top navigation, and this will apply when you use any app in IoT Central. Each user can have their own preferred language, and you can change it at any time.

Choosing a different language in IoT Central.

With these new features, you can more conveniently build workflows as actions and reuse groups of actions, organize your visualizations across multiple dashboards, and work with IoT Central with your favorite language. Stay tuned for more developments in IoT Central. Until next time!

Next steps

  • Have ideas or suggestions for new features? Post it on Uservoice.
  • To explore the full set of features and capabilities and start your free trial, visit the IoT Central website.
  • Check out our documentation including tutorials to connect your first device.
  • To give us feedback about your experience with Azure IoT Central, take this survey.
  • To learn more about the Azure IoT portfolio including the latest news, visit the Microsoft Azure IoT page.

High-Throughput with Azure Blob Storage


I am happy to announce that High-Throughput Block Blob (HTBB) is globally enabled in Azure Blob Storage. HTBB provides significantly improved and instantaneous write throughput when ingesting larger block blobs, up to the storage account limits for a single blob. We have also removed the guesswork in naming your objects, enabling you to focus on building the most scalable applications and not worry about the vagaries of cloud storage.

HTBB demo of 12.5GB/s single blob throughput at Microsoft Ignite

I demonstrated the significantly improved write performance at Microsoft Ignite 2018. The demo application orchestrated the upload of 50,000 32 MiB blocks (1,600,000 MiB in total) from RAM to a single blob using Put Block operations. When all blocks were uploaded, it sent the block list to create the blob using the Put Block List operation. The upload ran on four D64v3 worker virtual machines (VMs), each writing 25 percent of the blocks. The total upload took around 120 seconds, which is about 12.5 GB/s. Check out the demo in the video below to learn more.
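The Put Block / Put Block List flow described above can be sketched in Python. This is illustrative only: the actual upload calls are shown as comments using the `stage_block`/`commit_block_list` names from the current azure-storage-blob SDK, the helper names are my own, and the sizes are tiny stand-ins for the demo's 32 MiB blocks.

```python
import base64

def block_ids(count):
    # Block IDs must be Base64-encoded and the same length within a blob,
    # so zero-pad the index before encoding.
    return [base64.b64encode(f"{i:08d}".encode()).decode() for i in range(count)]

def chunks(data, block_size):
    # Split a payload into fixed-size pieces, one per Put Block call.
    for offset in range(0, len(data), block_size):
        yield data[offset:offset + block_size]

payload = b"x" * 2500            # stand-in data; the demo staged 32 MiB blocks from RAM
ids = block_ids(3)
for block_id, block in zip(ids, chunks(payload, 1024)):
    pass  # blob_client.stage_block(block_id=block_id, data=block)   # Put Block
# blob_client.commit_block_list(ids)                                 # Put Block List
```

Because uncommitted blocks can be staged in parallel from many workers, this is how the demo spread the work across four VMs before a single Put Block List committed the blob.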

GB+ throughput using a single virtual machine

To illustrate the possible performance using just a single VM, I created a D32v3 VM running Linux in West US 2. I stored the files to upload on a local RAM disk so that local storage performance would not affect the results. I created the files using the head command with input from /dev/urandom to fill them with random data. Finally, I used AzCopy v10 (v10.0.4) to upload the files to a standard storage account in the same region. I ran each test five times and averaged the upload times in the table below.

Data set Time to upload Throughput
1,000 x 10MB 10 seconds 1.0 GB/s
100 x 100MB 8 seconds 1.2 GB/s
10 x 1GB 8 seconds 1.2 GB/s
1 x 10GB 8 seconds 1.2 GB/s
1 x 100GB 58 seconds 1.7 GB/s
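The throughput column follows directly from total size over elapsed time (treating MB and GB as decimal units); a quick arithmetic check of the table:

```python
# (file_count, size_in_MB, seconds) rows from the table above
runs = [
    (1000, 10, 10),
    (100, 100, 8),
    (10, 1000, 8),
    (1, 10000, 8),
    (1, 100000, 58),
]

for count, size_mb, seconds in runs:
    gb_per_s = count * size_mb / seconds / 1000  # MB -> GB, decimal
    print(f"{count} x {size_mb} MB in {seconds}s -> {gb_per_s:.1f} GB/s")
```

This reproduces the 1.0, 1.2, 1.2, 1.2, and 1.7 GB/s figures above.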

HTBB everywhere

HTBB is active on all your existing storage accounts, and does not require opt-in. It also comes without any extra cost. HTBB doesn’t introduce any new APIs and is automatically active when using Put Block or Put Blob operations over a certain size. The following table lists the minimum required Put Blob or Put Block size to activate HTBB.

Storage Account type Minimum size for HTBB
StorageV2 (General purpose v2) >4MB
Storage (General purpose v1) >4MB
Blob Storage >4MB
BlockBlobStorage (Premium) >256KB
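As a sketch, the activation rule in the table can be written as a lookup. The account-type keys and sizes come from the table; the function name is my own, and I'm assuming the listed sizes are binary (MiB/KiB):

```python
# Size above which a Put Blob / Put Block activates HTBB, per account type.
HTBB_MIN_SIZE = {
    "StorageV2 (General purpose v2)": 4 * 1024 * 1024,   # >4MB
    "Storage (General purpose v1)":   4 * 1024 * 1024,   # >4MB
    "Blob Storage":                   4 * 1024 * 1024,   # >4MB
    "BlockBlobStorage (Premium)":     256 * 1024,        # >256KB
}

def htbb_active(account_type, put_size_bytes):
    """True when a single Put Blob/Put Block of this size engages HTBB."""
    return put_size_bytes > HTBB_MIN_SIZE[account_type]

print(htbb_active("StorageV2 (General purpose v2)", 8 * 1024 * 1024))  # True
print(htbb_active("BlockBlobStorage (Premium)", 512 * 1024))           # True
```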

Azure Tools and Services supporting HTBB

There is a broad set of tools and services that already support HTBB, including:

Conclusion

We’re excited about the throughput improvements and application simplifications High-Throughput Block Blob brings to Azure Blob Storage! It is now available in all Azure regions and automatically active on your existing storage accounts at no extra cost. We look forward to hearing your feedback. To learn more about Blob Storage, please visit our product page.

Azure Sphere ecosystem accelerates innovation


The Internet of Things (IoT) promises to help businesses cut costs and create new revenue streams, but it also brings an unsettling amount of risk. No one wants a fridge that gets shut down by ransomware, a toy that spies on children, or a production line that’s brought to a halt through an entry point in a single hacked sensor.

So how can device builders bring a high level of security to the billions of network-connected devices expected to be deployed in the next decade?

It starts with building security into your IoT solution from the silicon up. In this piece, I will discuss the holistic device security of Azure Sphere, as well as how the expansion of the Azure Sphere ecosystem is helping to accelerate the process of taking secure solutions to market. For additional partner-delivered insights around Azure Sphere, view the Azure Sphere Ecosystem Expansion Webinar.

Two women sitting together at a desk working on an Azure Sphere device

A new standard for security

Small, lightweight microcontrollers (or MCUs) are the most common class of computer, powering everything from appliances to industrial equipment. Organizations have learned that security for their MCU-powered devices is critical to their near-term sales and to the long-term success of their brands (one successful attack can drive customers away from the affected brand for years). Yet predicting which devices can endure attacks is difficult.

Through years of experience, Microsoft has learned that to be highly secured, a connected device must possess seven specific properties:

  1. Hardware-based root of trust: The device must have a unique, unforgeable identity that is inseparable from the hardware.
  2. Small trusted computing base: Most of the device's software should be outside a small trusted computing base, reducing the attack surface for security resources such as private keys.
  3. Defense in depth: Multiple layers of defense mean that even if one layer of security is breached, the device is still protected.
  4. Compartmentalization: Hardware-enforced barriers between software components prevent a breach in one from propagating to others.
  5. Certificate-based authentication: The device uses signed certificates to prove device identity and authenticity.
  6. Renewable security: Updated software is installed automatically and devices that enter risky states are always brought into a secure state.
  7. Failure reporting: All device failures, which could be evidence of attacks, are reported to the manufacturer.

These properties work together to keep devices protected and secured in today's dynamic threat landscape. Omitting even one of these seven properties can leave devices open to attack, creating situations where responding to security events is difficult and costly. The seven properties also act as a practical framework for evaluating IoT device security.
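Since the seven properties act as a practical evaluation framework, an assessment can be reduced to a checklist: any property a device omits is a finding. A purely illustrative sketch (the set and function names are mine):

```python
# The seven properties of highly secured devices, from the list above.
SEVEN_PROPERTIES = {
    "hardware-based root of trust",
    "small trusted computing base",
    "defense in depth",
    "compartmentalization",
    "certificate-based authentication",
    "renewable security",
    "failure reporting",
}

def missing_properties(device_properties):
    """Return the properties a device omits; an empty set passes the framework."""
    return SEVEN_PROPERTIES - device_properties

# A device missing even one property is open to attack:
gadget = SEVEN_PROPERTIES - {"renewable security"}
print(missing_properties(gadget))  # {'renewable security'}
```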

How Azure Sphere helps you build secure devices

Azure Sphere – Microsoft’s end-to-end solution for creating highly secure, connected devices – delivers these seven properties, making it easy and affordable for device manufacturers to create devices that are innately secure and prepared to meet evolving security threats. Azure Sphere introduces a new class of MCU that includes built-in Microsoft security technology, connectivity, and the headroom to support dynamic experiences at the intelligent edge.

Multiple levels of security are baked into the chip itself. The secured Azure Sphere OS runs on top of the hardware layer, only allowing authorized software to run. The Azure Sphere Security Service continually verifies the device's identity and authenticity and keeps its software up to date. Azure Sphere has been designed for security and affordability at scale, even for low-cost devices. 

Opportunities for ecosystem expansion

In today’s world, device manufacturing partners view security as a necessity for creating connected experiences. The end-to-end security of Azure Sphere creates a potential for significant innovation in IoT. With a turnkey solution that helps prevent, detect, and respond to threats, device manufacturers don’t need to invest in additional infrastructure or staff to secure these devices. Instead, they can focus their efforts on rethinking business models, product experiences, how they serve customers, and how they predict customer needs.

To accelerate innovation, we’re working to expand our partner ecosystem. Ecosystem expansion offers many advantages. It reduces the overall complexity of the final product and speeds time to market. It frees up device builders to expand technical capabilities to meet the needs of customers. Plus, it enables more responsive innovation of feature sets for module partners and customization of modules for a diverse ecosystem. Below we’ve highlighted some partners who are a key part of the Azure Sphere ecosystem.

Seeed Studio, a Microsoft partner that specializes in hardware prototyping, design, and manufacturing for IoT solutions, has been selling their MT3620 Development Board since April 2018. They also sell complementary hardware that enables rapid, solder-free prototyping using their Grove system of modular sensors, actuators, and displays. In September 2018, they released the Seeed Grove starter kit, which contains an expansion shield and a selection of sensors. Beyond prototyping hardware, they plan to launch more vertical solutions based on Azure Sphere for the IoT market. In March, Seeed also introduced the MT3620 Mini Dev Board, a lite version of their previous Azure Sphere MT3620 Development Kit, developed to meet the needs of developers who want smaller sizes, greater scalability, and lower costs.

AI-Link has released the first Azure Sphere module that is ready for mass production. AI-Link is the top IoT module developer and manufacturer in the market today and shipped more than 90 million units in 2018.

Avnet, an IoT solution aggregator and Azure Sphere chips distributor, unveiled their Azure Sphere module and starter kit in January 2019. Avnet will also be building a library of general and application specific Azure Sphere reference designs to accelerate customer adoption and time to market for Azure Sphere devices and solutions.

Universal Scientific Industrial (Shanghai) Co., Ltd. (USI) recently unveiled their Azure Sphere combo module, uniquely designed for IoT applications, with multi-function design-in support through the standard SDK. Customers can easily migrate from a discrete MCU solution to this module, giving their devices secured connectivity to the cloud and shortening the design cycle.

Learn more about the Azure Sphere ecosystem

To learn more, view the on-demand Azure Sphere Ecosystem Expansion webinar. You’ll hear from each of our partners as they discuss the Azure Sphere opportunity from their own perspective, as well as how you can take full advantage of Azure Sphere ecosystem expansion efforts.

For in-person opportunities to gain actionable insights, deepen partnerships, and unlock the transformative potential of intelligent edge and intelligent cloud IoT solutions, sign up for an in-person IoT in Action event coming to a city near you.

Azure Stack IaaS – part six


Pay for what you use

In the virtualization days, I used to pad all my requests for virtual machines (VMs) to get the largest size possible. Since decisions and requests took time, I would ask for more than I required just so I wouldn’t face delays if I needed more capacity. This resulted in a lot of waste, and a term I heard often: VM sprawl.

The behavior is different with Infrastructure-as-a-Service (IaaS) VMs in the cloud. A fundamental quality of a cloud is that it provides an elastic pool of resources to draw on when needed. Since you only pay for what you use, you don’t need to overprovision. Instead, you can optimize capacity based on demand. Let me show you some of the ways you can do this for your IaaS VMs running in Azure and Azure Stack.

Resize

It’s hard to know exactly how big your VM should be. There are so many dimensions to consider, such as CPU, memory, disks, and network. Instead of trying to predict what your VM needs for the next year or even month, why not take a guess, let it run, and then adjust the size once you have some historical data?

Azure and Azure Stack make it easy for you to resize your VM from the portal. Pick the new size and you’re done. No need to call the infrastructure team and beg for more capacity. No need to overspend on a huge VM that isn’t even used.

Choose a virtual machine size through the portal in Microsoft Azure Stack

Learn more:

Scale out

Another dimension of scale is to make multiple copies of identical VMs that work together as a unit. When you need more, create additional VMs; when you need less, remove some of them. Azure has a feature for this called Virtual Machine Scale Sets (VMSS), which is also available in Azure Stack. You can create a VMSS with a wizard: fill out the details of how each VM should be configured, including which extensions to use and which software to load onto your VMs. Azure takes care of wiring the network, placing the VMs behind a load balancer, creating the VMs, and running the in-guest configuration.

Create a virtual machine scale set in Microsoft Azure Stack

Once you have created the VMSS, you can scale it up or down. Azure automates everything for you. You control it like IaaS, but scale it like PaaS. It was never this easy in the virtualization days.

Scale a Virtual Machine Scale Set up or down

Learn more:

Add, remove, and resize disk

Just like virtual machines in the cloud, storage is pay per use. Both Azure and Azure Stack make it easy for you to manage the disks running on that storage so you only need to use what your application requires. Adding, removing, and resizing data disks is a self-service action so you can right-size your VM’s storage based on your current needs.

Add, remove, and resize disk

Learn more:

Usage based pricing

Just like Azure, Azure Stack prices are based on how much you use. Since you take on the hardware and operating costs, Azure Stack service fees are typically lower than Azure prices. Your Azure Stack usage will show up as line items on your Azure bill. If you run your Azure Stack in a network that is disconnected from the Internet, Azure Stack offers a yearly capacity model.

Pay-per-use really benefits Azure Stack customers. For example, one organization runs a machine learning model once a month, and the computation takes about one week. During this time, they use all the capacity of their Azure Stack, but for the other three weeks of the month they run light, temporary workloads on the system. A later blog will cover how automation and infrastructure-as-code let you quickly set this up and tear it down, so you use only what the app needs in the time window it’s needed. Right-sizing and pay-per-use save you a lot of money.

Learn more:

In this blog series

We hope you come back to read future posts in this blog series. Here are some of our past and upcoming topics:

Analysis of network connection data with Azure Monitor for virtual machines


Azure Monitor for virtual machines (VMs) collects network connection data that you can use to analyze the dependencies and network traffic of your VMs. You can analyze the number of live and failed connections, bytes sent and received, and the connection dependencies of your VMs down to the process level. If malicious connections are detected, the data includes information about those IP addresses and their threat level. The newly released VMBoundPort data set enables analysis of open ports and their connections for security analysis.

To begin analyzing this data, you will need to be on-boarded to Azure Monitor for VMs.

Workbooks

If you would like to start your analysis with a prebuilt, editable report, you can try out some of the Workbooks we ship with Azure Monitor for VMs. Once on-boarded, navigate to Azure Monitor and select Virtual Machines (preview) from the Insights menu section. From there, navigate to the Performance or Map tab to find a View Workbook link that opens the Workbook gallery, which includes the following Workbooks for analyzing our network data:

  • Connections overview
  • Failed connections
  • TCP traffic
  • Traffic comparison
  • Active ports
  • Open ports

These editable reports let you analyze your connection data for a single VM, groups of VMs, and virtual machine scale sets.

Log Analytics

If you want to use Log Analytics to analyze the data, you can navigate to Azure Monitor and select Logs to begin querying the data. The logs view will show the name of the workspace that has been selected and the schema within that workspace. Under the ServiceMap data type you will find two tables:

  • VMBoundPort
  • VMConnection

You can copy and paste the queries below into the Log Analytics query box to run them. Please note, you will need to edit a few of the examples below to provide the name of a computer that you want to query.

Screenshot of copying and pasting queries into the Log Analytics query box

Common queries

Review the count of ports open on your VMs, which is useful when assessing VM configuration and security vulnerabilities.

VMBoundPort
| where Ip != "127.0.0.1"
| summarize by Computer, Machine, Port, Protocol
| summarize OpenPorts=count() by Computer, Machine
| order by OpenPorts desc

List the bound ports on your VMs, which is also useful when assessing VM configuration and security vulnerabilities.

VMBoundPort
| distinct Computer, Port, ProcessName

Analyze network activity by port to determine how your application or service is configured.

VMBoundPort
| where Ip != "127.0.0.1"
| summarize BytesSent=sum(BytesSent), BytesReceived=sum(BytesReceived), LinksEstablished=sum(LinksEstablished), LinksTerminated=sum(LinksTerminated), arg_max(TimeGenerated, LinksLive) by Machine, Computer, ProcessName, Ip, Port, IsWildcardBind
| project-away TimeGenerated
| order by Machine, Computer, Port, Ip, ProcessName

Bytes sent and received trends for your VMs.

VMConnection
| summarize sum(BytesSent), sum(BytesReceived) by bin(TimeGenerated, 1h), Computer
| order by Computer desc
//| limit 5000
| render timechart

If you have a lot of computers in your workspace, you may want to uncomment the limit statement in the example above. You can use the chart tools to view either bytes sent or received, and to filter down to specific computers.

Screenshot of chart tools being used to view Bytes sent or received
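
As an alternative to uncommenting the limit, you can restrict the trend to a handful of machines by filtering on name before summarizing. A sketch (the computer names here are placeholders; substitute your own):

VMConnection
| where Computer in ("web-01", "web-02")
| summarize sum(BytesSent), sum(BytesReceived) by bin(TimeGenerated, 1h), Computer
| render timechart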

Connection failures over time, to determine if the failure rate is stable or changing.

VMConnection
| where Computer == <replace this with a computer name, e.g. "acme-demo">
| extend bythehour = datetime_part("hour", TimeGenerated)
| project bythehour, LinksFailed
| summarize failCount = sum(LinksFailed) by bythehour
| sort by bythehour asc
| render timechart

Link status trends, to analyze the behavior and connection status of a machine.

VMConnection
| where Computer == <replace this with a computer name, e.g. "acme-demo">
| summarize dcount(LinksEstablished), dcount(LinksLive), dcount(LinksFailed), dcount(LinksTerminated) by bin(TimeGenerated, 1h)
| render timechart

Screenshot of line chart showing query results from the last 24 hours
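
The malicious-connection data mentioned earlier can be explored in a similar way. A sketch, assuming the MaliciousIp and IndicatorThreatType columns documented for the VMConnection table (verify the column names against your workspace schema):

VMConnection
| where MaliciousIp != ""
| summarize ConnectionCount = count() by Computer, MaliciousIp, IndicatorThreatType
| order by ConnectionCount desc

This surfaces which machines are talking to flagged IP addresses and how often, grouped by threat type.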

Getting started with log queries in Azure Monitor for VMs

To learn more about Azure Monitor for VMs, please read our overview, “What is Azure Monitor for VMs (preview).” If you are already using Azure Monitor for VMs, you can find additional example queries in our documentation for querying data with Log Analytics.

Happy birthday to managed Open Source RDBMS services in Azure!


March 20, 2019 marked the first anniversary of general availability for our managed Open Source relational database management system (RDBMS) services, including Azure Database for PostgreSQL and Azure Database for MySQL. A great year of learning and improvement lies behind us, and we are looking forward to an exciting future!

Thank you to all our customers, who have trusted Azure to host their Open Source Software (OSS) applications with MySQL and PostgreSQL databases. We are very grateful for your support and for pushing us to build the best managed services in the cloud!

It’s amazing to see the variety of mission-critical applications that customers run on top of our services. From line-of-business applications and real-time event processing to Internet of Things applications, we see every possible pattern running across our different OSS RDBMS offerings. Check out some great success stories by reading our case studies! It’s humbling to see the trust our customers put in the platform. We love the challenges posed by this variety of use cases, and we are always eager to learn and provide even better support.

We wouldn’t have reached this point without ongoing feedback and feature requests from our customers. There have been asks for functionality such as read replicas, greater performance, extended regional coverage, additional RDBMS engines like MariaDB, and more. In response, over the year since our services became generally available, we have delivered features and functionality to address these asks. Just check out some of the announcements we have made over the past year:

We also want to enable customers to focus on developing their applications rather than on these concerns. To that end, we are constantly expanding our compliance certification portfolio to address a broader set of standards. This gives customers peace of mind, knowing that our services are increasingly safe and secure. We have also introduced features such as Threat Protection (MySQL, PostgreSQL) and Intelligent Performance (PostgreSQL) to the OSS RDBMS services, so there are two fewer things to worry about!

Open Source is all about the community and the ecosystem built around the products the community delivers. We want to bring this goodness to our platform and support it so that customers can enjoy these benefits when using our managed services. For example, we recently announced support for GraphQL with Hasura and for TimescaleDB! But we want to be more than a consumer; we want to make significant contributions to the community. Our first major contribution was the release of the open source Azure Data Studio with support for PostgreSQL.

While we are proud to highlight these developments, we also understand that we are still at the start of the journey. We have a lot of work to do and many challenges to overcome, but we are moving ahead at full steam. We are thrilled to have Citus Data joining the team, and you can expect a lot of focus on improved performance, greater scale, and more built-in intelligence. For more information about this acquisition, see the blog post, “Microsoft and Citus Data: Providing the best PostgreSQL service in the cloud.”

Next steps

In the interim, be sure to take advantage of the following helpful resources.

We look forward to continued feedback and feature requests from our customers. More than ever, we are committed to ensuring that our OSS RDBMS services are top-notch leaders in the cloud! Stay tuned, as we have a lot more in the pipeline!
