
Transforming your data in Azure SQL Database to columnstore format


We are excited to reveal a public preview of a new feature in Azure SQL Database, available in both logical servers and Managed Instance, called CLUSTERED COLUMNSTORE ONLINE INDEX build. This operation enables you to migrate data stored in row-store format to columnstore format and to maintain your columnstore data structures with minimal downtime for your workload.

Why columnstore format?

Azure SQL Database enables you to fine-tune and optimize the data structures and indexes in your database to get the best query performance for your workload and data size. Relational data in Azure SQL Database can be organized in two formats:

  • Row-store format, which is an ideal option for OLTP workloads where queries access individual rows or small sets of rows in a table. This is the general-purpose table format used for most of the data in relational databases.
  • Columnstore format, which is optimized for analytical queries and high compression of data (up to 100x). This format is ideal for large data sets that compress efficiently and for analytical queries with complex calculations that use a subset of the table columns.

In some cases, you might find that an existing table organized in row-store format is not well suited to the queries executed against it. In other cases, you might want to apply the high compression of columnstore tables to minimize the size of a table. Either way, you would need to transform your data into columnstore format to compress the data and boost the performance of your analytical queries.

Transforming data to columnstore format

You can transform an existing row-store table into columnstore format by creating a CLUSTERED COLUMNSTORE INDEX on the table. The clustered columnstore index takes the original data set from the table, organizes it by columns, and applies efficient high-compression algorithms to minimize the size of your data.

A Transact-SQL statement that creates a CLUSTERED COLUMNSTORE INDEX on a table and transforms the data to columnstore format is shown in the following example:

CREATE CLUSTERED COLUMNSTORE INDEX cci ON Sales.Orders

A clustered columnstore index created on a table reorganizes the data in the table, converting rows into highly compressed column segments.
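If you want to verify the result, you can inspect the columnstore metadata. The following is a minimal sketch, assuming the Sales.Orders table from the example above; it lists each row group with its state and compressed size:

-- Inspect columnstore row groups: state, row count, and compressed size
SELECT row_group_id, state_desc, total_rows, size_in_bytes
FROM sys.dm_db_column_store_row_group_physical_stats
WHERE object_id = OBJECT_ID('Sales.Orders')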

The limitation of this operation is that all incoming transactions that try to update rows in the table being transformed from row-store to columnstore format must be blocked until the transformation finishes. This is known as an offline index build, shown in the following picture:

Azure SQL clustered columnstore index offline build

All incoming transactions are blocked while the table is transformed from row-store to columnstore format. This process might cause some downtime for your workload, so you would need to choose an appropriate time to transform the data to columnstore format.

Online transition to columnstore format

The latest release of Azure SQL Database enables you to transform your row-store tables to columnstore format without blocking incoming transactions, using the online version of the columnstore index build (currently in public preview).

You can use the following T-SQL syntax to transform your row-store table into the columnstore format:

CREATE CLUSTERED COLUMNSTORE INDEX cci ON Sales.Orders WITH ( ONLINE = ON )

The create clustered columnstore index operation with the WITH (ONLINE = ON) option accepts all incoming transactions and continuously applies the data changes to the target columnstore data structure while the original data is transformed:

Online clustered columnstore index build in Azure SQL

Online clustered columnstore index build enables you to optimize and compress your data with minimal downtime and without major blocking of the queries that execute while you are transforming the data.

Besides the online transformation from row-store to columnstore format, the following online features for clustered columnstore indexes are available in Azure SQL Database:

  • Existing clustered columnstore indexes can be rebuilt in online mode, meaning that the workload using the table doesn’t need to be blocked while you perform this index maintenance operation (see the sketch after this list).
  • Non-clustered indexes on tables organized in columnstore format can be rebuilt using the online option, without blocking the workload.
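As a rough sketch of both operations, assuming the cci index from the earlier example and a hypothetical non-clustered index named ix_orders_date:

-- Rebuild an existing clustered columnstore index online
ALTER INDEX cci ON Sales.Orders REBUILD WITH (ONLINE = ON)

-- Rebuild a non-clustered index on the columnstore table online
ALTER INDEX ix_orders_date ON Sales.Orders REBUILD WITH (ONLINE = ON)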

The online clustered columnstore index build operation helps you perform data transformation and maintenance operations on clustered columnstore indexes with minimal downtime for the incoming workload. This feature is currently in preview in all flavors of Azure SQL Database, including logical servers, elastic pools, and Managed Instance.

For more information, please see the columnstore indexes documentation page.


Connect Azure Data Explorer to Power BI for visual depiction of data


Do you want to analyze vast amounts of data, create Power BI dashboards and reports to help you visualize your data, and share insights across your organization? Azure Data Explorer (ADX), a lightning-fast indexing and querying service, helps you build near-real-time and complex analytics solutions for vast amounts of data. ADX can connect to Power BI, a business analytics solution that lets you visualize your data and share the results across your organization. The various methods of connection to Power BI allow for interactive analysis of organizational data, such as tracking and presenting trends.

Simple and intuitive native connector

The native connector to Power BI unlocks the power of Azure Data Explorer in only a minute. In a very intuitive process, add your cluster name and let the connector take care of the rest. Provide the database and table names to focus your analysis on specific data. You can use import mode for snappy interaction with the data, or direct query mode for filtering large datasets and near-real-time updates. To use the native connector method, read our documentation, “Quickstart: Visualize data using the Azure Data Explorer connector for Power BI.”

Imported query

A specific Azure Data Explorer query can also be used to import data into Power BI. Copy a query from Kusto Explorer or the Kusto Web UI, paste it into the blank query connector window, and load the query’s data into Power BI. To use the imported query method, read our documentation, “Quickstart: Visualize data using a query imported into Power BI.”

General purpose SQL (MS-TDS) connector

If you prefer running SQL queries to analyze your data, use the Azure SQL Database connector to connect to Azure Data Explorer. You can use import mode or direct query mode to bring just the data needed for analysis. To use the SQL connector, read our documentation, “Quickstart: Visualize data using the Azure Data Explorer connector for Power BI.”
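As an illustration only, a query like the following could be issued through the SQL connector; the GitHubEvent table and Type column are hypothetical names standing in for whatever schema your ADX database exposes:

-- Hypothetical T-SQL query over the MS-TDS endpoint: count events by type
SELECT Type, COUNT(*) AS EventCount
FROM GitHubEvent
GROUP BY Type
ORDER BY EventCount DESC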

Example

The following example depicts how the native Power BI connector is used to query GitHub public data that was pulled into the Demo11 ADX cluster and stored in the GitHub database.

screenshot of Power BI being used to query GitHub public data

Once loaded to Power BI, you can build any report or dashboard to analyze and visually represent the GitHub event data.

Example of Power BI built report representing GitHub event data

Next steps

In this blog, we depicted the various ways to bring data from Azure Data Explorer into Power BI. Additional connectors and plugins for analytics tools and services will be added in the weeks to come. Stay tuned for more updates.

To find out more, visit the Azure Data Explorer documentation.

Azure Marketplace new offers – Volume 28


We continue to expand the Azure Marketplace ecosystem. From November 17 to November 30, 2018, 80 new offers successfully met the onboarding criteria and went live. See details of the new offers below:

Virtual machines

CloudflareAzure

CloudflareAzure: Cloudflare speeds up and protects millions of websites, APIs, Software-as-a-Service solutions, and more. Our offering will allow you to leverage the benefits provided by Cloudflare services without the need to reconfigure components of your Azure setup.

CrystalBall

CrystalBall: Machinesense's industrial IoT app for data monitoring and analytics offers onboarding and fleet management of sensors and machines, along with machine health data visualization, energy management, and a maintenance, operations, and repairs log.

HPE StoreOnce VSA 3.18.7

HPE StoreOnce VSA 3.18.7: The HPE StoreOnce virtual appliance enables you to reduce backup data storage costs with high-performance data deduplication.

InterSystems IRIS Data Platform

InterSystems IRIS Data Platform: IRIS by InterSystems is a complete data platform that gives developers the freedom to choose the language and data model best suited for rapidly developing their applications.

JAMS V7 (BYOL) - Server 2016

JAMS V7 (BYOL) - Server 2016: JAMS is an enterprise batch-scheduling and workload automation solution. Automate jobs on Windows, Linux, UNIX, z/OS, System I, and OpenVMS, with support for jobs running on databases, ERP or CRM solutions, BI tools, and more.

Neoway Ubuntu Image

Neoway Ubuntu Image: This Ubuntu image for a Neoway environment ensures security compliance provided by CIS controls, scripted processes, and/or custom apps. Define, manage, and monitor jobs through a graphical user interface, a REST or .NET API, or PowerShell cmdlets.

NFS 2016 Windows Storage File Server

NFS 2016 Windows Storage File Server: Network File System (NFS) 2016 Server provides a file-sharing solution for enterprises that have heterogeneous environments that include Windows and non-Windows computers.

Postgres Pro Standard Database 11

Postgres Pro Standard Database 11: Postgres Pro is a modern, PostgreSQL-based database with SQL and NoSQL features.

ProcessGold

ProcessGold: ProcessGold is a process analytics platform for large organizations. ProcessGold transforms traditional audits by creating end-to-end process maps across applications and teams to highlight commercial and operational risk.

QoreStor 5.0.1

QoreStor™ 5.0.1: Break free from backup appliances, accelerate backup performance, and reduce storage requirements and costs with Quest's QoreStor, a software-defined secondary storage platform for backup.

RADIUS 2016 Server - Wireless Authentication NPS

RADIUS 2016 Server - Wireless Authentication NPS: This RADIUS server uses NPS to perform centralized authentication, authorization, and accounting for wireless, authenticating switches, remote access dial-up, or virtual private network (VPN) connections.

S2IX - Virtual Machine

S2IX - Virtual Machine: Get it now in the Azure Marketplace.

Tiger Bridge

Tiger Bridge: Tiger Bridge makes it easy to align data value with storage costs by seamlessly extending your NTFS or Tiger Store file system and performing transparent data migration between one storage tier and another.

VIP

VIP: Automate manual processes, letting bots perform repetitive tasks for you. VIP offers high-performance robotic process automation and end-to-end test automation.

Web applications

Bitbucket Data Center

Bitbucket Data Center: Bitbucket Data Center by Atlassian is a self-managed solution for modern source code collaboration. Cluster multiple active servers to ensure users have uninterrupted access to Bitbucket Data Center in the event of unexpected node failure.

Citrix SD-WAN Center 10.2

Citrix SD-WAN Center 10.2: Citrix SD-WAN Center is a centralized management solution for Citrix SD-WAN appliances. It provides visibility into application performance by exposing a rich set of reports and statistics across the Citrix SD-WAN network.

Corda Single Node

Corda Single Node: Create and deploy a Corda node that can join a network. You'll be able to supply your own CorDapps that you wish to be included in the node.

IPFS (beta)

IPFS (beta): This offering enables the creation of permissioned networks of IPFS nodes to form a decentralized storage network. Users can select the size of network they would like to provision and share with others.

Microsoft Healthcare Bot (Preview)

Microsoft Healthcare Bot (Preview): The Microsoft Healthcare Bot service empowers healthcare organizations to build and deploy AI-powered virtual assistants and chatbots to enhance their processes, self-service offerings, and cost-reduction efforts.

Provance ITSM Azure Connector

Provance ITSM Azure Connector: The Provance ITSM Azure Connector bridges the gap between your Azure infrastructure and Provance ITSM, letting you manage Azure resources like any other infrastructure within Provance ITSM.

SAP BusinessObjects to Alation

SAP BusinessObjects to Alation: Information Asset has developed a solution to ingest reports from SAP BusinessObjects Web Intelligence and SAP Crystal Reports into Alation Data Catalog. This solution requires Java 8 and is packaged as an executable JAR file.

TIBCO Spotfire to Alation

TIBCO Spotfire to Alation: This Information Asset solution imports dashboards from TIBCO Spotfire into BI Server in Alation. The library structure of TIBCO Spotfire is imported into Alation as well as the names and thumbnails of dashboards within the library.

Container solutions

Apache 2.4 Secured Alpine Container with Antivirus

Apache 2.4 Secured Alpine Container with Antivirus: Deploy an enterprise-ready container for Apache 2.4 on Alpine.

Kubectl Container Image

Kubectl Container Image: Kubectl is the Kubernetes command line interface. It allows you to manage a Kubernetes cluster by providing a wide set of commands to communicate with the Kubernetes API.

Consulting services

Azure Architecture- 1-Hr Briefing

Azure Architecture: 1-Hr Briefing: At the end of this briefing by igroup, you’ll have a high-level overview of what Azure can do for your business, as well as an understanding of the timeline and costs.

Azure Cloud Assessment- 3-Day

Azure Cloud Assessment: 3-Day: In this assessment, igroup will provide a step-by-step guide to help you successfully achieve your cloud deployment, clearly listing the costs involved going forward and giving you a good understanding of the resources required.

Azure Cloud VDI 6-Day Proof of Concept

Azure Cloud VDI 6-Day Proof of Concept: As part of this proof of concept, Meritum Cloud will build an Azure tenant for the customer, a basic VDI image, and configure the XenDesktop essentials according to best practices.

Azure Data Center Migration- 2-Week POC

Azure Data Center Migration: 2-Week POC: Communication Square will devise a plan for your migration, help you set up your environment on Azure, lift and shift your workload to the cloud, and then test, analyze, and optimize the migration.

Azure Data Center Migrations- 10 Weeks Workshop

Azure Data Center Migrations: 10 Weeks Workshop: BUI's offer includes a cloud readiness assessment, a road map outlining a cloud implementation strategy, a hybrid architecture implementation, and full-time support for your IT team during the migration process.

Azure Health Assessment- 3-Day

Azure Health Assessment: 3-Day: In this engagement, igroup will spend one day on-site assessing your current infrastructure. The igroup team will then analyze the findings and create a remediation process with actions and recommendations.

Azure Hybrid Cloud 5-Day Proof of Concept

Azure Hybrid Cloud 5-Day Proof of Concept: Meritum Cloud will brief the customer on Azure hybrid cloud services, carry out a discovery workshop, and build an Azure tenant, configuring the environment with security best practices.

Azure IaaS Proof of Concept- 5-Day

Azure IaaS Proof of Concept: 5-Day: This proof of concept by igroup will allow you to assess if you are overcomplicating your environment or not using Azure services to their full potential.

Azure Marketplace On-boarding for ISVs

Azure Marketplace On-boarding for ISVs: In this consulting offer, Spektra Systems will help you assess your application/solution for Azure Marketplace onboarding and define the next steps to take.

Azure Migration Assessment- 2-Day Assessment

Azure Migration Assessment: 2-Day Assessment: CDW will work with you to deploy an assessment tool in your environment, ensure the tool is configured properly, run the tool, and then review and help you interpret the results.

Azure Migration Planning- 5 Day Implementation

Azure Migration Planning: 5 Day Implementation: Atech Support can provide certified Microsoft Azure solutions architects and certified Microsoft Azure engineers to design, plan, and migrate your on-premises infrastructure to Azure.

BearingPoint - Digital Process Twin- 4 Weeks PoC

BearingPoint - Digital Process Twin: 4 Weeks PoC: The Digital Process Twin connects business processes, devices, and systems through real-time horizontal data integration to improve reliability for logistics processes. This proof of concept will realize an IoT use case.

Cloud Architecture Tech Consulting- 4-Wk Briefing

Cloud Architecture Tech Consulting: 4-Wk Briefing: Azure MVPs will consult with you about your challenges, addressing architecture design and technical issues.

Data Center Migration Assessment Service- 4 Weeks

Data Center Migration Assessment Service: 4 Weeks: PC Solutions will provide an Azure knowledge session, a business assessment of your datacenter, a technical assessment of your datacenter, an assessment with tools, a recommendations report, and a presentation.

HIPAA Compliant Cloud Solutions- 1-Hour Briefing

HIPAA Compliant Cloud Solutions: 1-Hour Briefing: For any healthcare organization, achieving HIPAA compliance is very important. In this briefing by Communication Square, you will learn about HIPAA-compliant Microsoft cloud solutions.

HIPAA Compliant Cloud Solutions- 4-Week Imp.

HIPAA Compliant Cloud Solutions: 4-Week Imp.: Trying to get your IT solutions to match HIPAA requirements from scratch can be a losing battle. In this engagement, Communication Square will implement HIPAA-compliant Microsoft cloud solutions.

How to move to Azure- 1-Hr Briefing

How to move to Azure: 1-Hr Briefing: At the end of this briefing by igroup, you’ll have a high-level overview of how to move to Azure, as well as an understanding of the timeline and costs. Our specialist will answer any questions and explain the best route for your business.

Implementing Managing Azure Infrastructure 4 days

Implementing Managing Azure Infrastructure 4 days: In this workshop, Emm&mmE Informatica will introduce you to Azure services, including virtual machines, storage, and containers.

Infrastructure manage service-4week Implementation

Infrastructure manage service: 4week Implementation: In this offer from PC Solutions, get end-to-end managed services on up to 10 virtual machines.

Migrate MySQL to Azure 2-weeks Implementation

Migrate MySQL to Azure 2-weeks Implementation: Bring scalability, flexibility, security, and performance to your open-source database workloads while reducing the cost of infrastructure, maintenance, and support when you deploy on Azure.

Migrate PostgreSQL to Azure 2-weeks Implementation

Migrate PostgreSQL to Azure 2-weeks Implementation: Ascent Technology will lift and shift your PostgreSQL Database to Azure, providing availability, security, and performance at a fraction of on-premises costs.

Move Your Current ERP into Azure- 1-Day Assessment

Move Your Current ERP into Azure: 1-Day Assessment: This assessment by Altron Karabina involves understanding your ERP infrastructure, defining troubles with your existing environment, identifying migration steps, discussing your ERP road map, and more.

POPIA Compliance & Readiness- 10 Week Assessment

POPIA Compliance & Readiness: 10 Week Assessment: POPIACheck allows organizations to rapidly perform POPIA assessments and generate corrective actions to maintain compliance with the Protection of Personal Information Act of South Africa.

Power BI Adoption Tracker - 1 Day Assessment

Power BI Adoption Tracker - 1 Day Assessment: Altron Karabina allows you to gain insight into your Microsoft Power BI adoption. The adoption tracker will provide users with the tools they need to make more informed business decisions.

Project Execution Sprint- 2-Wk Implementation

Project Execution Sprint: 2-Wk Implementation: Clientek's implementation will focus on developing, testing, and delivering predetermined features and project objectives.

RESCAN AD Audit Reporting- 4 Week Assessment

RESCAN AD Audit Reporting: 4 Week Assessment: BUI's RESCAN service will provide easy-to-read audit reporting on sources like Active Directory by using Microsoft Azure.

SQL Consulting Services- 1 Week Implementation (1)

SQL Consulting Services: 1 Week Implementation: This offer is for customers in Belgium. A Denny Cherry & Associates expert will work with your IT team to install, configure, and tune your Azure virtual machines running SQL Server for up to one week.

SQL Consulting Services- 1 Week Implementation (2)

SQL Consulting Services: 1 Week Implementation: This offer is for customers in Germany. A Denny Cherry & Associates expert will work with your IT team to install, configure, and tune your Azure virtual machines running SQL Server for up to one week.

SQL Consulting Services- 1 Week Implementation (3)

SQL Consulting Services: 1 Week Implementation: This offer is for customers in France. A Denny Cherry & Associates expert will work with your IT team to install, configure, and tune your Azure virtual machines running SQL Server for up to one week.

SQL Consulting Services- 1 Week Implementation (4)

SQL Consulting Services: 1 Week Implementation: This offer is for customers in Europe. A Denny Cherry & Associates expert will work with your IT team to install, configure, and tune your Azure virtual machines running SQL Server for up to one week.

SQL Consulting Services- 1 Week Implementation (5)

SQL Consulting Services: 1 Week Implementation: This offer is for U.S. customers. A Denny Cherry & Associates expert will work with your IT team to install, configure, and tune your Azure virtual machines running SQL Server for up to one week.

SQL Consulting Services- 1 Week Implementation (6)

SQL Consulting Services: 1 Week Implementation: This offer is for customers in Canada. A Denny Cherry & Associates expert will work with your IT team to install, configure, and tune your Azure virtual machines running SQL Server for up to one week.

SQL Consulting Services- 1 Week Implementation (7)

SQL Consulting Services: 1 Week Implementation: This offer is for U.K. customers. A Denny Cherry & Associates expert will work with your IT team to install, configure, and tune your Azure virtual machines running SQL Server for up to one week.

SQL Consulting Services- 1 Week Implementation (8)

SQL Consulting Services: 1 Week Implementation: This offer is for customers in Australia. A Denny Cherry & Associates expert will work with your IT team to install, configure, and tune your Azure virtual machines running SQL Server for up to one week.

SQL Server Always On- 5 Day Implementation (1)

SQL Server Always On: 5 Day Implementation: This offer is for customers in Australia. In this engagement, Denny Cherry & Associates Consulting will build a new SQL Server Always On availability groups solution within your Azure environment.

SQL Server Always On- 5 Day Implementation (2)

SQL Server Always On: 5 Day Implementation: This offer is for customers in Canada. In this engagement, Denny Cherry & Associates Consulting will build a new SQL Server Always On availability groups solution within your Azure environment.

SQL Server Always On- 5 Day Implementation (3)

SQL Server Always On: 5 Day Implementation: This offer is for U.K. customers. In this engagement, Denny Cherry & Associates Consulting will build a new SQL Server Always On availability groups solution within your Azure environment.

SQL Server Always On- 5 Day Implementation (4)

SQL Server Always On: 5 Day Implementation: This offer is for U.S. customers. In this engagement, Denny Cherry & Associates Consulting will build a new SQL Server Always On availability groups solution within your Azure environment.

SQL Server Health Check- 3-Day Assessment (1)

SQL Server Health Check: 3-Day Assessment: This offer is for customers in Australia. Denny Cherry & Associates Consulting will review your SQL Server environment and provide a list of changes, which, if made, will improve the performance of your SQL Server.

SQL Server Health Check- 3-Day Assessment (2)

SQL Server Health Check: 3-Day Assessment: This offer is for customers in Belgium. Denny Cherry & Associates Consulting will review your SQL Server environment and provide a list of changes, which, if made, will improve the performance of your SQL Server.

SQL Server Health Check- 3-Day Assessment (3)

SQL Server Health Check: 3-Day Assessment: This offer is for customers in New Zealand. Denny Cherry & Associates Consulting will review your SQL Server environment and provide a list of changes, which, if made, will improve the performance of your SQL Server.

SQL Server Health Check- 3-Day Assessment (4)

SQL Server Health Check: 3-Day Assessment: This offer is for customers in Canada. Denny Cherry & Associates Consulting will review your SQL Server environment and provide a list of changes, which, if made, will improve the performance of your SQL Server.

SQL Server Health Check- 3-Day Assessment (5)

SQL Server Health Check: 3-Day Assessment: This offer is for customers in Europe. Denny Cherry & Associates Consulting will review your SQL Server environment and provide a list of changes, which, if made, will improve the performance of your SQL Server.

SQL Server Health Check- 3-Day Assessment (6)

SQL Server Health Check: 3-Day Assessment: This offer is for customers in the Netherlands. Denny Cherry & Associates Consulting will review your SQL Server environment and provide a list of changes, which, if made, will improve the performance of your SQL Server.

SQL Server Health Check- 3-Day Assessment (7)

SQL Server Health Check: 3-Day Assessment: This offer is for customers in France. Denny Cherry & Associates Consulting will review your SQL Server environment and provide a list of changes, which, if made, will improve the performance of your SQL Server.

SQL Server Health Check- 3-Day Assessment (8)

SQL Server Health Check: 3-Day Assessment: This offer is for U.K. customers. Denny Cherry & Associates Consulting will review your SQL Server environment and provide a list of changes, which, if made, will improve the performance of your SQL Server.

SQL Server Health Check- 3-Day Assessment (9)

SQL Server Health Check: 3-Day Assessment: This offer is for customers in Denmark. Denny Cherry & Associates Consulting will review your SQL Server environment and provide a list of changes, which, if made, will improve the performance of your SQL Server.

SQL Server Health Check- 3-Day Assessment (10)

SQL Server Health Check: 3-Day Assessment: This offer is for customers in Italy. Denny Cherry & Associates Consulting will review your SQL Server environment and provide a list of changes, which, if made, will improve the performance of your SQL Server.

SQL Server Health Check- 3-Day Assessment (11)

SQL Server Health Check: 3-Day Assessment: This offer is for customers in Germany. Denny Cherry & Associates Consulting will review your SQL Server environment and provide a list of changes, which, if made, will improve the performance of your SQL Server.

SQL Server Health Check- 3-Day Assessment (12)

SQL Server Health Check: 3-Day Assessment: This offer is for customers in Japan. Denny Cherry & Associates Consulting will review your SQL Server environment and provide a list of changes, which, if made, will improve the performance of your SQL Server.

SQL Server Health Check- 3-Day Assessment (13)

SQL Server Health Check: 3-Day Assessment: This offer is for U.S. customers. Denny Cherry & Associates Consulting will review your SQL Server environment and provide a list of changes, which, if made, will improve the performance of your SQL Server.

Two Node SQL Server Cluster- 2 Day Implementation (1)

Two Node SQL Server Cluster: 2 Day Implementation: This offer is for U.S. customers. During this two-day process, the expert team at Denny Cherry & Associates Consulting will create and build a new SQL Server failover cluster within your Microsoft Azure environment.

Two Node SQL Server Cluster- 2 Day Implementation (2)

Two Node SQL Server Cluster: 2 Day Implementation: This offer is for customers in Australia. During this two-day process, the expert team at Denny Cherry & Associates Consulting will create and build a new SQL Server failover cluster within your Microsoft Azure environment.

Two Node SQL Server Cluster- 2 Day Implementation (3)

Two Node SQL Server Cluster: 2 Day Implementation: This offer is for U.K. customers. During this two-day process, the expert team at Denny Cherry & Associates Consulting will create and build a new SQL Server failover cluster within your Microsoft Azure environment.

Two Node SQL Server Cluster- 2 Day Implementation (4)

Two Node SQL Server Cluster: 2 Day Implementation: This offer is for customers in Canada. During this two-day process, the expert team at Denny Cherry & Associates Consulting will create and build a new SQL Server failover cluster within your Microsoft Azure environment.

XS VM Lift & Shift- 1-Day Implementation

XS VM Lift & Shift: 1-Day Implementation: With this offer, you choose the server to migrate and Beacon42 experts move it to Azure.

Anatomy of a secured MCU


Secure silicon

Azure Sphere is an end-to-end solution containing three complementary components that provide a secured IoT platform. They include an Azure Sphere microcontroller unit (MCU), an operating system optimized for IoT scenarios that is managed by Microsoft, and a suite of secured, scalable online services. Microsoft provides over a decade of support for the operating system, as well as use of the security service, for a single per-device fee to simplify business planning.

Microsoft built its name in software, but our expertise in silicon runs deep. Over the last 15 years, Microsoft has deeply invested in hardware-based security by designing custom silicon for various Microsoft products. Azure Sphere’s silicon architecture is a culmination of all those years of experience, and our Pluton Security Subsystem is the heart of our security story. In this blog post, I’ll drill down a layer to discuss what puts the “secured” in a secured Azure Sphere MCU. Specifically, I’ll dive into Pluton’s design details, as well as some other general silicon security improvements.

Broadly, any MCU-based device belongs in one of two categories – devices that may connect to the Internet and devices designed to never connect to the Internet. Until recently, virtually all MCU-based devices were disconnected, which led to a security model that considered the value of the device, the physical threat model (Is the device locked in a cage? Does the general public interact with it?), and the risk of the device being attacked (e.g., an automatic paper towel dispenser versus an MRI machine). Therefore, security had a simple model – pay more, get more.

Connecting an MCU-based device to the Internet is a watershed moment because any MCU can become a potential general-purpose digital weapon in the hands of an attacker. The Mirai botnet was composed of only 100,000 connected cameras. If you’ve never thought about a distributed denial of service (DDoS) attack launched by 100 million paper towel dispensers, consider this blog post your moment of clarity.

Internet connectivity means that the table stakes for security must be changed. We authored the white paper on the seven properties of highly secured connected devices to reset the conversation around security. However, Azure Sphere certified MCUs go beyond a typical hardware root of trust used in an MCU.

Pluton key management

Pluton generates its own key-pairs in silicon during the manufacturing process. Other secured chips often depend on a hardware security module (HSM) on the factory floor to generate keys. Pluton goes further by generating its keys privately in silicon, and then persistently storing those keys into e-fuses. The private keys are never visible to software. Even the most trusted firmware on the device does not have access to the private keys. Pluton generates two different public/private elliptic curve cryptography (ECC) key-pairs. One is used exclusively for remote-attestation (more on that later) and the other is available for general purpose cryptography. The chips’ public keys, but not the private keys, are sent to Microsoft from the silicon manufacturer, which means Microsoft knows about and establishes a trust relationship with every Azure Sphere chip from the point the chip is manufactured.

Pluton’s random number generator

Pluton implements a true random number generator. Pluton’s random number generator collects entropy (i.e., randomness) from the environment to generate random numbers. This random number generator is critical to each MCU generating its own keys during the manufacturing process and is therefore a critical attack surface that must be defended. Pluton’s true random number generator measures the entropy it collects, and if the entropy does not meet a certain standard defined as part of its design, the random number generator refuses to deliver random numbers.

Pluton’s cryptographic helpers

Pluton accelerates common cryptographic tasks, such as hashing (via SHA2), ECC operations, and Advanced Encryption Standard (AES) operations.

The benefits of secure boot

Pluton minimizes supply chain risk. Every piece of software on an Azure Sphere device must be signed by Microsoft, and signatures are verified using ECDSA, an algorithm for checking digital signatures with ECC key-pairs. Microsoft uses its own proven signing infrastructure, the same infrastructure that protects the private keys of some of Microsoft’s most valuable products, and therefore ensures that private keys are kept in secure HSMs and that every use of the signing key follows a strict and documented process.

Leveraging remote attestation

Pluton implements support for measured boot and remote attestation in silicon. When an Azure Sphere device connects to the Azure Sphere Security Service (AS3), it completes server authentication by using a locally-stored certificate. However, AS3 must also authenticate the device itself. It does that via a protocol called remote attestation:

1. AS3 sends the device a nonce, which is combined with the measured-boot value that consists of the cryptographic hashes of the software components that have been booted.

2. The device signs these values with Pluton’s private ECC attestation key and sends them back to AS3.

3. AS3 already has the device’s public ECC attestation key and can therefore determine whether the device is authentic, whether it was booted with genuine software, and whether that software is trusted.

4. If the device is up to date, it is given a client certificate that can be presented to any online service and an AS3 certificate that is used only with AS3 services. If the certificate chain is valid, it represents AS3 vouching for the health of the chip. The certificate is valid for roughly a day, which means the device is forced to attest to its health on a regular basis if it wants to maintain a connection to the Internet.

It is critical that the measured-boot value cannot be forged or changed; therefore, it is kept in an accumulation register that can only be reset if the entire chip is reset. Because signing happens in silicon, even a device with fully compromised software cannot forge an attestation value.

If the device is not healthy, the client certificate is not issued, effectively allowing the device to connect only to AS3 and perform a software update.

Silicon-based attestation with continual renewal prevents device impersonation and forgeries. A future blog post will provide a more thorough deep dive on this topic, but it all starts with Pluton.

Silicon security beyond Pluton

Security doesn’t stop at the silicon fabric that makes up Pluton. In fact, Azure Sphere MCUs go beyond Pluton by implementing additional security features that improve the platform’s defense-in-depth strategy. Let’s use the MT3620, a chip composed of five cores, as an example. One core is dedicated to the runtime that invokes operations within Pluton. Next is a core dedicated to Wi-Fi that interacts with the Wi-Fi RF components of the chip. The A7 core is dedicated to running the Azure Sphere operating system, and two M4 cores are available for real-time processing. The A7 core leverages Arm’s TrustZone technology: the Azure Sphere security monitor runs in Secure World, while the Linux kernel and other OS components run in Normal World.

Between all the cores and peripherals are what we call “firewalls.” These are not firewalls in the network sense, but instead a set of mappings of resources to cores. In fact, internally we call this the core mapping feature. All resources on the device are mapped this way and by default all resources (A7 SRAM, peripherals, and flash) are mapped only to Secure World, denying access to software executing in Normal World. Secure World can selectively grant access to peripherals, and those selections are “sticky.” Sticky is not a term you often see when discussing silicon. In this case we mean that a core mapping is “locked” once set, which means that even if an attacker compromises the code that programs the firewalls, the attacker cannot get access to resources that were not originally assigned to the core in which it’s executing.

Limited peripheral access reduces the surface area that could potentially be subject to attack. Sticky selection further reduces the surface area. After the device is deployed, an attacker cannot exploit the code to communicate with a rogue web service or to control another part of the device. The result is greater security for individual devices as well as through the entire supply chain.

These are just a few of the silicon features that make an Azure Sphere MCU unique, and that make it more difficult for an attacker to take control of an Azure Sphere device. Each additional feature provides one more complication that an attacker must overcome to compromise a device’s functionality, and Pluton provides a rich set of silicon security features that are not often present in a hardware root of trust. Microsoft believes in the benefits of Pluton so strongly that it is licensing Pluton, royalty free, to any silicon manufacturer that wants to make an Azure Sphere chip. Better security is not foolproof, but raising the bar in IoT makes it that much harder to compromise a device.

IoT in Action: New insights for retail


The pace of development for retail Internet of Things (IoT) solutions continues to build. From enhanced customer insights to better staff utilization and increased supply chain efficiency, sophisticated IoT solutions are helping retailers improve, and even reimagine, the retail experience.

For in-depth insights around the latest developments in IoT for retail, including how customer expectations are changing and how IoT investments can impact store profitability, register for our live IoT in Action event in New York (co-located with NRF 2019) on January 14, 2019 or sign up for our industry-specific retail webinar on January 8, 2019.

IoT in Action Webinar Series

Focusing on store performance

In-store retail continues to account for approximately 90 percent of retail sales, but the retail landscape is changing. According to IHL, nearly 10,000 stores closed in the United States in 2017 – but another 14,000 opened. In the new retail environment, successful stores are focused on improving the customer experience and in-store operations with the goal of offering truly frictionless shopping. IoT technologies are helping to transform both efforts, allowing rapid testing and deployment in a common platform that spans both digital and physical environments.

Four ways IoT can help increase conversions

It's one thing to get a customer into your store, but another to create a successful shopping experience where they make the purchase they intended to make. IoT can increase sales conversions by reducing wait times, making items easy to find, and removing friction from the experience. Here are a few methods for doing so:

1. Better store navigation

A customer can only purchase something if they can find it. By analyzing traffic patterns in the store from cameras and traffic sensors, then layering this data on top of inventory and purchase data, stores can optimize their physical layout so shoppers can quickly find items and be inspired by complementary items. Where online retailers try to get products in front of customers in the fewest number of clicks, IoT technologies can help brick-and-mortar retailers minimize the number of steps a customer has to take.

2. Helping customers when they need it most

One of the reasons customers shop in stores is to get in-person help from sales associates. But if the sales associates don't interact with the customer at the right time, sales and upsells can be lost. That’s why companies are creating IoT solutions that use existing video infrastructure to offer intelligent retail insights and recommendations that help stores convert shoppers to customers.

Also working to improve the retail customer experience is Genetec™, a Microsoft partner whose IoT solutions leverage existing store security cameras to provide retailers with customer and operational insights to improve business outcomes. For example, their solution can detect increased congestion at checkout terminals, reducing abandonment caused by checkout delays, and can provide customer traffic and flow information to support good merchandising decisions.

3. The right item for the right person

Even if the right item is in the store, if an associate can't find it and hand it to a customer, it won't get bought. IoT solutions can help make sure every item is on the right shelf. They can also keep track of items that are moving within the store, for example to the fitting rooms and the sometimes-cluttered racks outside them.

4. A more personalized experience

IoT solutions are also enabling retailers to glean far deeper insights into customer needs, preferences, and buying habits. Microsoft IoT solutions can assess how customers interact with your brand and the products on your shelves. They can help gauge customer sentiment and track search and buying habits through Dynamics CRM. These insights can enable retailers to truly personalize customer experiences and promotions to increase loyalty and market share.

Operational improvements through retail IoT

Secure IoT solutions on the intelligent edge and intelligent cloud can increase operational efficiencies and reduce costs. Add in the Azure Sphere solution, which ensures end-to-end device security for MCU-powered devices, and retailers can focus their efforts on reimagining everything from business models to product experiences.

For instance, many time-consuming activities like reordering, inventory tracking, and setting price points can be automated. Analysis of traffic patterns and demographics between stores can help select the right mix of products for each store. Other ways operations can be improved include:

  • Intelligent supply chain and micro-warehousing: Keeping items stocked at the right levels and anticipating demand surges is critical to a retailer’s success. Microsoft inventory management solutions streamline and accelerate supply chain and inventory management processes to improve efficiency, agility, and cost management. IoT sensors can track inventory levels in real time and send alerts when levels dip.
  • Workforce empowerment and efficiency: Many repetitive tasks can be shifted from associates to IoT-enabled systems. Shelf compliance monitoring is one task that can be automated to free up staff time. Companies are even using robots to scan entire stores and help employees take immediate action based on their findings. This enables associates to be brand ambassadors rather than shelf checkers.
  • Increased security: Some customer-focused improvements, like Mobile POS within the store, bring with them increased risk of shrinkage, both intentional and unintentional. Video surveillance of mobile checkout transactions is challenging, but partners like Genetec have created solutions so that all checkout activity can be located and recorded, which is especially important as more transactions occur away from fixed terminals.

Register for the IoT in Action Webinar

To explore IoT for retail in more detail, be sure to register for our retail-focused IoT in Action webinar on January 8, 2019. You will get insights into how IoT can help you delight customers, improve the effectiveness of your associates, and increase the efficiency of your operations.

You can also learn how intelligent edge and intelligent cloud IoT solutions can transform your retail business by signing up for Microsoft’s in-person IoT in Action event in New York City on January 14, 2019.

Finally, you can take a deep dive into building retail IoT solutions at our upcoming 2-day Virtual Bootcamp in late January and early February.

Conversational AI updates – December 2018

$
0
0

This blog post was co-authored by Vishwac Sena Kannan, Principal Program Manager, FUSE Labs.

We are thrilled to present the release of Bot Framework SDK version 4.2, and we want to use this opportunity to provide additional updates on Conversational AI releases from Microsoft.

In the SDK 4.2 release, the team focused on enhancing the monitoring, telemetry, and analytics capabilities of the SDK by improving the integration with Azure Application Insights. As with any release, we fixed a number of bugs, continued to improve Language Understanding (LUIS) and QnA integration, and enhanced our engineering practices. There were additional updates across other areas such as language, prompts and dialogs, and connectors and adapters. You can review all the changes that went into 4.2 in the detailed changelog. For more information, view the list of all closed issues.

Telemetry updates for SDK 4.2

With the SDK 4.2 release, we started improving the built-in monitoring, telemetry, and analytics capabilities provided by the SDK. Our goal is to give developers the ability to understand their overall bot health, provide detailed reports about the bot’s conversation quality, and offer tools to understand where conversations fall short. To that end, we further enhanced the built-in integration with Microsoft Azure Application Insights and streamlined the default telemetry emitted from the SDK. This includes waterfall dialog instrumentation, docs, examples for querying data, and a Power BI dashboard.

Bot Framework can use the Application Insights telemetry to provide information about how your bot is performing and to track key metrics. For example, once you enable Application Insights for your bot, the SDK will automatically trace important information for each activity that gets sent to your bot. Essentially, for each activity (for example, a user interacting with your bot by typing an utterance), the SDK emits traces for all the different stages of activity processing. These traces can then be placed on a timeline showing each component’s latency and performance, as you can see in the following image.

This can help identify slow responses and further optimize your bot performance.

Beyond basic bot performance analysis, we have instrumented the SDK to emit traces for the dialog stack, primarily the waterfall dialog. The following image is a visualization showing the behavior of a waterfall dialog. Specifically, it shows three events before and after someone completes a dialog, across all sessions. The center “Initial Event” is the starting point, which fans left and right to show before and after, respectively. This is great for showing the drop-off rate, shown in red, and where most conversations ‘flow,’ indicated by the thickness of the line. This view is a default Application Insights report; all we had to do was connect the wires between the SDK, dialogs, and App Insights.

Waterfall Dialog

The SDK and integration with App Insights provide a lot more capabilities, for example:

  • Complete activity tracing including all dependencies.
  • LUIS telemetry, including non-functional metrics such as latency and error rate, and functional metrics such as intent distribution, intent sentiment, and more.
  • QnA telemetry, including non-functional metrics such as latency and error rate, and functional metrics such as QnA score and relevance.
  • Word clouds of the most commonly used words and phrases – these can help you spot intents or QnA pairs you might have missed.
  • Conversation length, expressed in terms of time and step count.
  • The ability to build your own reports using custom queries.
  • Custom logging for your bot.

Solutions

The creation of a high-quality conversational experience requires a foundational set of capabilities. To help customers and partners succeed with building great conversational experiences, we released the enterprise bot template at Microsoft Ignite 2018. This template brings together all the best practices and supporting components we've identified through the building of conversational experiences.

All Dialogs Overview

Synchronized with the SDK 4.2 release, we have delivered updates to the enterprise template, which provides additional localization for LUIS models and responses, including multi-language dispatcher support for customers that wish to support multiple native languages in one bot deployment. We’ve also replaced the custom telemetry work with the new native SDK support for dialog telemetry, and added a new Conversational Analytics Power BI dashboard providing deep analytics into usage, dialog quality, and more.

The enterprise template is now joined by a retail customer support-focused template, which provides additional LUIS models for this scenario and example dialogs for order management, stock availability, and store location.

The virtual assistant solution accelerator, which enables customers and partners to build their own virtual assistants tailored to their brand and scenarios, has continued to evolve. Ignite was our first crucial milestone for virtual assistant and skills. Work has continued with regular updates to all elements of the overall solution.

We now have full support for six languages, including Chinese, for the virtual assistant and skills. The productivity skills (i.e., calendar, email, and tasks) have updated conversation flows, entity handling, new pre-built domain language models, and work with Microsoft Graph. This release also includes the first automotive capabilities, enabling control of car features, along with updates to skills enabling proactive experiences, Speech SDK integration, and experimental skills (restaurant booking and news).

Language Understanding December update

December was a very exciting month for Language Understanding at Microsoft. On December 4, 2018, we announced Docker container support for LUIS in public preview. Hosting the LUIS runtime in containers provides a great set of benefits, including:

  • Control over data: Allow customers to use the service with complete control over their data. This is essential for customers that cannot send data to the cloud but need access to the technology. Support consistency in hybrid environments – across data, management, identity, and security.
  • Control over model updates: Provide customers flexibility in versioning and updating of models deployed in their solutions.
  • Portable architecture: Enable the creation of a portable application architecture that can be deployed in the cloud, on-premises, and the edge.
  • High throughput/low latency: Provide customers the ability to scale for high-throughput, low-latency requirements by enabling Cognitive Services to run in Azure Kubernetes Service, physically close to their application logic and data.

We recently posted a technical reference blog, “Getting started with Cognitive Services Language Understanding container.” We also posted a demo video, “Language Understanding – Container Support,” which shows how to run containers.

LUIS has expanded its service to seven new regions, completing worldwide availability in all major Azure regions, including the UK, India, Canada, and Japan.

Among the other notables is the enhancement of the training experience, including an improvement in the time required to train an application. The team also released new pre-built entity extractors for people’s names and geographical locations in English and Chinese, and expanded the phone number, URL, and email entities across all languages.

QnA Maker updates

In December, the QnA Maker service released an improvement to its intelligent extraction capabilities. Along with accuracy improvements for existing supported sources, QnA Maker can now extract information from simple “Support” URLs. Read more about extraction and supported data sources in the documentation, “Data sources for QnA Maker content.” QnA Maker also rolled out an improved ranking and scoring algorithm for all English KBs. Details on the confidence score can be found in the documentation.

The team also released SDKs for the service in .NET, Node.js, Go, and Ruby.

Web chat speech update


We now support the new Cognitive Services Speech to Text and Text to Speech services directly in Web Chat 4.2. This sample is a great place to learn about the new feature and start migrating your bot from Bing Speech to the new Speech Services.

We also added a few samples, including backchannel injection and minimize mode. The backchannel injection sample demonstrates how to add sideband data to outgoing activities; you can leverage this technique to send browser language and time zone information alongside messages sent by the user. The minimize mode sample shows how to load Web Chat on demand and overlay it on top of your existing web page.

You can read more about Web Chat 4.2 in our changelog.

Get started

As we continue to improve our conversational AI tools and framework, we look forward to seeing what conversational experiences you will build for your customers. Get started today!

The biggest IoT stories of 2018


This blog post was authored by Peter Cooper, Senior Product Manager, Microsoft IoT.

Back in April, we announced our intention to invest $5 billion in the Internet of Things (IoT) over the next five years. The importance of this commitment has become even clearer since, as technology has already evolved, customers have innovated, and possibilities have grown. As 2018 draws to a close, here’s a look back at the topics that drove the most interest and excitement here on our blog—and a window into what’s coming for this technology in the near future.

Smart spaces

The spaces around us are coming alive with the power of data. In our post, “Smart buildings, built on Azure IoT,” we talked about how IoT and AI are helping those who own, manage, and use buildings increase efficiency to reduce cost and improve productivity. With announcements of products such as Azure Sphere and Azure Digital Twins, we empowered our partners and customers to explore new possibilities for managing and improving the built environment responsively, in real time.

Over the past year, we’ve also seen customers expand their vision of what smart spaces can do. Traditionally, these projects were heavily focused on operational aspects of building management, such as infrastructure maintenance and water and power usage. This is still the foundational use case and justification for IoT-enabled buildings, but people are increasingly excited about the transformative capabilities of smart spaces. Customers are exploring how they can use analytics to understand and optimize how people use the spaces they inhabit. Furthermore, they’re designing smart building solutions with the potential to dramatically influence day-to-day productivity and increase positive interactions.

For example, Steelcase showed how they’re creating smart and connected workplaces. As Scott Sadler, Steelcase Smart + Connected manager, said, “By embedding technology into the work environment, we are enabling people to tell organizations what spaces are successful and why. We can measure and identify patterns in how and where people are working.” The ease of obtaining these insights will only increase in the coming year, and we’re thrilled to see where this field is headed. As smart space initiatives expand beyond the workplace to encompass stadiums, schools, hospitals, banks, and more—and as edge and cloud technologies connect them to the larger built environment—truly transformative possibilities are bound to emerge.

The intelligent edge

As with smart buildings, we’ve been inspired by the visionary scope our partners and customers have for edge computing and look forward to big things in 2019. IT departments are using edge computing to solve infrastructure and security challenges to make IoT a reality. Hardware vendors are expanding the intelligence of their devices to take advantage of new functionalities. A diverse and vibrant ecosystem is arising that will push what’s possible at the edge.

We highlighted five ways edge will transform business, including reduced IoT solution costs, improved security, lower latency, greater reliability, and interoperability with legacy devices. Enabling this goodness requires a strong technology foundation, which is why the Azure IoT Edge platform garnered so much attention from the industry. Since then, the solution has moved into general availability, enabling any business to deliver cloud intelligence locally on cross-platform IoT devices.

Edge computing is fueling growth in both infrastructure and IoT, allowing data processing, analytics, and advanced functionality to run on connected devices whether they're connected to the cloud or not.

These innovations are many and varied. With a consistent deployment model, companies can code and test edge capabilities on any platform and launch them seamlessly. For example, some are training data models using cloud-scale machine learning engines, and then deploying those models as-is to edge devices. Others are using edge as a way to aggregate and preprocess information so that only relevant data is delivered to the cloud. Edge computing also makes it possible to build IoT solutions that are offline for extended periods of time yet deliver powerful predictive capabilities based on local data. It all adds up to more efficient, effective use of data to improve everyday lives around the world. 

Open standards and interoperability

Interoperability is a hot topic, especially in the manufacturing space, where businesses are looking for simple, comprehensive solutions that allow them to enable the connected factory with a mix of IoT-ready and legacy equipment. Our April post on OPC Unified Architecture (OPC UA) highlighted how manufacturers are using the standard to enable openness and interoperability while maintaining high standards of security.

In fact, this past year could be considered “the year of OPC UA,” with ABB, Rockwell, and Schneider Electric joining the OPC Foundation board, alongside SIEMENS, SAP, Yokogawa, Iconics, Ascolab, and, of course, Microsoft.

National industry initiatives have also continued deepening their commitment to interoperability. Germany’s Industrie 4.0 has released new testbeds and specifications based on the standard, and the China 2025 initiative has made a similar all-in commitment to OPC UA. We’ve made our own contributions to the world of OPC UA with new and updated products. Discrete manufacturing is also getting in on the interoperability act, with the German machine tool association VDW announcing the open universal machine tool interface (umati) initiative, which incorporates OPC UA into its architecture.

Looking ahead

The big lesson from all this energetic activity? IoT is a catalyst for digital transformation across traditional boundaries. We’re seeing new ecosystems and solutions emerge that unify data and insights from multiple places to enable new possibilities. As smart cities, vehicles, buildings, spaces, energy, and more converge, the opportunities grow—and so do needs for end-to-end manageability and security. We are committed to solving these challenges with built-in connectivity, real-time performance, and security innovation at the intelligent edge. Learn more about how Microsoft is helping build the connected future.

The year in review: Hybrid applications for developers


As 2018 comes to an end, I look at the technology landscape and at the kinds of hybrid scenarios our customers are developing. For example, we see Airbus transforming aerospace with Microsoft Azure Stack, and I realize that this year has been amazing for developers who design, develop, and maintain cloud-based apps. Azure Stack has improved support for DevOps practices. You can use Kubernetes containers. You can use API Profiles with Azure Resource Manager and the code of your choice. You can review walkthroughs and tutorials on getting up and running with a development practice using a continuous integration pipeline. With Azure Stack, your apps can be developed in the cloud: you can code once and deploy to environments in Azure or in your local data center.

We are now seeing some of your favorite services from Azure arrive on Azure Stack. The Azure Stack team is also excited to come together with other members of the Azure Edge family, which includes Data Box Edge, IoT Edge, and Azure Sphere. If you didn’t get a chance to attend Ignite 2018’s session on the intelligent edge, check out the “Delivering Intelligent Edge and Microsoft Azure Stack and Data Box” session. The Edge closes the gap between on-premises solutions and the cloud. You can write applications based on a consistent Azure model. You can deploy different parts of your apps to different locations that make the most sense for each solution.

Over the course of this year we demonstrated that all of this is indeed a reality and not a distant vision; it is available today. Azure Stack is available in many regions throughout the world. Data Box Edge is also available, together with other members of the Data Box family. You can also use exciting new services like IoT Edge and Azure Sphere on Azure Stack and Azure for a comprehensive hybrid platform.

Over the course of this year, both my team and our partners delivered capabilities that make it easier for you to use our Edge offerings in your apps. Early this year, Pivotal announced the general availability of Pivotal Cloud Foundry (PCF) on Azure Stack. PCF streamlines the way you push your .NET and Java code. You do not need to focus on where or how your code runs. The combination of PCF with Azure and Azure Stack opens opportunities for your hybrid app development.

As developers, we love when we can focus more on building our apps and worry less about infrastructure. That’s where infrastructure as code comes into play. The Azure Resource Manager (ARM) is the perfect answer for this need. The same ARM that is available in Azure is also available in Azure Stack. Your scripts and templates are consistent no matter where you point your deployment target. My team has worked on many improvements for ARM to make your hybrid app development experience easier. First among those improvements are the ARM API Profiles. API Profiles expose a set of resource types and API versions that are consistent across the different Azure clouds.

Back in June 2018, Hashicorp also released the Terraform provider for Azure Stack allowing you to provision and manage the infrastructure the same way you do in Azure.

During the second half of the year, we kept busy continuing to deliver services that our developers use. In partnership with the Azure SDK team, we delivered API profile support for both .NET and Java, alongside existing support for Python, Ruby, and Go.

During Ignite 2018 we had a variety of hybrid development-oriented talks. If you missed them, watch them! A few good ones include:

Finally, we provided new capabilities to make Azure Stack the best place for you to host your Kubernetes containers on-premises. You can use Kubernetes containers in Azure Stack through the Kubernetes Marketplace item and through Red Hat OpenShift support. If you are a developer creating containerized applications, you have options: a choice of where to host your containers, a choice of different Azure services to integrate with your containers, and a choice to host your containers on-premises, in the public cloud, or in sovereign Azure clouds. On top of that, you can leverage the Open Service Broker for Azure to make it easy to consume Azure services, and Cloud Native Application Bundles to streamline your package dependencies.

This has been a great year as we worked with our customers to deliver innovation that will make your job easier. We have a pipeline full of exciting new capabilities coming in 2019, and as usual I will continue to post about them twice a month!

Now it’s time to unwind and enjoy the holidays! To learn more about hybrid application development read the previous blog post in this series, “A hybrid approach to Kubernetes.”


Build Visual Studio extensions using Visual Studio extensions


What if the community of extension authors banded together to add powerful features to Visual Studio that made it easier to create extensions? What if those features could be delivered in individually released extensions, but joined by a single installation experience that allows the user to choose which of the features to install? That’s the idea behind Extensibility Essentials – an extension pack that ships community-recommended extensions for extension authors.

Extension authors are usually interested in improving their own tooling in Visual Studio, either by installing extensions created by others or by building some themselves. By banding together, we can create the best and most comprehensive tooling experience for extension authoring. So, let’s test that theory by creating a range of extensions, publishing them to the Marketplace under our own accounts, and referencing them in Extensibility Essentials to provide a unified and simple installation experience.

The individual extensions can and probably should be single purpose in nature. This prevents feature-creep where additional features are added that may or may not be useful for extension authors. If additional features are not closely related to the extension, then simply create a new extension for them. That way it is up to the individual extension author to decide if they wish to install it. It is also crucial that the extensions follow all the best practices.

Once the individual extension is stable, it can be added to Extensibility Essentials.

The Extensibility Essentials extension pack doesn’t do anything by itself. It is a bare-bones extension pack that simply references the individual extensions. When installing the extension pack, the user can choose which of the referenced extensions to install. At the time of this writing, there are nine individual extensions.

How the individual extensions are listed before installing

Ideas for new extensions can be centralized to the GitHub issue tracker. By collecting ideas in a central location, it provides a single location to comment on and potentially design features ahead of implementation.

The issue tracker is for both bugs and suggested features

It would be cool if…

So next time you’re sitting in Visual Studio working on an extension, think about what feature you’d like that would make you more productive. If you can’t think of a feature, but feel there is a scenario that is particularly problematic, then open a bug on the GitHub issue tracker and let other people try to figure out how an extension could perhaps solve the issue.

Thinking “it would be cool if…” is the first step to making it possible, and with Extensibility Essentials, it might be closer to becoming reality than you imagined.

Does this idea resonate with you? Let me know in the comments.

Mads Kristensen, Senior Program Manager
@mkristensen

Mads Kristensen is a senior program manager on the Visual Studio Extensibility team. He is passionate about extension authoring, and over the years, he’s written some of the most popular ones with millions of downloads.

Azure.Source – Volume 63


Now in preview

Transparent Data Encryption (TDE) with customer managed keys for Managed Instance

Announces the public preview of Transparent Data Encryption (TDE) with Bring Your Own Key (BYOK) support for Microsoft Azure SQL Database Managed Instance. Azure SQL Database Managed Instance is a new deployment option in SQL Database that combines the best of on-premises SQL Server with the operational and financial benefits of an intelligent, fully-managed relational database service. TDE with BYOK support has been generally available for single databases and elastic pools since April 2018. TDE with BYOK support is offered in addition to TDE with service managed keys which is enabled on all new Azure SQL Databases, single databases, pools, and managed instances by default.

Access control transition from Azure SQL Database to Azure Key Vault flow chart

Transforming your data in Azure SQL Database to columnstore format

Announces a public preview of a new feature in Azure SQL Database, both in logical server and Managed Instance, called CLUSTERED COLUMNSTORE ONLINE INDEX build. This operation enables you to migrate your data stored in row-store format to the columnstore format and maintain your columnstore data structures with minimal downtime for your workload. Learn why this format is valuable and how it can compress data and boost the performance of your analytical queries. This feature is currently in preview in all flavors of Azure SQL Database, including logical servers, elastic pools, and Managed Instances.

Also in preview

Now generally available

Virtual Network Service Endpoints for serverless messaging and big data

Virtual Networks and Firewall rules for both Azure Event Hubs and Azure Service Bus are now generally available. This feature adds to the security and control you have over your cloud environments. Now, traffic from your virtual network to your Azure Service Bus Premium namespaces and Standard and Dedicated Azure Event Hubs namespaces can be kept secure from public Internet access and completely private on the Azure backbone network. Customers dealing with PII (Financial Services, Insurance, etc.) or looking to further secure access to their cloud visible resources will benefit the most from this feature.

Also now available

News and announcements

Microsoft open sources Trill to deliver insights on a trillion events a day

An internal Microsoft project known as Trill, for processing “a trillion events per day,” is now being open sourced on GitHub, addressing a need that is becoming a common business requirement: processing massive amounts of data each millisecond. Trill started as a research project at Microsoft Research in 2012 and has since been extensively described in research papers. The roots of Trill’s language lie in Microsoft’s former StreamInsight service, a powerful platform allowing developers to develop and deploy complex event processing applications. Both systems are based on a query and data model that extends the relational model with a time component. By open sourcing Trill, we want to offer the power of the IStreamable abstraction to all customers, the same way that IEnumerable and IObservable are available. Trill powers internal applications and external services, reaching thousands of developers. A number of powerful streaming services are already powered by Trill, such as Bing Ads, Azure Stream Analytics, and Halo.

Conversational AI updates for December 2018

Bot Framework SDK version 4.2 is now available. The team used this opportunity to provide additional updates on Conversational AI releases from Microsoft. In the SDK 4.2 release, the team focused on enhancing the monitoring, telemetry, and analytics capabilities of the SDK by improving the integration with Azure App Insights. As with any release, we fixed a number of bugs, continued to improve Language Understanding (LUIS) and QnA integration, and enhanced our engineering practices. There were additional updates across other areas such as language, prompts and dialogs, and connectors and adapters.

Screenshot showing a default app insight report showing connections between the SDK, dialogs, and App Insights

Azure PowerShell ‘Az’ Module version 1.0

Az is a new Azure PowerShell module that is built to harness the power of PowerShell Core and Cloud Shell and maintain compatibility with Windows PowerShell 5.1. Az ensures that Windows PowerShell and PowerShell Core users can get the latest Azure tooling in every PowerShell on every platform. Az also simplifies and normalizes Azure PowerShell cmdlet and module names. Az is open source and ships in Azure Cloud Shell and is available from the PowerShell Gallery. The Az module version 1.0 was released on December 18, 2018, and will be updated on a two-week cadence in 2019, starting with a January 15, 2019 release.

Participate in the 16th Developer Economics Survey

The Developer Economics Q4 2018 survey is an independent survey from SlashData, an analyst firm in the developer economy that tracks global software developer trends. Every year more than 40,000 developers around the world participate in this survey, so this is a chance to be part of something big, voice your thoughts, and make your contribution to the developer community. The Developer Economics Q4 2018 survey is for all developers (professionals, hobbyists, and students) engaging in the following software development areas: web, mobile, desktop, backend services, IoT, AR/VR, machine learning and data science, and gaming.

The biggest IoT stories of 2018

As 2018 draws to a close, the IoT Team took a look back at the topics that drove the most interest and excitement on the Azure blog, and offered a window into what’s coming for this technology in the near future, covering everything from smart spaces to the intelligent edge to open standards and interoperability. We’re seeing new ecosystems and solutions emerge that unify data and insights from multiple places to enable new possibilities. As smart cities, vehicles, buildings, spaces, energy, and more converge, the opportunities grow, and so do the needs for end-to-end manageability and security. We are committed (in April, we announced our intention to invest $5 billion in IoT over the next five years) to solving these challenges with built-in connectivity, real-time performance, and security innovation at the intelligent edge.

The year in review: Hybrid applications for developers

Ricardo Mendes, Principal Program Manager, Azure Stack, takes a look at the technology landscape supporting hybrid scenarios and offers a retrospective of the myriad announcements throughout 2018 that enabled developers to focus more on building apps and worry less about infrastructure. This year has been amazing for developers who design, develop, and maintain cloud-based apps. Azure Stack has improved support for DevOps practices. You can use Kubernetes containers. You can use API Profiles with Azure Resource Manager and the code of your choice.

Additional news and updates

Technical content

Fine-tune natural language processing models using Azure Machine Learning service

Learn how you can fine-tune Bidirectional Encoder Representations from Transformers (BERT) easily using the Azure Machine Learning service, along with topics such as using distributed settings and tuning hyperparameters for the corresponding dataset. In this post, you’ll see some preliminary results that demonstrate how to use the Azure Machine Learning service to fine-tune the NLP models. After BERT is trained on a large corpus (for example, English Wikipedia), the assumption is that because the dataset is huge, the model can inherit a lot of knowledge about the English language. In addition to tuning different hyperparameters for various use cases, the Azure Machine Learning service can be used to manage the entire lifecycle of these kinds of experiments. The Azure Machine Learning service provides an end-to-end cloud-based machine learning environment, so customers can develop, train, test, deploy, manage, and track machine learning models. All the code is available on the GitHub repository.

Diagram providing an Azure Machine Learning Service Overview
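
For orientation, here is a rough sketch of what submitting such a fine-tuning run with the Azure Machine Learning Python SDK can look like. The workspace config file, the “gpu-cluster” compute target, the ./bert-finetune folder, and the fine_tune.py script are illustrative assumptions, not code from the post:

# Minimal sketch: submit a BERT fine-tuning script as an Azure ML experiment.
# The config.json, "gpu-cluster" compute target, and fine_tune.py script are
# assumptions for illustration; see the GitHub repository for the real code.
from azureml.core import Workspace, Experiment
from azureml.train.dnn import PyTorch

ws = Workspace.from_config()  # reads workspace details from config.json

estimator = PyTorch(
    source_directory="./bert-finetune",
    entry_script="fine_tune.py",
    compute_target="gpu-cluster",
    node_count=4,                      # distributed fine-tuning across nodes
    distributed_backend="mpi",         # MPI-based distributed training
    use_gpu=True,
    script_params={"--learning_rate": 3e-5, "--epochs": 3},
)

run = Experiment(ws, "bert-finetune").submit(estimator)
run.wait_for_completion(show_output=True)  # stream logs as the run executes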

Anatomy of a secured MCU

Azure Sphere is an end-to-end solution containing three complementary components that provide a secured IoT platform. They include an Azure Sphere microcontroller unit (MCU), an operating system optimized for IoT scenarios that is managed by Microsoft, and a suite of secured, scalable online services. Broadly, any MCU-based device belongs in one of two categories – devices that may connect to the Internet and devices designed to never connect to the Internet. Connecting an MCU-based device to the Internet is a watershed moment because any MCU can become a potential general-purpose digital weapon in the hands of an attacker. Learn how Azure Sphere-certified MCUs go beyond a typical hardware root of trust used in an MCU. This post discusses what puts the “secured” in a secured Azure Sphere MCU. Specifically, the Pluton Security Subsystem design details, as well as some other general silicon security improvements.

How to migrate from AzureRM to Az in Azure PowerShell

As noted above, the Azure PowerShell team released Az, a new cross-platform PowerShell module that will replace AzureRM. You can install this module by running Install-Module Az in an elevated PowerShell prompt. With the introduction of PowerShell Core, PowerShell is a cross-platform product. Therefore, it became a priority for Azure PowerShell to have cross-platform support. Because of the changes required to support running Azure PowerShell cross-platform, we created a new module rather than modifying the existing AzureRM module. Moving forward, all new functionality will be added to the Az module, while AzureRM will only be updated with bug fixes. In this post, you’ll learn how to migrate from AzureRM to Az in Azure PowerShell.

Top 3 free resources developers need for learning Azure

I wrote this post, which covers three free resources every developer needs for learning Azure. Dan Fernandez leads the team responsible for bringing our technical documentation and learning resources into a more modern experience that supports new capabilities that were impossible to deliver via MSDN. Recently, I invited Dan to record a few episodes of Azure Friday with Donovan Brown and spend some time showing off the work his team is doing to provide the best docs and learning experience.

Thumbnail from the Learning Azure series on Azure Friday with Donovan Brown and Dan Fernandez

Best practices for queries used in log alerts rules

Yossi Yossifon, Senior Program Manager on Microsoft Azure, provides some best practices for log alert rule queries in Log Analytics and Application Insights. There are several “dos and don’ts” you can follow to make your queries run faster. Check out his post for a few tips and a link to the query best practices in the Azure documentation.

Connect Azure Data Explorer to Power BI for visual depiction of data

Azure Data Explorer (ADX) is a lightning-fast indexing and querying service that helps you build near real-time and complex analytics solutions for vast amounts of data. ADX can connect to Power BI, a business analytics solution that lets you visualize your data and share the results across your organization. The various methods of connection to Power BI enable interactive analysis of organizational data, such as tracking and presenting trends. Learn the various ways to query data from Azure Data Explorer in Power BI. Additional connectors and plugins for analytics tools and services will be added in the weeks to come.

Azure shows

Episode 258 - Live from KubeCon 2018 | The Azure Podcast

We are live at KubeCon + CloudNativeCon in Seattle, where Microsoft, together with the who’s who of the tech world, is talking about Kubernetes. We are very fortunate to get Lachie Evenson, Principal PM on the Azure team; Tommy Falgout, a Cloud Solution Architect; and Daniel Selman, a Kubernetes Consultant, together in a room to discuss the current state of Kubernetes and AKS.

Pix2Story- Neural AI Storyteller | AI Show

Storytelling is at the heart of human nature, and natural language processing is a field that is driving a revolution in computer-human interaction. That is why we decided to explore Pix2Story to see if we could teach an AI to be creative: be inspired by a picture and take it to another level.

Building a Pet Detector in 30 minutes or less! | AI Show


Connect devices from other IoT clouds to Azure IoT Central | Internet of Things Show

Learn how to connect other IoT clouds like Sigfox, Particle, and The Things Network to IoT Central with the IoT Central device bridge open-source solution. We’ll talk about what the device bridge is and how it works, and demo a device connected to The Things Network using the device bridge to connect to an IoT Central app.

Running your First Docker Container in Azure | The DevOps Lab

Damian catches up with fellow Cloud Advocate Jay Gordon at Microsoft Ignite | The Tour in Berlin. Containers are still new for a lot of people and with the huge list of buzzwords, it's hard to know where to get started. Jay shows how easy it is to get started running your first container in Azure, right from scratch.

Introduction to Multi-Signature Wallets | Block Talk

This video provides an overview of multi-signature wallets (smart contracts), along with a walkthrough of a simple multi-signature wallet written in the Solidity language. The topics covered in this video include adding owners to the wallet and the workflow that takes place to capture multiple signatures from owners before a transfer of value can be completed.

How to run an app inside a container image with Docker | Azure Tips and Tricks

Learn how to create a container based on an image, and then create a running app inside of it. Once you get set up with Docker on your local dev machine by installing the Docker desktop application for your operating system, you can easily run an app.

Thumbnail from Azure Tips and Tricks - How to run an app inside a container image with Docker
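
If you would rather script those same steps, a similar flow can be driven from the Docker SDK for Python. This is a minimal sketch under my own assumptions (the docker package installed via pip, a running local Docker daemon, and nginx:alpine as a stand-in image), not something shown in the video:

# Minimal sketch using the Docker SDK for Python ("pip install docker").
import docker

client = docker.from_env()  # connect to the local Docker daemon

# Pull the image if needed and start a container from it in the background,
# mapping container port 80 to port 8080 on the host.
container = client.containers.run(
    "nginx:alpine",
    ports={"80/tcp": 8080},
    detach=True,
)
print("running:", container.short_id)

# ...browse to http://localhost:8080, then clean up:
container.stop()
container.remove()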

Chris Patterson on the Future of Azure Pipelines - Episode 015 | The Azure DevOps Podcast

Jeffrey Palermo and Chris Patterson, Principal Program Manager at Microsoft, discuss how the infrastructure of Azure Pipelines is changing, what a build will mean in the future, the goal of Azure Pipelines evolution, and more.

Customers, industries, and partners

A fintech startup pivots to Azure Cosmos DB

Fintech startup and Microsoft partner clearTREND Research had a plan to commercialize a financial trend engine and provide a subscription investment service to individuals and professionals. Learn their reasons for choosing Azure Cosmos DB as the solution best able to adapt, evolve, and enable their business to innovate faster in order to turn opportunities into strategic advantages. You’ll also get some tips from the clearTREND team to consider when designing and implementing a solution with Azure Cosmos DB. The team that designed and implemented the clearTREND solution comprises architects and developers with Skyline Technologies.

IoT in Action: New insights for retail

For in-depth insights around the latest developments in IoT for retail, including how customer expectations are changing and how IoT investments can impact store profitability, you can register for a live IoT in Action event in New York (co-located with NRF 2019) on January 14, 2019, or sign up for our industry-specific retail webinar on January 8, 2019. You will get insights into how IoT can help you delight customers, improve the effectiveness of your associates, and increase the efficiency of your operations. You can also take a deep dive into building retail IoT solutions at our upcoming 2-day Virtual Bootcamp in late January and early February.

Azure Marketplace new offers – Volume 28

The Azure Marketplace is the premier destination for all your software needs – certified and optimized to run on Azure. Find, try, purchase, and provision applications & services from hundreds of leading software providers. You can also connect with Gold and Silver Microsoft Cloud Competency partners to help your adoption of Azure. In the second half of November we published 80 new offers.


A Cloud Guru's Azure This Week - 21 December 2018 (Christmas special!)

In this Christmas special edition of Azure This Week, Lars talks about static websites on Azure Storage now being generally available, the preview of neural network text-to-speech with Jessa and Guy, and some more serverless Azure news with Azure Functions.

Thumbnail from A Cloud Guru's Azure This Week - 21 December 2018 (Christmas special!)

R 3.5.2 now available


R 3.5.2, the latest version of the R language for statistical computation and graphics from the R Foundation, was released today. (This release is codenamed "Eggshell Igloo", likely in reference to this or this Peanuts cartoon.) Compared to R 3.5.1, this update includes only bug fixes, so R scripts and packages compatible with R 3.5.0 or R 3.5.1 should work without modification.

Installers for R on Windows and Debian Linux are available now; other platforms should be available from CRAN within the next few days. If you want to see the full list of fixes, check out the announcement from the R Core Group at the link below.

R-announce mailing list: R 3.5.2 is released

Because it’s Friday: Happy Holidays


🎵 He'd better watch out // he'd better comply 🎵:

This is my favourite festive GDPR gag of Christmas 2018 so far. pic.twitter.com/2HheWIkviV

— Charlie King (@charlietheking) November 7, 2018

With the holiday season upon us, we'll be taking a little break here at the blog to relax and spend time with the family. We hope you get to do the same, and best wishes to all our readers for the season. We'll be back early in the new year: Happy New Year to all!

The Fun of Finishing – Exploring old games with Xbox Backwards Compatibility


Star Wars: KOTOR

I'm on vacation for the holidays and I'm finally getting some time to play video games. I've got an Xbox One X that is my primary machine, and I also have a Nintendo Switch that is a constant source of joy. I recently also picked up a very used original PS4 just to play Spider-Man, but expanded to a few other games as well.

One of the reasons I end up using my Xbox more than any of my other consoles is its support for Backwards Compatibility. Backwards Compat is so extraordinary that I did an entire episode of my podcast on the topic with one of the creators.

The general idea is that an Xbox should be able to play Xbox games. Let's take that even further: today's Xbox should be able to play today's Xbox games AND yesterday's...all the way back to the beginning. One more step further, shall we? Today's Xbox should be able to play all Xbox games from every console generation, and they'll look better than you imagined them!

The Xbox One X can take 720p games and upscale them to 4k, use higher quality textures, and some games like Final Fantasy XIII have even been fully remastered but you still use the original disc! I would challenge you to play the original Red Dead Redemption on an Xbox One X and not think it was a current generation game. I recently popped in a copy of Splinter Cell: Conviction and it automatically loaded a 5-year-old save game from the cloud and I was on my way. I played Star Wars: KOTOR - an original Xbox game - and it looks amazing.

Red Dead Redemption

A little vacation combined with a lot of backwards compatibility has me actually FINISHING games again. I've picked up a ton of games this week and finally had that joy of finishing them. Each game I started up that had a save game found me picking up 60% to 80% of the way through. Maybe I got stuck, perhaps I didn't have enough time. Who knows? But I finished. Most of these finishes took just 3 to 5 hours of pushing on from my current (old, original) save games.

  • Crysis 2 - An Xbox 360 game that now works on an Xbox One X. I was halfway through and finished it up in a few days.
  • Crysis 3 - Of course I had to go to the local retro game trader and pick up a copy for $5 and bang through it. Crysis is a great trilogy.
  • Dishonored - I found a copy in my garage while cleaning. Turns out I had a save game in the Xbox cloud since 2013. I started right from where I left off. It's so funny to see a December 2018 save game next to a 2013 save game.
  • Alan Wake - Kind of a Twin Peaks type story, or a Stephen King with a flashlight and a gun. Gorgeous game, and very innovative for the time.
  • Mirror's Edge - Deceptively simple graphics that look perfect on 4k. This isn't just upsampling, to be clear. It's magic.
  • Metro 2033 - Deep story and a lot of world building. Oddly I finished Metro: Last Light a few months back but never did the original.
  • Sunset Overdrive - It's so much better than Jet Set Radio Future. This game has a ton of personality, and they recorded ALL the lines twice with a male and a female voice. I spoke to the voiceover artist for the female character on Twitter and I really think her performance is extraordinary. I had so much fun with this game that now the 11-year-old is starting it up. An under-respected classic.
  • Gears of War Ultimate - This is actually the complete Gears series. I was over halfway through all of these but never finished. Gears are those games where you play for a while and end up pausing and googling "how many chapters in gears of war." They are long games. I ended up finishing on the easiest difficulty. I want a story and I want some fun but I'm not interested in punishment.
  • Shadow Complex - Also surprisingly long, I apparently (per my save game) gave up with just an hour to go. I guess I didn't realize how close I was to the end?

I'm having a blast (while the spouse and kids sleep, in some cases) finishing up these games. I realize I'm not actually accomplishing anything, but the psychic weight of the unfinished is being lifted in some cases. I don't play a lot of multiplayer games as I enjoy a story. I read a ton of books and watch a lot of movies, so I look for a tale when I'm playing video games. They are interactive books and movies for me, with a complete story arc. I love it when the credits roll. A great single-player game with a built-up universe is as satisfying (or more so) as finishing a good book.

What are you playing this holiday season? What have you rediscovered due to Backwards Compatibility?


Sponsor: Preview the latest JetBrains Rider with its Assembly Explorer, Git Submodules, SQL language injections, integrated performance profiler and more advanced Unity support.



© 2018 Scott Hanselman. All rights reserved.
     

Q# – a Wish List for the New Year


In previous blog posts you have read about some of the ideas behind Q#, how it came into existence, and its development over the past year. You have read about quantum computing, quantum algorithms and what you can do with Q# today. With the end of the year approaching, there is only one more thing to cover: What is next?

This blog post is about our aspirations for the future and how you can help to accomplish them. It contains some of our visions going forward, and we would love to hear your thoughts in the comment section below.

Community

One of the most exciting things about Q# for us is the growing community around it. Being rooted in the principles of quantum mechanics, quantum computing tends to have an air of unapproachability to the “uninitiated.” However, quantum computing builds on the notion of an idealized quantum system that behaves according to a handful of fairly easy-to-learn principles. With a little bit of acquired background in linear algebra, some persistence, and patience when wrapping your head around how measurements work, it is possible to get knee-deep into quantum algorithms reasonably quickly!
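
To make that concrete, the entire single-qubit story fits in two lines of linear algebra: a state is a unit vector of complex amplitudes, and measurement probabilities are the squared magnitudes of those amplitudes (the Born rule):

\[
\lvert \psi \rangle = \alpha \lvert 0 \rangle + \beta \lvert 1 \rangle,
\qquad \lvert \alpha \rvert^{2} + \lvert \beta \rvert^{2} = 1,
\]
\[
P(\text{outcome } 0) = \lvert \alpha \rvert^{2},
\qquad
P(\text{outcome } 1) = \lvert \beta \rvert^{2}.
\]

Everything beyond this (multiple qubits, unitary operations, entanglement) is built from tensor products of exactly this structure.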

Of course, a couple of good blog posts on some of these principles can help. We strive to actively support you in the adventure of exploring quantum algorithms by providing materials that help you get started, like our growing set of quantum katas. Our arsenal of open source libraries provides a large variety of building blocks to use in your quest of harnessing the power of quantum. One of the main benefits of open source projects is being able to share your work with all the people brave enough to explore the possibilities that quantum has to offer. Share your progress and help others build on your achievements! Whether in kata or library form, we welcome contributions of any size to our repositories. Let us know how we can help to make contributing easier.

Exchange among developers is one of the most important aspects of software development. It is omnipresent and vital to building a sustainable environment around a particular toolchain and topic. Thankfully, modern technology has made that exchange a lot easier than when the first computer programmers started their careers. We intend to make full use of the power of the internet to give developers around the world a voice and a platform for discussions on topics related to Q# and quantum computing. The Q# dev blog is part of this effort. Contact us or comment below if you have an idea for a blog post or would like to hear more about a specific topic related to Q#. Establishing good feedback channels is always a challenging endeavor, particularly for a small team like ours. We would like this place to become a source of knowledge and exchange, a place where you can find the latest news and voice your take on them.

Growth

This brings us back to our plans for Q#. We have built Q# to make quantum development easier and more accessible. Of course, a couple of other considerations have also played into that decision. For instance, we anticipate the need to automate what is largely done by manual labor today, e.g., qubit layout and gate synthesis, which are often still done on a case-by-case basis for each program and targeted hardware. When is the last time you worried about how error correction works on the hardware your code gets executed on? With qubits being an extremely scarce resource, and with the long-term ambition to use quantum computing to address the most computationally intensive tasks that cannot be tackled with current hardware, the optimization of large-scale quantum programs needs to be a priority. We chose to develop our own language in order to have full control and flexibility over what information is represented and how, and when it is used during compilation, so that we can support a modular and scalable software architecture for executing quantum programs. But that’s a tale for another time. What is important is that these considerations are key factors in how we design and develop the language going forward.

A programming language is more than just a convenient set of tools for expressing an algorithm. It shapes the way that we think and reason about a problem, how we structure it and break it down into tasks when building a solution. A programming language can have a tremendous impact on our understanding of existing approaches, as well as how to adapt and combine them for our purposes. Particularly so when venturing into new territory.

Our goal is therefore to build a shared understanding of what it is we strive to accomplish, and to evolve Q# into the powerful language needed to drive progress in quantum programming. Our goal is to leverage the expertise of a community of language designers, compiler veterans, quantum physicists, algorithms and hardware experts, and a variety of software developers to shape a new kind of computing architecture. And we want you to be part of it.

Transparency

Since our 0.3 release at the beginning of November we have been eagerly working on not just the next release, but on defining and preparing the next steps in 2019. While we are in the middle of formulating our plans for the future, I want to give you a brief insight into some of our considerations.

As I am sure you have noticed, the support for data structures in Q# is minimal. While we do provide quite a few high-level language features for abstracting classical and quantum control flow, we intentionally omit some of the more object-oriented mechanisms such as classes. We anticipate remaining heavily focused on transformations that modify the quantum state, expressed as operations in Q#, as well as on their characteristics and relations. However, basic bundling of data and manipulation of such bundles is of course an important aspect of many programs, and we want to provide suitable mechanisms to express these in a way that allows for abstraction, is convenient, and is resistant to coding errors.

User-defined types in the current setting have limited power beyond increased type safety. The “black box” approach to type parameterization currently restricts their usefulness: we do not provide a mechanism for dynamic reflection, and it is not possible to apply operators or other type-specific functionality to argument items whose type is resolved for each call individually. In that sense, these items are “black boxes” that can merely be passed around. We want to do as much of the heavy lifting as possible statically, in particular since debuggability of quantum devices is a huge challenge.

There are several mechanisms one might consider to alleviate the consequences of these decisions. On one hand, type constraints are a common mechanism used in several popular languages; in a sense, they can be seen as specializations based on the properties of a type. One could also pursue the stricter path of specializing based on the concrete type itself, de facto adding a form of overloading that we currently explicitly prevent from being used. Either way, by clearly separating user-defined types from tuples in the type system, we have made a first step towards extending their power.

If you are curious to hear more about possible ideas for Q#, their benefits and caveats, or want to share some thoughts of your own, comment below! Contribute to the discussion and post your speculations to the question: What makes a quantum programming language “quantum”, i.e. what makes it particularly suited for quantum computing?

Join us

I hope you join us in a new year of pushing the boundaries of computation by participating in our coding competitions, contributing to our open source repositories, commenting on or writing blog posts, and sharing your ideas and experiences!

How about a new year’s resolution of your own? Let us know what you expect to accomplish and how we can help you achieve your new year’s resolution around quantum programming in Q#!

Bettina Heim, Senior SDE, Quantum Software and Application
@beheim
Bettina Heim is a quantum physicist and software engineer working in the Quantum Architectures and Computation Group at Microsoft Research. She is responsible for the Q# compiler and part of the Q# language design team. Prior to joining Microsoft she worked on quantum algorithms, adiabatic quantum computing, discrete optimization problems, and the simulation and benchmarking of quantum computing devices.

Using Visual Studio Code to program Circuit Python with an AdaFruit NeoTrellis M4


My son and I were working on an Adafruit NeoTrellis M4 Mainboard over the holidays. This amazing little device packs NeoPixels, an audio board, a USB port, a 120 MHz Cortex M4 core, and a mic amplifier into one board, and you can program it with CircuitPython. CircuitPython is open source and on GitHub at https://github.com/adafruit/circuitpython. "CircuitPython is an education friendly open source derivative of MicroPython." It works with a bunch of boards, including this NeoTrellis, and it's just lovely for teaching and learning.

This item is just the mainboard! You'll almost certainly want two Silicone Elastomer 4x4 Pads and an enclosure to go along.

Circuit Python

As with a lot of these small boards, when you plug a NeoTrellis into your machine via USB you'll get a new disk drive that pops up. All you have to do to "deploy" your code is copy it to that drive. Even better, why not just edit the code in place?

There's a great Python editor called Mu that works well with Circuit Python. However, my son and I are more familiar with Visual Studio Code so we wanted to see how it worked with Circuit Python.

We installed the Python extension for VS Code as well as the Arduino extension for VS Code and the Arduino IDE directly from the Windows Store.

Fire up VS Code, then File | Open Folder, open the disk drive of the NeoTrellis, and open (or create) a code.py file. Then, from the Command Palette (Ctrl-Shift-P) in VS Code, select Arduino > Initialize. If you get an error, you may need to set up the path to your Arduino IDE. If you installed it from the Windows Store like we did, you may find it in a weird path. We set the arduino.path in settings.json like this (note that backslashes must be escaped in JSON):

"arduino.path": "C:\Program Files\WindowsApps\ArduinoLLC.ArduinoIDE_1.8.19.0_x86__mdqgnx93n4wtt"

The NeoTrellis M4 also shows up as a COM port, so you can look at its serial output for debugging purposes as if it were an Arduino (because it is, underneath). You then run Arduino > Select a COM Port from the Command Palette, and it will create a file called .vscode/arduino.json in your folder that looks like this:

{
    "port": "COM3"
}

Trellis M4 is awesome

Now, within Visual Studio Code, select Arduino > Open Serial Monitor and all of your print("") calls will output to that bottom pane.

Of course, we could PuTTY into the COM port, but since I'm using this as a learning tool with my 11-year-old, I find that a single window showing both the console and the code helps them focus, rather than managing multiple windows.

At this point we have a nice Developer Inner Loop going. That inner loop for us (the developers) is that we can write some code, hit save (Ctrl-S) and get immediate feedback. The application restarts when it detects the code.py file has changed and any debug (print) statements appear in the console immediately.

Visual Studio Code doing some Circuit Python
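
To make that inner loop concrete, here is the shape of a minimal code.py we iterated on; it assumes the adafruit_trellism4 library from the Adafruit CircuitPython bundle has been copied to the board's lib folder:

# Minimal code.py sketch for the NeoTrellis M4 (assumes the adafruit_trellism4
# library from the Adafruit CircuitPython bundle is in the board's lib folder).
import time
import adafruit_trellism4

trellis = adafruit_trellism4.TrellisM4Express()
trellis.pixels.fill((0, 0, 0))           # start with every pad dark

held = set()
while True:
    pressed = set(trellis.pressed_keys)  # (x, y) tuples of pads held down
    for key in pressed - held:
        print("pressed:", key)           # appears in the Serial Monitor pane
        trellis.pixels[key] = (0, 32, 0) # light the newly pressed pad green
    for key in held - pressed:
        trellis.pixels[key] = (0, 0, 0)  # turn the pad back off on release
    held = pressed
    time.sleep(0.02)

Save the file, the board restarts the program, and the print output shows up in the serial pane a moment later.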

We are really enjoying this Adafruit NeoTrellis M4 Express kit. Next we're going to make a beat sequencer since the Christmas Soundboard was such a hit with mom!


Sponsor: Me! Hey friends, I've got a podcast I'm very proud of where I interview amazing people every week. Check it out at https://www.hanselminutes.com and please not only subscribe in your favorite podcasting app, but also tell your friends! Tweet about it and review it on iTunes/Google Play. Thanks!



© 2018 Scott Hanselman. All rights reserved.
     

Introducing new advanced security and compliance offerings for Microsoft 365

MyAnalytics, the fitness tracker for work, is now more broadly available

Notebooks from the Practical AI Workshop


Last month, I delivered the one-day workshop Practical AI for the Working Software Engineer at the Artificial Intelligence Live conference in Orlando. As the title suggests, the workshop was aimed at developers, but I didn't assume any particular programming language background. In addition to the lecture slides, the workshop was delivered as a series of Jupyter notebooks. I ran them using Azure Notebooks (which meant the participants had nothing to install and very little to set up), but you can run them in any Jupyter environment you like, as long as it has access to R and Python. You can download the notebooks and slides from this GitHub repository (and feedback is welcome there, too).

The workshop was divided into five sections, each with its associated Notebook:

  1. The AI behind Seeing AI. Use the web interfaces to Cognitive Services to learn about the AI services behind the "Seeing AI" app.
  2. Computer Vision API with R. Use an R script to interact with the Computer Vision API and generate captions for random Wikimedia images.
  3. Custom Vision with R. An R function to classify an image as a "Hot Dog" or "Not Hot Dog", using the Custom Vision service.
  4. MNIST with scikit-learn. Use scikit-learn to build a digit recognizer for the MNIST data using a regression model (a condensed sketch follows this list).
  5. MNIST with Tensorflow. Use Tensorflow (from Python) to build a digit recognizer for the MNIST data using a convolutional neural network.
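
For a taste of the scikit-learn notebook, the heart of the regression-model recognizer condenses to something like the sketch below. Fetching MNIST via OpenML and the exact hyperparameters are my simplifications here; refer to the notebooks for the version used in the workshop.

# Condensed sketch of the MNIST digit recognizer with a regression model.
from sklearn.datasets import fetch_openml
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

X, y = fetch_openml("mnist_784", version=1, return_X_y=True)
X = X / 255.0  # scale pixel values into [0, 1]
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=10000, random_state=0)

# Multinomial logistic regression over the ten digit classes.
# (A low max_iter keeps the sketch quick; expect a convergence warning.)
clf = LogisticRegression(solver="saga", multi_class="multinomial", max_iter=50)
clf.fit(X_train, y_train)
print("test accuracy:", clf.score(X_test, y_test))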

The workshop was a practical version of a talk I also gave at AI Live, "Getting Started with Deep Learning", and I've embedded those slides below. 

This is an embedded Microsoft Office presentation, powered by Office Online.

Azure Notebooks: Practical AI for the Working Software Engineer
