What Is the Ideal Microsoft Azure Cloud Stack?

Microsoft Azure offers a set of cloud solutions to build, deploy, and manage applications. Azure continues to evolve, adding new features to existing services and rolling out entirely new ones. Keeping up with this constant rollout is a challenge for any development team, so we do it for you. At CSG, we’ve implemented solutions on Azure that take advantage of its many unique services, and along the way we’ve developed best practices and methodologies for creating powerful solutions. With that broad and deep experience in mind, a picture of what makes for the ultimate Azure stack has slowly materialized within the four walls of our organization. The best option will differ depending on your company’s wants and needs, and it should be constructed by someone with the appropriate expertise, but we’ve found some basic guidelines that apply to many situations. And they are guidelines we’re more than happy to share. Whether you capitalize on our services or choose another solutions provider, here is what to consider when constructing the ideal Azure cloud stack for your organization.

Azure SQL Data Warehouse

The inherent flexibility of Microsoft’s Azure SQL Data Warehouse makes it the wisest choice for any business looking to process its data in the cloud. Microsoft has structured its data warehouse offering in a clever way: compute can grow and shrink in seconds, independent of data storage, so you pay only for the query performance you actually need. You can also pause compute at any time, meaning you incur compute costs only when you choose to. It’s true pay-as-you-go. In essence, this inextricably links Microsoft’s success with that of Azure users: if the product works as it should, developers will use it often, and both Microsoft and their businesses are better off; if it doesn’t, developers won’t use it, and Microsoft will find itself in a hole. This alignment of interests is perhaps Azure SQL Data Warehouse’s defining feature. Azure has made a name for itself as a cleaner, smoother utility for developers, helped by the fact that the data warehouse is offered as an integrated part of many Dell, Lenovo and HPE products.
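To make the pay-for-what-you-use model concrete, here is a minimal sketch of scaling compute from Python via T-SQL. It assumes pyodbc and a SQL Server ODBC driver are installed; the server, database, and credentials are placeholders. (Pausing, as opposed to scaling, is done from the portal, PowerShell, or the Azure CLI rather than T-SQL.)

```python
# Scale Azure SQL Data Warehouse compute up or down from Python.
# Placeholder names throughout -- adapt to your own server and database.
import pyodbc

conn = pyodbc.connect(
    "DRIVER={ODBC Driver 17 for SQL Server};"
    "SERVER=myserver.database.windows.net;"   # placeholder server
    "DATABASE=master;"                        # scaling is issued against master
    "UID=myadmin;PWD=<password>",
    autocommit=True,                          # ALTER DATABASE cannot run inside a transaction
)

# Scale compute up to 400 DWUs ahead of a heavy reporting window, then back
# down afterwards. Storage is billed separately, so this changes only what
# you pay for query horsepower.
conn.execute("ALTER DATABASE mydw MODIFY (SERVICE_OBJECTIVE = 'DW400')")
```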

Tableau

The search for business intelligence (BI) or data visualization tools inevitably brings up two names: Power BI and Tableau. Tableau is the more established of the two; it was founded in 2003 and has been seen as the gold standard ever since. Power BI is quickly gaining ground, despite giving Tableau a ten-year head start, and as a Microsoft product it integrates seamlessly into the Azure framework. At CSG, we continue to use both Tableau and Power BI as we develop solutions for our customers. We like both of them, for different reasons and for specific situations, but if we had to cast a vote, we’d probably say Tableau.

Tableau offers an optimized connector to Azure SQL Data Warehouse, which makes it easy for our customers to interact with data and share their findings. The scalability of Azure SQL Data Warehouse complements Tableau’s powerful visualization and analytics capabilities. At this point in time we think Tableau is the clear winner in the head-to-head battle with Power BI: it offers greater ease of use, an active user community, unparalleled support for its user base, and pre-built connectors to Azure and a variety of other data sources. While Power BI may be a Microsoft product, it’s clear that the rest of the Microsoft Azure services and Tableau complement each other and work incredibly well together.

Cosmos DB

If ‘managing data at planet scale’ (as Microsoft puts it) is a necessity for your business, or may prove necessary in the future, Microsoft Azure’s Cosmos DB is a fantastic solution. A multi-model database service that makes it simple to scale and replicate your data wherever your users are, Cosmos DB lets you build responsive, scalable applications for even the most scattered and far-flung teams. Cosmos DB takes cloud platform as a service (PaaS) to the next level. As a turnkey system that distributes data to any number of global regions, latency is kept to an absolute minimum: Microsoft quotes expected latency of less than 10 milliseconds for reads and less than 15 milliseconds for writes. With 99.999% availability guaranteed, it doesn’t matter whether a user is in New York, Paris or Delhi; collaboration will always be a seamless experience. Other Cosmos DB benefits include:

  • It’s a complete service, and is ready to use instantly

  • Users can access data with their API of choice, including JavaScript, SQL, Azure Table Storage, Gremlin and MongoDB

  • Five consistency levels are offered, making for a truly intuitive programming model: strong, bounded staleness, consistent prefix, session and eventual

  • Industry-leading Service Level Agreements (SLAs) have been crafted for enterprise-grade use
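As an illustration of how approachable the service is, here is a minimal sketch using the azure-cosmos Python SDK (v4-style API) against the SQL API; the account URL, key, and database/container names are placeholders.

```python
# Write to and query a globally distributed Cosmos DB container.
from azure.cosmos import CosmosClient

client = CosmosClient("https://myaccount.documents.azure.com",
                      credential="<account-key>")          # placeholder account
container = (client.get_database_client("orders-db")
                   .get_container_client("orders"))

# Writes are replicated to whatever regions the account is configured for.
container.upsert_item({"id": "order-1001", "customer": "contoso", "total": 42.50})

# Query with the SQL API; MongoDB, Gremlin, and Table APIs are also available.
for item in container.query_items(
        query="SELECT * FROM c WHERE c.customer = 'contoso'",
        enable_cross_partition_query=True):
    print(item["id"], item["total"])
```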

Azure Batch

For extra computing power when you need it, there’s no better service in our eyes than Azure Batch. From humble punch-card beginnings, batch computing has for decades given organizations the computing power to process automated tasks efficiently. While that capability was historically limited to a select few organizations and governmental bodies, the onset of cloud computing has brought batch processing to the masses. Imagine having access to 10, 1,000 or even 100,000 cores whenever you need them, and only ever paying for the computing power you require at a given moment. That’s exactly what Azure Batch offers: a pay-as-you-go service with no capital investment required that can scale to thousands of virtual machines, auto-scale capacity based on the work in the queue, and cloud-enable HPC and batch applications.
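As a sketch of what that looks like in practice, the snippet below submits a job of independent tasks with the azure-batch Python SDK. The account, pool, and command line are placeholders, and the exact client constructor argument (batch_url vs. base_url) varies by SDK version, so treat this as illustrative rather than definitive.

```python
# Fan work out across a pool of VMs with Azure Batch.
from azure.batch import BatchServiceClient
from azure.batch.batch_auth import SharedKeyCredentials
import azure.batch.models as batchmodels

creds = SharedKeyCredentials("mybatchaccount", "<account-key>")   # placeholders
client = BatchServiceClient(
    creds, batch_url="https://mybatchaccount.eastus.batch.azure.com")

# A job runs on an existing pool of VMs; the pool itself can be
# configured to auto-scale with the size of the task queue.
client.job.add(batchmodels.JobAddParameter(
    id="nightly-render",
    pool_info=batchmodels.PoolInformation(pool_id="render-pool")))

# add_collection accepts up to 100 tasks per call; chunk larger workloads.
tasks = [
    batchmodels.TaskAddParameter(
        id=f"frame-{i}",
        command_line=f"/bin/bash -c 'render --frame {i}'")   # placeholder command
    for i in range(100)
]
client.task.add_collection("nightly-render", tasks)
```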

Azure Data Lake

Another powerful element of the stack is a data lake. Where a data warehouse neatly organizes your data like books on a library shelf, extracting, transforming and loading it for quick and easy use, a data lake leaves all files in their native format. ETL processes are conducted on an ad hoc basis: schema and data requirements remain undefined until the data is queried, a pattern often called schema-on-read. Data lakes have many benefits, most notably:

  • Ultimate scalability

  • Low-cost storage

  • High agility and easy reconfiguration

  • Equally supportive of all users and all data types

  • Potentially faster insights

In terms of Azure Data Lake specifically, you’ll enjoy the ability to store and analyze petabyte-size files and trillions of objects. You can start a data lake in seconds, scale it instantly, and enjoy enterprise-level security, auditing and support. With HDInsight, Microsoft has created what it calls the only fully managed cloud Hadoop offering, providing optimized open-source analytic clusters for Spark, Kafka, MapReduce, HBase, Hive, Storm and R Server, all backed by a 99.9% SLA. And as with all other Azure products, you’ll only ever pay for what you use.
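To show schema-on-read in action, here is a minimal PySpark sketch of the kind you might run on an HDInsight Spark cluster with access to the lake; the adl:// account, paths, and column names are assumptions for illustration.

```python
# Query raw JSON files in the data lake without any upfront ETL.
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("lake-exploration").getOrCreate()

# Schema-on-read: the files stay in their native format in the lake,
# and structure is inferred only when the data is queried.
events = spark.read.json(
    "adl://mydatalake.azuredatalakestore.net/raw/clickstream/2018/")

events.filter(events.country == "US").groupBy("page").count().show()
```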

Azure Data Factory

Just as a bricks-and-mortar factory turns raw materials into something useful, Azure Data Factory turns your business’s zeroes and ones into usable data ready for querying. The ETL/ELT processes needed to organize and integrate data in a repository can be complex and taxing, but with Azure Data Factory they have never been simpler or quicker. Described by Microsoft as a hybrid data integration service, Azure Data Factory uses a four-step process to create, schedule and orchestrate ETL/ELT workflows at whatever scale is required, whether the data resides in the cloud or on your business’s own self-hosted network. Data Factory first ingests and prepares the data, then transforms and analyzes it, and finally publishes it for consumption. Once data is published, a user can monitor and manage pipelines to ensure everything is running smoothly; Data Factory can alert a user to potential issues and automate a huge number of previously manual tasks and checks.
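For the monitoring step, pipeline runs can also be triggered and polled programmatically. Below is a hedged sketch using the azure-mgmt-datafactory SDK; the service principal, resource group, factory, and pipeline names are all placeholders.

```python
# Trigger an existing Data Factory pipeline on demand and check its status.
from azure.common.credentials import ServicePrincipalCredentials
from azure.mgmt.datafactory import DataFactoryManagementClient

creds = ServicePrincipalCredentials(
    client_id="<app-id>", secret="<app-secret>", tenant="<tenant-id>")
adf = DataFactoryManagementClient(creds, "<subscription-id>")

# Kick off a run of a pipeline that already exists in the factory...
run = adf.pipelines.create_run(
    "my-rg", "my-factory", "CopySalesData", parameters={})

# ...then poll it; status moves through Queued / InProgress / Succeeded / Failed.
status = adf.pipeline_runs.get("my-rg", "my-factory", run.run_id)
print(status.status)
```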

Azure Cognitive Services

With potentially petabytes of data to deal with, analyzing your business’s data and gaining insights from it is an imposing task. The human mind is amazing at many things, but extracting knowledge from a sea of binary is not one of them. So why not leave this task to a better-suited brain? Azure Cognitive Services brings artificial intelligence to the stack. Used by the likes of Uber to gain insights from data, these services can solve an incredible variety of business problems through a range of APIs. Cognitive Services can identify visual content in an image or video, map complex data and offer up intelligent recommendations, and even identify a speaker by analyzing nothing more than their voice. With Azure Cognitive Services, the potential to gain insights from your data, and therefore the potential for your organization to grow and develop, is limitless.
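As a flavor of how approachable these APIs are, here is a minimal sketch calling the Computer Vision analyze endpoint over plain REST with the requests library; the region, API version, image URL, and key are placeholders.

```python
# Ask Computer Vision to describe and tag an image by URL.
import requests

endpoint = "https://westus.api.cognitive.microsoft.com/vision/v2.0/analyze"
resp = requests.post(
    endpoint,
    params={"visualFeatures": "Description,Tags"},
    headers={"Ocp-Apim-Subscription-Key": "<your-key>",   # placeholder key
             "Content-Type": "application/json"},
    json={"url": "https://example.com/warehouse-photo.jpg"},
)

analysis = resp.json()
print(analysis["description"]["captions"][0]["text"])  # one-line caption of the image
```

That’s our 2018 take on the ideal Microsoft Azure cloud stack. As you can see, there is a wide variety of services available (more than we covered here) designed to help you implement your strategy with cutting-edge technology that scales to your needs. What’s your ideal Azure stack? In the end, it depends on the solution you’re trying to implement. These tools evolve constantly, so check back for more insights from CSG.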


Considering moving your BI to the cloud?

Read our article about what to consider before you make the move.
