
Ashley Gatehouse

I am the Group Chief Marketing Officer at Crayon. My team is focused on driving enhanced lead generation and nurturing campaigns for our sales organisations across multiple geographies through the utilisation and coordination of all online and offline communication channels. We are driving increased brand awareness in the business’s core competency areas of Software Asset Management (SAM), cloud and volume licensing solutions, and associated consultancy services. I have over 20 years of senior business leadership experience within direct marketing/direct sales and mass distribution businesses, in both the B2B and B2C markets, serving on the boards of both private and public multinational corporations.


Today, CIOs are under more pressure than ever to deliver agile IT transformation that aligns with and accelerates business imperatives. Fortunately, there are tools at hand to help them on this journey, says Ashley Gatehouse.

Digital 360 & The Journey Beyond Cobb’s Paradox

Back in 1995, a Canadian Government IT specialist was reviewing the outcome of a series of complex information and communications technology (ICT) projects when he came out with a phrase that would come to haunt enterprises for years to come.

“We know why projects fail; we know how to prevent their failure – so why do they still fail?” mused Cobb.

Today, that all-too-familiar phrase is known as ‘Cobb’s Paradox’, and over the years the evidence suggests that, rather than improving, things may have been getting worse when it comes to the delivery of IT projects.

With so much at stake and large IT projects often coming in way over budget, there are also very real operational risks posed when things go wrong. A 2012 study from McKinsey, in conjunction with the University of Oxford, suggested that things hadn’t improved much: 17 percent of large IT projects were veering so far off course that they threatened the very existence of the companies involved. These large IT projects, with price tags of over $15 million, were also running at nearly 50 percent over budget whilst delivering 56 percent less value than originally predicted. To put that into perspective, at the time the aggregate cost overrun of these IT projects eclipsed $66bn – more than the GDP of Luxembourg.

Fast forward to 2017 and although there has been some improvement, a report from the Project Management Institute (PMI) suggests that around 14 percent of IT projects still end in failure. Adding further context to this is the fact that of those projects that didn’t fail outright, 31 percent didn’t meet their goals either, with 43 percent coming in over budget. A further 49 percent were delivered late. So, for every $1 billion invested, firms were wasting $97m due to poor project performance.
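The PMI figures quoted above boil down to a simple waste rate that can be applied to a portfolio of any size; the $250m portfolio below is purely illustrative:

```python
# PMI (2017): $97m wasted for every $1bn invested, i.e. a 9.7% waste rate.
WASTE_RATE = 97_000_000 / 1_000_000_000

def projected_waste(portfolio_usd: float) -> float:
    """Estimate waste due to poor project performance at the PMI rate."""
    return portfolio_usd * WASTE_RATE

print(f"{WASTE_RATE:.1%}")                      # 9.7%
print(f"${projected_waste(250_000_000):,.0f}")  # $24,250,000
```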

Such statistics make for tough reading for CXOs as they attempt to deliver increasingly complicated projects against a background of belt-tightening and budget cuts that has typically seen a reduction of in-house specialist staff and skills within the IT function.

Of course, the nature of delivery has also changed, and new techniques and methodologies such as agile and DevOps are helping enterprises to deliver upon board requests and management priorities. Yet projects are still failing, with reasons ranging from lack of executive sponsorship and user buy-in to poor collaboration between the IT and business functions, as firms falter from a lack of direction and, consequently, distrust from users.

Clearly, strategic planning that defines key metrics and marries these with business objectives is key. Enterprises must think about how projects are resourced, how they align and stay aligned with desired outcomes, the timeline for doing so, how they avoid poor governance and critically, how they keep on budget.

For the beleaguered contemporary CIO and their team, it’s a lot to think about, so what should businesses be considering when they embark upon their digital transformation journey?

Firstly, remember you are not alone. Success is only truly achievable when technologies, stakeholders and business needs are all aligned, so it’s important that you have a formula for keeping these on track.

The good news is that help is at hand in the form of new and free-to-access digital transformation frameworks, such as Digital 360, that provide solid guides for CIOs and their management teams attempting to take enterprises on this journey.

Consider how such methodologies can help keep project lifecycles on track. One of the main problems associated with digital transformation and delivery is the changes that occur in business priorities and objectives over time as builds take place. By being able to closely monitor where you are in your project lifecycle you can ensure effective communication and collaboration between IT and the business it serves to prevent the risk of a misalignment developing.

Moreover, by aligning with worldwide industry standards and best practice – such as ISO and ITAM – Digital 360 will allow you to regularly self-certify where you are within your project lifecycle and provide automated suggestions to ensure that no stages or elements of your project plan remain incomplete. In addition, it provides technical service proposals to help you at each stage of the project in case you lack relevant in-house expertise.

So how does it do this? Firstly, the solution breaks your proposed IT project workflow up into five core actionable steps based on accepted best practice frameworks – Business Needs, Solution Design, Transition to Cloud, Manage and Governance.

This allows you to devise and plan every stage, minimising any chance of a disconnect from the start. In Business Needs, you can set out the scope, constraints, and expectations for a project. It’s here that you can create your vision for the business, define who the stakeholders are, validate any solution within the business context, and create statement(s) of work before obtaining approvals.

The next stage is Solution Design. This is where you can develop target architectures and define the types and sources of data needed to support the business in a way that can be understood by your stakeholders. Here you’ll also define the kinds of application systems necessary to process the data and support the business. Furthermore, Digital 360 allows you to develop data architecture reports and applications architecture reports, as well as showing gap analysis and impact analysis results.

Once you’ve done that you can move onto the next stage: Transition to Cloud. At this point you’ll be able to sort the various implementation projects into a series of priorities for the business. In doing so, you’ll produce a prioritised list of projects that will form the basis of the detailed implementation and migration plans whilst generating an impact analysis, including an architecture implementation contract.

By the time you get to stage four, Manage, you’ll be concerned specifically with the services offered and what will be delivered. In particular, this deals with the required service levels, availability, continuity, financial viability, security, and the necessary IT infrastructure capacity that you need to make your project a success.

Governance is the last stage of your digital transformation journey, where you will be looking at how to monitor and move in line with everyday changes to the business that may affect the delivery of your project. This enables you to assess those changes and ensure you have developed a position to act and perform appropriate governance functions, including budget management, contract compliance, data management and cyber security, whilst the system is being implemented and deployed.

By focusing on what you are trying to achieve and breaking it down into these five core steps, CIOs can minimise the risk of failure and keep projects on-time and on-budget to mitigate risk and realise business objectives.

One of the major challenges faced by organisations is how to efficiently manage their data centres and public cloud infrastructures. We look at how hyperconverged infrastructure and Azure could solve this challenge.

Can Azure and hyperconvergence make your enterprise more efficient?

Hyperconvergence is gaining traction in the enterprise as more organisations look to it to bring increased efficiency to their operations. The size of the hyperconverged systems market is set to reach $3.9 billion by 2019.

According to figures from IDC, worldwide converged systems market revenue increased 10.8% year over year to $2.99 billion during the third quarter of 2017 (3Q17). The market consumed 1.96 exabytes of new storage capacity during the quarter, which was up 30% compared to the same period a year ago.

Hyperconvergence offers enterprises a simplified way of managing infrastructure, as it enables integrated technologies to be run as a single system through a shared tool set. It goes beyond converged systems, which bring together discrete hardware products sold as a single SKU; hyperconverged systems are integrated to the point that the separate components cannot be broken down.

The hypervisor is more tightly coupled in hyperconverged systems. As well as running hypervisors, a hyperconverged system will manage essential datacentre functions as software on the hypervisor.

Advantages and disadvantages

As briefly mentioned above, the technologies within a hyperconverged system are managed from one tool, and this ease of use is a primary benefit for organisations looking to simplify operations. The console can run all functions, including compute, storage, networking and virtualisation.

Another advantage of hyperconverged systems is the ability to scale up. When an organisation needs more resources, it is simply a case of adding more nodes. However, this also highlights a drawback of such infrastructure; all resources need to be increased when any one resource needs increasing. Newer hyperconverged systems overcome this by being either storage- or compute-centric.
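The coupled-scaling drawback described above can be sketched numerically. Here each node is assumed to add a fixed bundle of compute and storage; the per-node figures are hypothetical, chosen only to illustrate the effect:

```python
import math

# In a hyperconverged cluster, each node adds a fixed bundle of resources,
# so scaling one resource scales them all. Per-node figures are hypothetical.
NODE_CPU_CORES = 32
NODE_STORAGE_TB = 20

def nodes_for_storage(storage_tb_required: float) -> int:
    """Nodes needed to satisfy a storage requirement alone."""
    return math.ceil(storage_tb_required / NODE_STORAGE_TB)

needed = nodes_for_storage(90)           # storage-driven expansion...
surplus_cores = needed * NODE_CPU_CORES  # ...also buys compute you may not need
print(needed, surplus_cores)             # 5 160
```

A storage-centric or compute-centric system, as mentioned above, avoids buying the unneeded half of the bundle.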

Virtualisation

Hyperconverged systems are designed to run virtual machines on a hypervisor that is intended to work specifically with the system. As the system virtualises all resources, these can be modified to accommodate greater or fewer virtual machines without having to suspend them. Once virtual machine capacity has been reached, more nodes have to be added, increasing the compute, storage and networking resources that can be shared with the remaining VMs.

Hyperconvergence with Azure

Microsoft has improved Storage Spaces Direct in Windows Server as part of its hyperconvergence endeavours to decrease costs and ease complexity in its Azure cloud. Windows Server Storage Spaces is software-defined storage, or virtualised storage for Windows Server.

With Microsoft’s hyperconvergence solution, a minimum of two identical nodes is required in each cluster, and each node has its own storage. This combines compute and storage in one cluster. This is great for SMEs, as it lessens complexity and hardware costs.

For larger organisations, storage and compute can be separated by building a Scale-Out File Server (SOFS) on Storage Spaces Direct.

This helps Microsoft deliver private cloud solutions to smaller enterprises. Among these private cloud solutions is Azure Stack. This is an on-premises version of its public cloud that allows organisations to use Azure without porting sensitive data into a multi-tenant environment.

Azure Stack is certified to run on select hardware and will run in the same way as Azure. It also provides a shared management platform between the public and private cloud. Developers can build cloud-native applications that can run either on-premises or in the cloud elsewhere.

The smallest Azure Stack deployment is a four-server rack with three physical switches and a lifecycle management host server. These racks can scale up to 12 servers, and racks can be scaled together. An Azure Stack composed of a full-sized 12-server rack can service 400 virtual machines with 2 CPUs and 7 GB of RAM, with resiliency.
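Taking the quoted density at face value, and assuming it scales linearly across racks (an assumption made here for illustration, not a published guarantee), rough capacity planning looks like this:

```python
# Capacity sketch based on the figures above: one full 12-server rack
# services ~400 VMs at 2 vCPUs and 7 GB RAM each. Linear scaling across
# racks is an assumption made purely for illustration.
VMS_PER_FULL_RACK = 400
VCPUS_PER_VM = 2
RAM_GB_PER_VM = 7

def stack_capacity(racks: int) -> dict:
    """Aggregate VM, vCPU and RAM capacity for a number of full racks."""
    vms = racks * VMS_PER_FULL_RACK
    return {"vms": vms, "vcpus": vms * VCPUS_PER_VM, "ram_gb": vms * RAM_GB_PER_VM}

print(stack_capacity(2))  # {'vms': 800, 'vcpus': 1600, 'ram_gb': 5600}
```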

Combining the hardware with the software gives rise to an offering called an Azure Stack Integrated System. Organisations can purchase hardware from one of the certified vendors (Dell EMC, HPE or Lenovo) and license the Azure Stack software to run on it.

Organisations can use existing Microsoft licenses, including Windows Server, SQL Server and MSDN subscriptions to pay for Azure Stack.

For more information on how Azure can help your business achieve its goals click here.

Machine learning can crunch through your corporate data and unearth a few home truths

How can machine learning help the enterprise architect

Organisations that let data fuel their business decisions are more successful. So how can machine learning take data and use it to manage and organise enterprise systems?

Digital transformation has the potential to overhaul the business practices of many organisations, but the key to its success is data and how it is used to effect change. Depending on who you believe, around 90 percent of all data that has ever existed has been created within the last two years. Only one percent of that data has been analysed, according to McKinsey.

This data has the power to help change organisations for the better. Organisations that enable data to help in making business decisions are often more successful as a result.

Organisations that leave business improvement plans to trial and error and gut feeling often fare worse than those that embrace data. However, there is a problem.

Enterprise architects can use some data to create models, but the data within them can and will age, meaning the conditions a model tackles will have changed during the planning and implementation phases of a transformation.

Digital transformation threatens to make inflexible architectures and data models obsolete before they have even begun.

The way to tackle this is through machine learning. It can take the guesswork out of essential business decisions. The insight from machine learning can help organisations make better predictions through the continuous learning and adaptation of models using real-time data.
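As a minimal sketch of this continuous-learning idea, the toy online perceptron below updates its weights as each new record arrives, rather than being trained once and left to age; the data stream and features are entirely hypothetical, and a production system would use a proper incremental-learning library:

```python
# Minimal sketch of continuous model adaptation: an online learner
# that updates on every new (real-time) record. Hypothetical data.

class OnlinePerceptron:
    def __init__(self, n_features, lr=0.1):
        self.w = [0.0] * n_features  # one weight per feature
        self.b = 0.0                 # bias term
        self.lr = lr                 # learning rate

    def predict(self, x):
        s = sum(wi * xi for wi, xi in zip(self.w, x)) + self.b
        return 1 if s >= 0 else 0

    def update(self, x, y):
        """Adapt the model to one new observation as it arrives."""
        err = y - self.predict(x)
        if err:
            self.w = [wi + self.lr * err * xi for wi, xi in zip(self.w, x)]
            self.b += self.lr * err

# Simulated real-time stream: (features, label) pairs arriving one by one.
stream = [([1.0, 0.0], 1), ([0.0, 1.0], 0), ([1.0, 1.0], 1), ([0.0, 0.0], 0)]
model = OnlinePerceptron(n_features=2)
for x, y in stream * 20:  # repeated passes mimic more data arriving
    model.update(x, y)

print(model.predict([1.0, 0.0]))  # → 1, the pattern learned from the stream
```

The point is the `update` loop: the model never stops training, so its view of conditions tracks the incoming data instead of freezing at planning time.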

Why use machine learning in the enterprise?

Machine learning is relevant to many industries today and is having a huge impact on enterprise architecture. The technology is being used more and more in enterprises to optimise back-office and consumer-facing processes and systems. It will also assist in managing and organising complex enterprise systems.

For the enterprise architect, machine learning will be all about creating models that can use algorithms to ingest large quantities of data – something that is beyond manual processing by a human – in order to provide insights and actionable strategies. From data it can learn what is happening within the organisation and see the patterns that may be invisible to humans.

While machine learning can be thought of as particularly clever and complex pattern recognition, enterprises can use what is called “deep learning” to solve problems that have been beyond the reach of humans.

Using a framework for machine learning

It is an important caveat to mention here that machine learning cannot be used in every situation. It is essential that a framework is in place so that the enterprise architect can provide guidance to the rest of the organisation about the problems needing to be solved.

First, what are the problems that a business is trying to solve? The ones that are data intensive are good areas for machine learning.

Second, machine learning is not just about automating manual processes. The processes that are ripe for machine learning are connected to decision-making and have a cognitive aspect.

Third, enterprise architects should identify those areas where decisions need to be made in real time and where machine learning can be used to decrease response times.

The operational challenge for enterprise architects when dealing with machine learning is identifying where they can increase the intelligence of a system so that enterprises can make faster decisions and gain a competitive advantage.

When embarking on using machine learning in the enterprise, architects need to ensure that the data used to train such systems is of high quality in order to get high-quality results. Get that right and your investments will pay off quickly.

As the countdown to one of the most important pieces of European legislation continues, new options are arriving to facilitate easy management of the new GDPR, data privacy and compliance environment, says Ashley Gatehouse.

GDPR: To Infinity and Beyond

For CIOs and those in IT management positions, the nature of compliance means that there is always something new to consider and the impact of new legislation on the horizon. So, when the EU announced the final approval of its General Data Protection Regulation (GDPR) two years ago, the countdown began to one of the most important changes in data privacy regulation over the last 20 years.

Since then, firms have been on a race to align with the stringent new regulations that are set to reshape the way businesses think about data privacy and which replace the existing Data Protection Directive 95/46/EC.

One of the biggest problems faced by firms handling vast amounts of personal data is the scale of the operation they face and the impact that the GDPR will have on their operating activities – one reason it’s important for enterprises to have a game plan and a partner they can trust.

The trouble is that for those that have yet to finalise their lines of risk assessment and decision-making, time is short.

If you think about the personal data you hold across various applications, networks and cloud infrastructure, the GDPR holds you accountable for that data regardless of where it is stored. Not only that, but when asked, you must be able to explain the lifecycle of that data and whether consent (if applicable) has been obtained or whether your organisation has a legitimate interest in using the data. When requested, you must also be able to remove all personal information, and you must be able to report any data breaches to the relevant supervisory authority within 72 hours of discovery.

However, the good news for data processors and controllers is that there are welcome options on the table that can quickly bring alignment and stave off the potential threat of those EUR 20,000,000 fines or bans on the processing of personal data.

As a starting point there are various free solutions that enable you to perform the first part of your risk assessment in determining whether or not you are ready for the GDPR by taking you through a series of questions relating to areas of the legislation. But once you’ve determined where you are on your journey what else can be done?

A platform, such as GDPR Infinity – as the name suggests – has been designed to help businesses and their designated Data Protection Officers (DPOs) move beyond deadline day with confidence by providing users with a comprehensive set of tools that enable a simple and straightforward long-term approach to data privacy and management excellence through a tailored self-service portal available via one simple subscription.

A key part of this process is its ability to provide enterprises with a Gap Analysis – the steps that need to be taken to enable alignment – with GDPR. First and foremost, that provides an accurate analysis of where you currently sit as an organisation, indicating what solutions and controls must be implemented to bring you into compliance. Organisations will then be able to track progress being made across their four main areas of focus: Accountability & Governance, Lawfulness & Transparency, Security & Safeguards, and Verification & Assurance.

Yet, GDPR Infinity is more than just a framework and tool for data protection officers. The inclusion of an Audit Manager solution enables the monitoring and tracking of compliance activities so that organisations can demonstrate in a timely manner how they are going about the management of personal data. When the auditors come calling the ability to quickly correlate GDPR activities and accurately document evidence of firm controls from a central repository will be crucial.

By providing essential tools needed by DPOs and their equivalents when going about their daily tasks – and those set out in Article 39 of the GDPR in particular – the solution will save you hundreds of man hours by providing essential strategic management in terms of roles and responsibilities, as well as providing access to Policy Templates, Codes of Practice and Demand Generation of Records as and when required.

Consider the impact GDPR Infinity could have upon the processing of records and data processing agreements (DPAs) for firms needing to sign and create hundreds of these on a monthly basis. Remember, if you enter into a contract where processing of personal data is likely to occur, this applies to you. By using the solution’s own DPA creation tool, called Agreement Manager – and, for internal business processing, Processing Manager – this can all be automated, allowing for the import and export of processing records that can then be directly annexed to the DPA. There are also Subject Access Request (SAR) Manager and Incident Manager to record when data subjects request access to data and if any breaches occur. Furthermore, for those tasks that need to be addressed on a yearly basis, GDPR Infinity provides an annual cycle of key repeatable tasks that must be performed.

Of course, the GDPR represents a fundamental shift in the way firms operate with respect to data privacy management but rather than being viewed as a problem that needs solving we believe it should be a way to further improve the relationship that businesses have with both their data and their customers. Solutions such as GDPR Infinity are designed with the future in mind and allow enterprises to plan far beyond this May’s deadline, giving them solid foundations for ongoing compliance plus data privacy and access excellence.

As AI moves beyond experimentation to deployment, those in the boardroom are already realising the benefits it brings in terms of innovation, automation and more advanced workloads, says Ashley Gatehouse

AI Taking flight within the Enterprise

Much has been made of the arrival of Artificial Intelligence (AI) within the enterprise. In recent months the subject has been much debated everywhere from boardrooms to the World Economic Forum (WEF).

As the British Prime Minister, Theresa May, told those recently gathered in Davos, “We are only at the beginning of what AI could achieve,” explaining that in the UK new AI startups were being created every week.

May went further, echoing moves from within the enterprise by saying she was prepared to “bring AI into government” as part of a wider strategy to ensure the UK was seen as a leader in this area – one reason she intended to press on with plans for the world’s first national advisory body for AI, committing £9m to the cause as the government also announced that the UK would join the WEF council on AI.

As we know, many enterprises are already moving well beyond the experimentation phase to deployment, with recent research from Infosys confirming that AI deployments have already altered the way they operate. Indeed, having asked the question of more than 1,000 CXOs with decision-making power around AI, 90 percent said it had already had an impact on their organisation, with almost three quarters of those questioned stating it had transformed the way they went about their business.

These changes in such a short space of time also suggest that those firms not making provisions or plans for AI within their enterprise strategy would be making a grave mistake. With 80 percent of respondents recognising that AI would have some part to play in shaping their future strategy they indicated that they would almost certainly have to follow suit or be left behind.

And whereas once there was caution and scepticism when AI was mentioned – particularly around jobs and how they would be affected – there has been a general shift in mentality on the subject, with enterprises making moves to harness such technologies to work in parallel with humans to achieve things not possible just a few years back.

Consider the impact artificial intelligence and automation brings to various industries; the barriers it breaks down. Rather than replace, it fosters many new ways of working, bringing in other disciplines and human skillsets to help manage the process. In that respect the innovation and automation it is rapidly ushering into areas such as product design, engineering, predictive modelling and decision making itself, means it will certainly help to create jobs in other areas such as compliance.

Let’s not forget the innovation AI can bring when it comes to optimising insights and things like consumer experience. According to Infosys 80 percent of IT decision makers at organisations in later stages of AI deployment reported that they are using AI to augment existing solutions, or build new business-critical solutions and services to do this. By forcing us to think in new ways AI is helping firms to make rapid gains in both productivity and efficiency when deployed within the enterprise.

At the heart of this, data remains the most precious of assets, but in some cases, for the moment, an obstacle. While the technology has moved swiftly on to allow enterprises to use it in a multitude of new ways, almost half of IT decision makers have reported that they’re not quite there in terms of being able to support the new technologies, and this is forcing them to invest in data management as they bid to harness the benefits it will bring.

However, once there, AI will help to lessen the load for CIOs and IT directors and open up business intelligence (BI) and data analytics to a whole new stream of non-data professionals within firms. In turn, that will allow for new insights when it comes to BI, whilst reducing the reliance on custom-built solutions as AI reduces the barriers between data formats, opening data up for analysis and application alongside machine learning in new and profitable areas such as data discovery, cleansing and curation.

Of course, the next generation of cloud tools are already here for enterprises to start developing new strategies and methods of IT delivery. Witness the application of Microsoft’s Azure ecosystem and how this is helping firms to handle more advanced workloads, such as big data analytics and AI-enabled applications.

Indeed, today’s intelligent cloud and edge solutions allow organisations to harness enterprise-grade AI infrastructure running AI workloads anywhere at scale, something few would have imagined just a few years ago. And that can only be a good thing for those within the enterprise.

Crayon Strengthens Presence in Middle East Following Strategic Contract Win With the Government of Dubai

Oslo, 29 November 2017: Crayon Group Holding ASA (OSE: CRAYON) today announced the signing of a major IT service agreement with the Government of Dubai (RTA) for the provision of Software Asset Management (SAM) services.

The scope of the agreement includes the use of Crayon’s own IP, SAM-iQ and SAM framework and is for an initial contract period of one year. 

Torgrim Takle, Crayon Group CEO commented: “This agreement will help Crayon to materially accelerate our business within the UAE market and manifests Crayon’s position in the Middle East as the leading SAM and cloud economics player.”   

 

Investor Relations Contact:

Magnus Hofshagen, FP&A Director

Email: magnus.hofshagen@crayon.com

Mobile: +47 48 49 91 95

 

About Crayon:

Crayon Group Holding ASA is a global leader in technology and digital transformation services. Crayon is a trusted advisor to many of the world’s leading organisations. Through its unique people, tools and systems Crayon helps optimise client ROI from complex technology investments. With deep experience within volume software licensing optimisation, digital engineering and predictive analytics, Crayon assists their clients through all phases of the digital transformation process. Headquartered in Oslo, Norway, the company has approximately 1,100 employees in 43 offices worldwide.

NOK 300m set aside to fuel expansion and drive growth of world’s largest independent Cloud Economics business

Crayon Listed on the Norwegian Stock Exchange

Today is a proud moment for the whole Crayon team, from our founders who set up the business back in 2002, to each and every one of our 1,100 teammates working across three continents and 21 locations worldwide today.

Our listing on Oslo Børs marks the next step in the evolution of the Crayon business and will enable us to accelerate our focus on driving value for our customers, partners and teammates as we continue our mission to help international industry and commerce optimise investment in complex technology.

It marks a significant achievement for the business that we have reached this point with revenues growing from 2,047 NOKm in 2012, to 6,015 NOKm in 2016. Indeed, during the past five years we grew by a remarkable compound annual growth rate of 31% and now provide services to over 81% of the global addressable IT market. In that time, we more than doubled the number of teammates as NOK 280m (€30m) was invested in new intellectual property and growth.

This significant geographic expansion has been driven both organically and through acquisition and has led to Crayon developing deep relationships with all the major global software publishers. We are one of Microsoft’s top ten global managed partners and last year helped the Redmond firm generate $1Bn USD. In 2017, Crayon also became one of only three global Microsoft GDPR partners, offering data compliance and risk mitigation solutions to businesses as a natural extension to its heritage in Software Asset Management (SAM).

Following intensive investment in its own IP and consultancy business, Crayon now boasts over 60 service areas including IOT and Machine Learning, alongside its established cloud economics practice. What makes us unique in the market is that we underpin all this with our deep understanding of SAM and use that as the intelligent foundation to map out the customer’s digital transformation journey, ensuring that their technology ecosystem remains optimised.

Torgrim Takle, CEO, Crayon, says: “This is a proud moment for all of us associated with the business, one that firmly puts us on the map in terms of visibility, but that also takes us to the next level in terms of our financial flexibility to compete at the highest level in the global marketplace. It really is an exciting day for all of us and I’d like to thank everyone for all their hard work in bringing us to this point.”

Experts in Risk Assessment and Technology Optimisation Planning to drive compliance with GDPR for businesses worldwide

London July 10th – Crayon, the global leader in Risk Assessment and Technology Optimisation Planning (RATOP), has launched GDPR services based on Microsoft technology to help ensure businesses comply with the new European regulations arriving next year.

With the GDPR coming into force on May 25th 2018, businesses of all shapes and sizes handling data on EU residents must comply with new rules on the protection of this data and its privacy across EU member states. However, this is further complicated by the regulations also applying anywhere that EU residents’ personal data is processed or monitored. Should that data become compromised, then businesses must also report any data breaches within 72 hours. Failure to do so could see them being fined €20,000,000 or 4% of their global annual turnover in the preceding financial year, whichever is higher. The GDPR also introduces a statutory basis for the role of data protection officer (DPO).
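The fine ceiling described above, set out in Article 83(5) of the GDPR, is simply the greater of two figures, as this small sketch shows:

```python
# Upper tier of GDPR administrative fines: the greater of EUR 20,000,000
# or 4% of total worldwide annual turnover of the preceding financial year.

def max_gdpr_fine(annual_turnover_eur: float) -> float:
    """Maximum fine under the upper tier (GDPR Article 83(5))."""
    return max(20_000_000, 0.04 * annual_turnover_eur)

# A firm with EUR 1bn turnover faces up to EUR 40m;
# a smaller firm still faces the EUR 20m floor.
print(max_gdpr_fine(1_000_000_000))  # 40000000.0
print(max_gdpr_fine(100_000_000))    # 20000000
```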

To meet these demands, Crayon’s specialist team of GDPR practitioners will advise and assist businesses using Microsoft solutions on their strategy and approach to GDPR compliance via its new GDPR Governance Service, a comprehensive GDPR management and risk mitigation solution. Crayon will also offer managed services and GDPR training for DPOs, for clients and partners alike.

Already trusted by many of the world’s leading organisations as the go-to experts for Software Asset Management (SAM), Crayon sees the move as a natural extension of its business, deepening customer engagement en route to compliance and technology optimisation.

By undertaking a full risk assessment analysis with Crayon and utilising the embedded data governance qualities in Microsoft solutions such as Azure, Office 365 and SQL Server, businesses will be able to address areas of risk in their IT environment, with Crayon’s specialist GDPR team providing the expertise to bring those areas into compliance.

Says Torgrim Takle, CEO, Crayon Group: “By providing businesses with detailed insight into areas of risk and risk mitigation in relation to their IT environments, Crayon’s team of GDPR experts can help make the road to GDPR compliance a smooth journey.

“However, GDPR compliance should be part of an overall governance strategy and not seen as an end in itself. We believe cloud services, such as Azure, can offer a more streamlined way for customers to meet their GDPR compliance obligations.”


If you want to get a head start on the competition, this guide will show you what you need to know to create enterprise-class database applications with the new software.

Getting started with SQL Server 2016

With the release of SQL Server 2016, Microsoft has added a host of new features that make it easier than ever to create enterprise-grade applications. These features help in organising data (structured, semi-structured or even unstructured) and in transforming raw data into actionable knowledge.

With that in mind, let’s start by spinning up a virtual machine running SQL Server 2016. This provides an isolated environment that will allow us to get to grips with how SQL Server 2016 works. It is also quicker and easier than running it on local hardware.

Creating a new SQL Server 2016 Enterprise Virtual Machine 

You can create a new virtual machine running SQL Server on Azure; for more details on how to do that, click on this link. Note that SQL VM images include the licensing costs for SQL Server in the per-minute pricing of the VM you create. As we are just getting started with SQL Server 2016, we should use the SQL Server Developer edition, which is free for development and testing (not production), or SQL Server Express, which is free for lightweight workloads (less than 1GB of memory and less than 10GB of storage).

You can then configure the basic settings: a unique virtual machine name, a username for the local administrator account on the VM and a strong password. You will also need to type a name for a new resource group in the Resource group box; a resource group is a collection of related resources in Azure. Then choose a location for the deployment and click OK.

The purpose of the virtual machine is to experiment with features, not to assess performance characteristics. Once the virtual machine is created, you can connect to it through an RDP session or from your local SQL Server Management Studio client.
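Once connected, a quick way to confirm that the instance is running SQL Server 2016 is to query its server properties (a minimal sketch using the standard SERVERPROPERTY function):

```sql
-- Confirm the version and edition of the connected instance
SELECT
    SERVERPROPERTY('ProductVersion') AS ProductVersion, -- 13.0.x indicates SQL Server 2016
    SERVERPROPERTY('Edition')        AS Edition,        -- e.g. Developer Edition
    SERVERPROPERTY('ProductLevel')   AS ProductLevel;   -- RTM or service pack level
```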

Uploading sample databases

When getting started with SQL Server 2016, you can use the AdventureWorks sample databases. These have new features already implemented, such as Always Encrypted, Row Level Security, Dynamic Data Masking, and so on. These sample databases can be found here. There are two relational databases and a zip file of samples to be downloaded.

You can then use the SQL Server Management Studio client to restore these backup files to the SQL Server instance running in the virtual machine. SQL Server Management Studio can also be used to create databases, tables, functions, views, stored procedures and other database objects, either through the GUI or with T-SQL code.
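If you prefer T-SQL over the GUI, a restore can be scripted along the following lines. This is a sketch only: the file paths and logical file names below are placeholders for your environment, and the actual logical names inside a backup can be confirmed with RESTORE FILELISTONLY.

```sql
-- Inspect the logical file names contained in the backup first
RESTORE FILELISTONLY FROM DISK = N'C:\Backups\AdventureWorks2016.bak';

-- Restore the database, relocating the data and log files;
-- the paths and logical names are placeholders for your environment
RESTORE DATABASE AdventureWorks2016
FROM DISK = N'C:\Backups\AdventureWorks2016.bak'
WITH MOVE N'AdventureWorks2016_Data' TO N'C:\Data\AdventureWorks2016.mdf',
     MOVE N'AdventureWorks2016_Log'  TO N'C:\Data\AdventureWorks2016_log.ldf',
     RECOVERY;
```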

There are also new Wide World Importers samples to upload and experiment with. These can be found here.

Working with sample databases

The AdventureWorks samples contain snippets of the following: Advanced Analytics (R Services); Always Encrypted; Data Masking; In-Memory Analytics; In-Memory OLTP; JSON; PolyBase; Query Store; Row-Level Security; Stretch DB; and Temporal. There is a readme file that points you to documents detailing how each new feature works and how to use it. These documents and the samples are a fantastic way to get to grips with the new features of SQL Server 2016.
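To give a taste of what these features look like, Dynamic Data Masking can be applied to an existing column with a single DDL statement. The table and column names here are hypothetical, not taken from the samples:

```sql
-- Obfuscate email addresses for users who lack the UNMASK permission;
-- dbo.Customers and its Email column are illustrative names only
ALTER TABLE dbo.Customers
    ALTER COLUMN Email ADD MASKED WITH (FUNCTION = 'email()');
```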

These databases are useful for testing new functionality, including: Query Store, which keeps track of query performance; temporal tables, which keep track of the history of reference data; JSON support, which enables AJAX calls against some of the key tables; and In-Memory OLTP, which optimises the performance of table-valued parameters (TVPs) and the consumption of sensor data, to name but a few.
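For example, Query Store is switched on per database, and JSON output is requested with a FOR JSON clause. The sketch below assumes the AdventureWorks sample has been restored under the name AdventureWorks2016; adjust the database name to match your restore:

```sql
-- Turn on Query Store so query plans and runtime statistics are captured
ALTER DATABASE AdventureWorks2016 SET QUERY_STORE = ON;

-- Return a handful of rows as JSON, ready for an AJAX consumer
SELECT TOP (3) BusinessEntityID, FirstName, LastName
FROM Person.Person
FOR JSON AUTO;
```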

Once you have read through the documents and tried out the features, remember to stop (deallocate) the virtual machine you are working on to avoid incurring additional charges.

As a global cloud and software expert, Crayon works with many of the world’s leading organisations to optimise their investment in complex technology. For example, after a 15-year absence, we’re helping Burger King re-establish itself in France by providing the firm with a technology optimisation strategy to support the rapid roll out of over 600 restaurants across the country. To read more about this or other Crayon projects click here or call one of our experts today.  


We’re very excited to announce that Jan Rylund is now an official Speaker at Inspire 2017 – the Microsoft World Wide Partner Conference in Washington.

On 12th July, Jan will present his thoughts on who we are, our strengths in the international market and our strong position with Microsoft.

Find out how we’re working with customers to form a deeper understanding of Artificial Intelligence and advanced analytics.

You will also be introduced to our project based on homomorphic encryption, which we are working on with the Norwegian Cancer Registry in co-operation with Microsoft Research. The session will be led by Tom Lawry, Director of Health Analytics at Microsoft, who will interview Jan.

Looking forward to telling our story!

Register your attendance here

For further info on Inspire, take a look here