
Getting started with Azure App Services Development


In this post, Application Development Manager, Vijetha Marinagammanavar, demonstrates how to get started with Azure App Services.


To get started with Azure development, we need Visual Studio 2013 or later, the Azure SDK, and an active Azure subscription. We are using Visual Studio 2017 for this demo.

If you are using VS2013, download the SDK from https://azure.microsoft.com/en-us/downloads/



Figure 1 Download Azure SDK


Follow the instructions below to deploy a web application to Azure by creating an App Service plan.

Azure App Service is a fully managed "Platform as a Service" (PaaS) that integrates Microsoft Azure Websites, Mobile Services, and BizTalk Services into a single service, adding new capabilities that enable integration with on-premises or cloud systems.


1. Open VS2017

Note: The same steps apply to earlier versions of Visual Studio.


2. Create a new Web Application in Visual Studio by following the path

Click on File –> New –> Project –> Templates –> Visual C# –> Web


Figure 2 Create new Web Application in Visual Studio


3. Create an MVC application with authentication set to Individual User Accounts. This authentication type will allow us to register users and maintain their profiles in a SQL Server database.


Figure 3 MVC project with Authentication set to Individual User


4. After it is created, the project looks like the following:


Figure 4 MVC Web Application


5. Now let’s build the project.

Build: Click on Build then Build Solution


Figure 5 Build the project


6. The next step is to publish the application to Azure using an active subscription. Right-click the project in Solution Explorer and select the Publish option.


Figure 6 Publishing MVC project from Visual Studio 2017


7. Now choose Microsoft Azure App Service and hit Publish. Then add the Microsoft account that holds your active Azure subscription.


Figure 7 Create Microsoft Azure App Service Plan using VS2017


8. Now name your web application and choose the active subscription.

Create a new resource group if needed. The same applies to the App Service plan: you can create a new one if none exists. In this case, you will create a new plan; choose the location that is nearest to you, and check http://azurespeedtest.azurewebsites.net/ to compare the response times of the different data centers and find the best location for you.

An App Service plan is the container for your app. The App Service plan settings will determine the location, features, cost and compute resources associated with your app.

After a short time, it will automatically take you to the website you have created.


Figure 8 Web App is created using Visual Studio 2017


9. Now it is time to publish our application to the server.

Change the home page view, which is under Views –> Home –> Index.cshtml, and make some changes to this page.

Note: I have changed the text inside the jumbotron CSS class; a sample of such a change is shown after the figure below.


Figure 9 Change the Index.cshtml in the MVC project
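
For example, the jumbotron section of the default template's Index.cshtml could be edited along these lines (the replacement text here is purely illustrative):

<div class="jumbotron">
    <h1>My First Azure App Service</h1>
    <p class="lead">This page was published to Azure App Service from Visual Studio 2017.</p>
</div>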


10. Right click on the Project and hit Publish.


Figure 10.a Publishing Profile connected to Azure


Hit the Publish button once you reach the above screen. It may take some time to upload the files the first time, depending on your internet connection's upload speed. Once the upload completes, we can see that our change is reflected.


Figure 10.b Publishing Change using Visual Studio 2017 to Azure App Service

You have successfully created a new App Service plan using Visual Studio and the Azure SDK, and published your very first change to Azure App Service.


Premier Support for Developers provides strategic technology guidance, critical support coverage, and a range of essential services to help teams optimize development lifecycles and improve software quality.  Contact your Application Development Manager (ADM) or email us to learn more about what we can do for you.


Export Azure SQL Database to local path


We noticed a few requests coming into our support queue asking whether it is feasible to export an Azure SQL Database directly to a local path. The steps below build a PowerShell script that does that job: copy the Azure SQL database to another database for consistency, export the copy to blob storage, then connect to a single (or all) storage container and download the blob files locally.

First you need to save your Azure login credential so that the saved profile can be used later to automate login to your Azure subscription. To do so, please follow the steps below:

  1. Open Windows PowerShell ISE
  2. Copy/paste the commands below:
# Setup - First log in manually, per the previous section
Add-AzureRmAccount

# Now save your context locally (-Force will overwrite if the file is already there)
$path = "C:\AzurePS\ProfileContext1.ctx"
Save-AzureRmContext -Path $path -Force

# Once the above two steps are done, you can simply import the saved context
$path = 'C:\AzurePS\ProfileContext1.ctx'
Import-AzureRmContext -Path $path

 

 

A new window opens asking for the username and password to log in to the Azure subscription.

 

 

Once you are authenticated, your Azure subscription information is listed and saved as shown below.

 

 

To verify, navigate to the local path; you will find the ProfileContext.ctx file created as shown below.

 

  1. Now that the ProfileContext data is saved locally, copy/paste the PowerShell script below into a new Notepad file and save it as CopyFilesFromAzureStorageContainer.ps1.
  2. Note that the placeholder values (highlighted in the original post) need to be filled in before executing this PowerShell script manually, as in the example below.

PS C:\bacpac> .\CopyFilesFromAzureStorageContainer.ps1 -ResourceGroupName $ResourceGroupName -ServerName $ServerName -DatabaseName $DatabaseName -CopyDatabaseName $CopyDatabaseName -LocalPath $LocalPath -StorageAccountName $StorageAccountName -ContainerName $ContainerName

 

 

<#
.SYNOPSIS
    Export Azure SQL Database to Blob storage and download the exported *.bacpac file from blob to local path
.DESCRIPTION
    This PowerShell script exports an Azure SQL DB to blob storage and then copies blobs from a single storage container to a local directory. 
   
    The script supports the -Whatif switch so you can quickly see how complex the copy
    operation would be.

.EXAMPLE

    .\CopyFilesFromAzureStorageContainer.ps1 -LocalPath "c:\users\myUserName\documents" `
        -ServerName "myservername" -DatabaseName "myDBname" -ResourceGroupName "myresourcegroupname" -StorageAccountName "mystorageaccount" -ContainerName "myuserdocuments" -Force
#>
[CmdletBinding(SupportsShouldProcess = $true)]
param(
    # The destination path to copy files to.
    [Parameter(Mandatory = $true)]
    [string]$LocalPath,

    # The name of the SQL Server to connect to.
    [Parameter(Mandatory = $true)]
    [string]$ServerName,

    # The name of the SQL database to export.
    [Parameter(Mandatory = $true)]
    [string]$DatabaseName,

    # The name of the resource group that contains the SQL Server, SQL database, and storage account.
    [Parameter(Mandatory = $true)]
    [string]$ResourceGroupName,

    # The name of the storage account to copy files from.  
    [Parameter(Mandatory = $true)]
    [string]$StorageAccountName,

    # The name of the database copy that will be created and exported.
    [Parameter(Mandatory = $true)]
    [string]$CopyDatabaseName,

    # The name of the storage container to copy files from.  
    [Parameter(Mandatory = $true)]
    [string]$ContainerName
)
        # Login to Azure subscription
        $path = 'C:\AzurePS\ProfileContext.ctx' 
        Import-AzureRmContext -Path $path
        
        # $DatabaseName = "DBName"
        # $ServerName = "ServerName"
        # $ResourceGroupName = "ResourceGroupName"
        # $StorageAccountName = "StorageAccountName"
        # $ContainerName = "StorageContainerName"
        # $LocalPath = "C:\LocalPath"
        
        
        # Create a credential
        $ServerAdmin = "serverlogin"
        $Password = ConvertTo-SecureString -String 'password' -AsPlainText -Force
        $Credential = New-Object -TypeName System.Management.Automation.PSCredential `
        -ArgumentList $ServerAdmin, $Password
        

        # Generate a unique filename for the BACPAC
        $bacpacFilename = "$DatabaseName" + (Get-Date).ToString("yyyy-MM-dd-HH-mm") + ".bacpac"


        # Blob storage information
        $StorageKey = "YOUR STORAGE KEY"
        $BaseStorageUri = "https://STORAGE-NAME.blob.core.windows.net/BLOB-CONTAINER-NAME/"
        $BacPacUri = $BaseStorageUri + $bacpacFilename
        New-AzureRmSqlDatabaseCopy -ResourceGroupName $ResourceGroupName -ServerName $ServerName -DatabaseName $DatabaseName -CopyResourceGroupName $ResourceGroupName -CopyServerName $ServerName -CopyDatabaseName $CopyDatabaseName
        
        Write-Output "Azure SQL DB $CopyDatabaseName Copy completed"

        # Create an export request (export the copy so the exported data is consistent)
        $Request = New-AzureRmSqlDatabaseExport -ResourceGroupName $ResourceGroupName -ServerName $ServerName `
        -DatabaseName $CopyDatabaseName -StorageKeytype StorageAccessKey -StorageKey $StorageKey `
        -StorageUri $BacPacUri -AdministratorLogin $Credential.UserName `
        -AdministratorLoginPassword $Credential.Password


        # Check status of the export
        $exportStatus = Get-AzureRmSqlDatabaseImportExportStatus -OperationStatusLink $Request.OperationStatusLink
        [Console]::Write("Exporting")
        while ($exportStatus.Status -eq "InProgress")
        {
        $exportStatus = Get-AzureRmSqlDatabaseImportExportStatus -OperationStatusLink $Request.OperationStatusLink
        Start-Sleep -s 10
        }
        $exportStatus
        $Status= $exportStatus.Status
        if($Status -eq "Succeeded")
        {
            Write-Output "Azure SQL DB Export $Status for $DatabaseName"
        }
        else
        {
            Write-Output "Azure SQL DB Export Failed for $DatabaseName"
        }
            

        # Download file from azure
        Write-Output "Downloading"
        $StorageContext = Get-AzureRmStorageAccount -Name $StorageAccountName -ResourceGroupName $ResourceGroupName 
        $StorageContext | Get-AzureStorageBlob -Container $ContainerName -blob $bacpacFilename | Get-AzureStorageBlobContent -Destination $LocalPath
        $Status= $exportStatus.Status
        if($Status -eq "Succeeded")
        {
            Write-Output "Blob $bacpacFilename Download $Status for $DatabaseName To $LocalPath"
        }
        else
        {
            Write-Output "Blob $bacpacFilename Download Failed for $DatabaseName"
        }

        # Drop Copy Database after successful export
        Remove-AzureRmSqlDatabase -ResourceGroupName $ResourceGroupName `
        -ServerName $ServerName `
        -DatabaseName $CopyDatabaseName `
        -Force
         
        Write-Output "Azure SQL DB $CopyDatabaseName Deleted"

The above script can be saved and triggered manually. To automate the process and set up a scheduled task, we can use Windows Task Scheduler to run the PowerShell script on a schedule of your preference. To do so, please follow the steps below:

  1. We now need to automate this PowerShell script and run it using Windows Task Scheduler.
  2. Copy/paste the PowerShell script below into a new Notepad file and save it as CopyFilesFromAzureStorageContainerV2.ps1.
  3. The placeholder values (highlighted in the original post) need to be updated. To run this PowerShell script automatically, follow these steps:
    1. Create a new scheduled task.
    2. In the Actions tab, point the action to the powershell.exe local path: C:\Windows\System32\WindowsPowerShell\v1.0\powershell.exe.
    3. In "Add arguments (optional)", copy/paste the full local path to your PowerShell script: C:\bacpac\CopyFilesFromAzureStorageContainerV2.ps1.
    4. Add a trigger (schedule) according to your needs and then save the task. A PowerShell alternative to these GUI steps is sketched after the script below.

<#
.SYNOPSIS
    Export Azure SQL Database to Blob storage and download the exported *.bacpac file from blob to local path
.DESCRIPTION
    This PowerShell script exports an Azure SQL DB to blob storage and then copies blobs from a single storage container to a local directory. 
   
    The script supports the -Whatif switch so you can quickly see how complex the copy
    operation would be.

.EXAMPLE

    .\CopyFilesFromAzureStorageContainer.ps1 -LocalPath "c:\users\myUserName\documents" `
        -ServerName "myservername" -DatabaseName "myDBname" -ResourceGroupName "myresourcegroupname" -StorageAccountName "mystorageaccount" -ContainerName "myuserdocuments" -Force
#>

        # Login to Azure subscription
        $path = 'C:\AzurePS\ProfileContext.ctx' 
        Import-AzureRmContext -Path $path
        
        $DatabaseName = " hidden"
        $CopyDatabaseName = $DatabaseName + "_Copy"
        $ServerName = "hidden"
        $ResourceGroupName = "hidden"
        $StorageAccountName = "hidden"
        $ContainerName = "bacpac"
        $LocalPath = "C:\localpath"
        
        
        # Create a credential
        $ServerAdmin = "serverlogin"
        $Password = ConvertTo-SecureString -String 'password' -AsPlainText -Force
        $Credential = New-Object -TypeName System.Management.Automation.PSCredential `
        -ArgumentList $ServerAdmin, $Password
        

        # Generate a unique filename for the BACPAC
        $bacpacFilename = "$DatabaseName" + (Get-Date).ToString("yyyy-MM-dd-HH-mm") + ".bacpac"


        # Blob storage information
        $StorageKey = "YOUR STORAGE KEY"
        $BaseStorageUri = "https://StorageAccountName.blob.core.windows.net/ContainerName/"
        $BacPacUri = $BaseStorageUri + $bacpacFilename
        New-AzureRmSqlDatabaseCopy -ResourceGroupName $ResourceGroupName -ServerName $ServerName -DatabaseName $DatabaseName -CopyResourceGroupName $ResourceGroupName -CopyServerName $ServerName -CopyDatabaseName $CopyDatabaseName
        
        Write-Output "Azure SQL DB $CopyDatabaseName Copy completed"

        # Create a request
        $Request = New-AzureRmSqlDatabaseExport -ResourceGroupName $ResourceGroupName -ServerName $ServerName `
        -DatabaseName $CopyDatabaseName -StorageKeytype StorageAccessKey -StorageKey $StorageKey `
        -StorageUri $BacPacUri -AdministratorLogin $Credential.UserName `
        -AdministratorLoginPassword $Credential.Password


        # Check status of the export
        $exportStatus = Get-AzureRmSqlDatabaseImportExportStatus -OperationStatusLink $Request.OperationStatusLink
        [Console]::Write("Exporting")
        while ($exportStatus.Status -eq "InProgress")
        {
        $exportStatus = Get-AzureRmSqlDatabaseImportExportStatus -OperationStatusLink $Request.OperationStatusLink
        Start-Sleep -s 10
        }
        $exportStatus
        $Status= $exportStatus.Status
        if($Status -eq "Succeeded")
        {
            Write-Output "Azure SQL DB Export $Status for $DatabaseName"
        }
        else
        {
            Write-Output "Azure SQL DB Export Failed for $DatabaseName"
        }

               

        # Download file from azure
        Write-Output "Downloading"
        $StorageContext = Get-AzureRmStorageAccount -Name $StorageAccountName -ResourceGroupName $ResourceGroupName 
        $StorageContext | Get-AzureStorageBlob -Container $ContainerName -blob $bacpacFilename | Get-AzureStorageBlobContent -Destination $LocalPath
        $Status= $exportStatus.Status
        if($Status -eq "Succeeded")
        {
            Write-Output "Blob $bacpacFilename Download $Status for $DatabaseName To $LocalPath"
        }
        else
        {
            Write-Output "Blob $bacpacFilename Download Failed for $DatabaseName"
        }

        # Drop Copy Database after successful export
        Remove-AzureRmSqlDatabase -ResourceGroupName $ResourceGroupName `
        -ServerName $ServerName `
        -DatabaseName $CopyDatabaseName `
        -Force
         
        Write-Output "Azure SQL DB $CopyDatabaseName Deleted"

We hope you enjoyed this article. We appreciate your comments and feedback!

C# 7 Series, Part 10: Span and universal memory management


Part 1: Value Tuples
Part 2: Async Main
Part 3: Default Literals
Part 4: Discards
Part 5: Private Protected
Part 6: Read-only structs
Part 7: Ref Returns
Part 8: “in” Parameters
Part 9: ref structs
Part 10: (This post) Span<T> and universal memory management

Background

.NET is a managed platform, which means that memory access and management are safe and automatic. All types are fully managed by .NET; it allocates memory either on the execution stacks or on the managed heaps.

For interop or low-level development, you may want access to native objects and system memory. This is where the interop support comes in: there are types that can marshal into the native world, invoke native APIs, convert between managed and native types, and define native structures from managed code.

Problem 1: Memory access patterns

In the .NET world, there are three types of memory you may be interested in:

  • Managed heap memory, such as an array;
  • Stack memory, such as objects created by stackalloc;
  • Native memory, such as a native pointer reference.

Each type of memory access may need to use language features that are designed for it:

  • To access heap memory, use the fixed (pinned) pointer on supported types (like string), or use other appropriate .NET types that have access to it, such as an array or a buffer;
  • To access stack memory, use pointers with stackalloc;
  • To access unmanaged system memory, use pointers with Marshal APIs.

As you can see, each access pattern needs different code; there is no single built-in type for all contiguous memory access.
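
As a minimal sketch (not from the original post) of what these three patterns look like side by side, the following fragment must be compiled with unsafe code enabled:

using System;
using System.Runtime.InteropServices;

internal static class MemoryAccessDemo
{
    internal static unsafe void Demo()
    {
        // 1. Managed heap memory: pin a string and read it through a char*.
        string text = "heap";
        fixed (char* p = text)
        {
            char first = p[0];
        }

        // 2. Stack memory: allocate a small buffer with stackalloc.
        byte* stackBuffer = stackalloc byte[16];
        stackBuffer[0] = 1;

        // 3. Native memory: allocate unmanaged memory through the Marshal APIs.
        IntPtr native = Marshal.AllocHGlobal(16);
        byte* nativeBuffer = (byte*)native.ToPointer();
        nativeBuffer[0] = 1;
        Marshal.FreeHGlobal(native);
    }
}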

Problem 2: Performance

In many applications, the most CPU-consuming operations are string operations. If you run a profiler session against your application, you may find that 95% of the CPU time is spent in string and related functions.

Trim, IsNullOrWhiteSpace, and SubString may be the most frequently used string APIs, and they are also very heavy:

  • Trim() or SubString() returns a new string object that is a copy of part of the original string; this copy is unnecessary if there is a way to slice and return a portion of the original string.
  • IsNullOrWhiteSpace() takes a string object, which may require a memory copy (because string is immutable).
  • String concatenation in particular is expensive: it takes n string objects, makes n copies, generates n - 1 temporary string objects, and returns a final string object. The n - 1 copies could be eliminated if there were a way to get direct access to the result string's memory and perform sequential writes. The sketch below illustrates the allocation difference between Substring and span slicing.
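
As a rough illustration (not from the original post), Substring allocates a new string, while slicing a span over the same characters only creates a small (reference, length) struct:

string input = "  hello  ";

// Substring allocates a brand new string object on the managed heap.
string copy = input.Substring(2, 5);

// AsSpan re-uses the existing characters; no new string is allocated.
ReadOnlySpan<char> slice = input.AsSpan(2, 5);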

Span<T>

System.Span<T> is a stack-only type (ref struct) that wraps all memory access patterns; it is the type for universal contiguous memory access. You can think of the implementation of Span<T> as containing a reference and a length, and it accepts all three memory access types.

You can create a Span<T> using its constructor overloads or implicit operators from array, stackalloc’d pointers and unmanaged pointers.

// Use implicit operator Span<char>(char[]).
Span<char> span1 = new char[] { 's', 'p', 'a', 'n' };

// Use stackalloc.
Span<byte> span2 = stackalloc byte[50];

// Use constructor (requires an unsafe context and a valid native pointer,
// e.g. memory allocated with System.Runtime.InteropServices.Marshal.AllocHGlobal).
IntPtr array = Marshal.AllocHGlobal(sizeof(int));
Span<int> span3 = new Span<int>(array.ToPointer(), 1);

Once you have a Span<T> object, you can set a value at a specified index, or return a portion of the span:

// Create an instance.
Span<char> span = new char[] { 's', 'p', 'a', 'n' };
// Access the reference of the first element.
ref char first = ref span[0];
// Assign the reference with a new value.
first = 'S';
// You get "Span".
Console.WriteLine(span.ToArray());
// Return a new span with start index = 1 and end index = span.Length - 1.
// You get "pan".
Span<char> span2 = span.Slice(1);
Console.WriteLine(span2.ToArray());

You can then use the Slice() method to write a high performance Trim() method:

private static void Main(string[] args)
{
    string test = "   Hello, World! ";
    Console.WriteLine(Trim(test.ToCharArray()).ToArray());
}

private static Span<char> Trim(Span<char> source)
{
    if (source.IsEmpty)
    {
        return source;
    }

    int start = 0, end = source.Length - 1;
    char startChar = source[start], endChar = source[end];

    while ((start < end) && (startChar == ' ' || endChar == ' '))
    {
        if (startChar == ' ')
        {
            start++;
        }

        if (endChar == ' ')
        {
            end--;
        }

        startChar = source[start];
        endChar = source[end];
    }

    return source.Slice(start, end - start + 1);
}

The above code does not copy strings or generate new strings; it returns a portion of the original buffer by calling the Slice() method.

Because Span<T> is a ref struct, all ref struct restrictions apply; i.e., you cannot use Span<T> in fields, properties, iterators, or async methods.

Memory<T>

System.Memory<T> is a wrapper that exposes memory as a System.Span<T> while remaining usable in iterator and async methods. Use the Span property on the Memory<T> to access the underlying memory; this is extremely helpful in asynchronous scenarios such as file streams and network communication (HttpClient, etc.).

The following code shows simple usage of this type.

private static async Task Main(string[] args)
{
    Memory<byte> memory = new Memory<byte>(new byte[50]);
    int count = await ReadFromUrlAsync("https://www.microsoft.com", memory).ConfigureAwait(false);
    Console.WriteLine("Bytes written: {0}", count);
}

private static async ValueTask<int> ReadFromUrlAsync(string url, Memory<byte> memory)
{
    using (HttpClient client = new HttpClient())
    {
        Stream stream = await client.GetStreamAsync(new Uri(url)).ConfigureAwait(false);
        return await stream.ReadAsync(memory).ConfigureAwait(false);
    }
}

The Framework Class Library/Core Framework (FCL/CoreFx) will add APIs based on the span-like types for Streams, strings and more in .NET Core 2.1.

ReadOnlySpan<T> and ReadOnlyMemory<T>

System.ReadOnlySpan<T> is the read-only version of the System.Span<T> struct, where the indexer returns a readonly ref instead of a ref. You get read-only memory access when using the System.ReadOnlySpan<T> readonly ref struct.

This is useful for the string type: because string is immutable, it can be treated as a read-only span.

We can rewrite the above code to implement the Trim() method using ReadOnlySpan<T>:

private static void Main(string[] args)
{
    // Implicit operator ReadOnlySpan(string).
    ReadOnlySpan<char> test = "   Hello, World! ";
    Console.WriteLine(Trim(test).ToArray());
}

private static ReadOnlySpan<char> Trim(ReadOnlySpan<char> source)
{
    if (source.IsEmpty)
    {
        return source;
    }

    int start = 0, end = source.Length - 1;
    char startChar = source[start], endChar = source[end];

    while ((start < end) && (startChar == ' ' || endChar == ' '))
    {
        if (startChar == ' ')
        {
            start++;
        }

        if (endChar == ' ')
        {
            end--;
        }

        startChar = source[start];
        endChar = source[end];
    }

    return source.Slice(start, end - start + 1);
}

As you can see, nothing changed in the method body; I just changed the parameter type from Span<T> to ReadOnlySpan<T> and used the implicit operator to convert a string to ReadOnlySpan<char>.

System.ReadOnlyMemory<T> is the read-only version of the System.Memory<T> struct, where the Span property is a ReadOnlySpan<T>. When using this type, you get read-only access to the memory, and you can use it in an iterator method or async method, as the sketch below shows.
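
A minimal sketch (not from the original post, usings omitted like the other snippets) of a hypothetical SplitLines helper that yields slices of a ReadOnlyMemory<char> from an iterator method; the equivalent code with ReadOnlySpan<char> parameters would not compile:

private static IEnumerable<ReadOnlyMemory<char>> SplitLines(ReadOnlyMemory<char> text)
{
    while (!text.IsEmpty)
    {
        // Spans may be used inside the body; only the parameter and return types
        // must be Memory-based so they can survive across yield boundaries.
        int newLine = text.Span.IndexOf('\n');
        if (newLine < 0)
        {
            yield return text;
            yield break;
        }

        yield return text.Slice(0, newLine);
        text = text.Slice(newLine + 1);
    }
}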

Memory Extensions

The System.MemoryExtensions class contains extension methods for various types that work with the span types. Here is a list of commonly used extension methods; many of them are span-based equivalents of existing APIs. A short usage sketch follows the list.

  • AsSpan, AsMemory: Convert arrays into Span<T> or Memory<T> or their read-only counterparts.
  • BinarySearch, IndexOf, LastIndexOf: Search elements and indexes.
  • IsWhiteSpace, Trim, TrimStart, TrimEnd, ToUpper, ToUpperInvariant, ToLower, ToLowerInvariant: Span<char> operations similar to string.
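
For example, a minimal sketch (not from the original post) combining AsSpan, Trim and IndexOf:

// AsSpan wraps the string's characters without copying them.
ReadOnlySpan<char> text = "  Span  ".AsSpan();

// Trim removes the leading/trailing whitespace without allocating a new string.
ReadOnlySpan<char> trimmed = text.Trim();

// IndexOf searches within the span.
int index = trimmed.IndexOf('p');       // 1

// You get "Span".
Console.WriteLine(trimmed.ToString());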

Memory Marshal

In some cases, you may want lower-level access to the memory types and system buffers, and to convert between spans and read-only spans. The System.Runtime.InteropServices.MemoryMarshal static class provides such functionality and lets you control these access scenarios. The following code shows how to title-case a string using the span types; this is highly performant because there are no temporary string allocations.

private static void Main(string[] args)
{
    string source = "span like types are awesome!";
    // source.AsMemory() converts source from string to ReadOnlyMemory<char>,
    // and MemoryMarshal.AsMemory converts ReadOnlyMemory<char> to Memory<char>
    // so you can modify the elements.
    TitleCase(MemoryMarshal.AsMemory(source.AsMemory()));
    // You get "Span like types are awesome!";
    Console.WriteLine(source);
}

private static void TitleCase(Memory<char> memory)
{
    if (memory.IsEmpty)
    {
        return;
    }

    ref char first = ref memory.Span[0];
    if (first >= 'a' && first <= 'z')
    {
        first = (char)(first - 32);
    }
}

Conclusion

Span<T> and Memory<T> enable a uniform way to access contiguous memory, regardless of how the memory is allocated. They are very helpful for native development scenarios as well as high-performance scenarios; in particular, you can gain significant performance improvements by using the span types to work with strings. It is a very nice feature introduced with C# 7.2.

NOTE: To use this feature, you will need Visual Studio 2017 version 15.5 or later and language version 7.2 or later.

Episode 4: IT IQ Series – AI will soon power students and schools’ growth. Why?


Summary: AI will not only prove essential as part of students’ learning experiences, but also play a major role in how efficiently Australia’s education sector grows.

Will artificial intelligence (AI) make students smarter? AI has already made its way onto the agenda of Australia’s education policymakers, who’ve acknowledged that time is of the essence if we’re to adopt AI in our classrooms. Unsurprisingly, the Australian education sector is being told to urgently adapt and ‘reconceive schooling’ to ensure that the future workforce is equipped to function in a world where AI plays a much greater role than even now.

That role will include powering the growth and expansion of education institutions themselves. Many institutions already adopt some form of cloud-based AI platform, most notably when automating basic processes. But AI in itself is a wide-ranging field, covering everything from bots (automated software-based robots) to machine learning and cognitive services like facial and language recognition.

Enhancing the learning experience

“We’ve been really encouraged at the extent to which the education industry, from primary all the way through tertiary education, have embraced AI and robotics,” says Mark Tigwell, an Azure technologist at Microsoft. “To no-one’s surprise, we’re seeing rapid uptake in the area of STEM learning where primary school students are building robots from robot kits and coding them to respond automatically to their surroundings. Technology is enhancing teaching and students are certainly being equipped with skills for the future.”

As one example, Tigwell points to growing demand for resources like Microsoft Imagine, a repository of tutorials and guides for coding, design, and building apps for the cloud. Imagine’s resources are inherently practical–directing students and teachers to think creatively and analyse situations–which, according to Tigwell, appeals to the direction which educators are taking towards STEM education.

“Theoretical resources may prove useful in later years, but that practical aspect piques the interest and passion of younger students, which is exactly what we need to raise STEM’s profile in our schools,” says Tigwell. “When you make AI and robotics into platforms for digital creativity, you’re not only attracting students but giving them essential skills for their future careers, whether they end up working in a STEM-centric field or not.”

AI’s impact, however, goes well beyond the syllabus. Some schools have already begun using the technology to monitor schoolyards for bullying, while emotion and speech recognition can also help teachers better engage with long-distance students.

“Being able to automatically recognise and trigger a response to emotional cues is incredibly valuable intelligence for teachers, whether you’re dealing with angry gestures in the playground or signs of boredom in your remote-learning students,” says Tigwell. “You’re not replacing the teacher or spoon-feeding the student, but you’re making their tasks that little bit less onerous to handle. And the effects, from averting schoolyard violence with automated alerts to studying better with speech-to-text transcription of classes, ultimately improve everyone’s learning experience in ways that even the best teachers can’t do at scale.”

“We currently have more than two dozen cognitive services in the Azure suite, but the myriad applications for them in learning environments alone are pushing us to go even further in development.”

Smarter schools for smarter students

Educators are also eyeing AI for another reason: it can make scaling up operations much, much easier. “AI can substantially reduce costs and improve the way we run our schools and universities,” says Tigwell. “Whether it’s monitoring the use of electricity to developing virtual assistants who can answer questions on the school website, you’re cutting out inefficiency at all levels of management.”

Automating administrative tasks also frees teachers’ time, letting them produce new content or research that yields substantial dividends in the long run. “There’s always a process that is causing pain in your daily workflow,” says Tigwell. “Let’s say, for example, you constantly receive the same simple questions from most of your students. A bot that automatically displays FAQs based on students’ questions can save huge amounts of time, and new tools like the Azure Bot Framework make doing so far less technical a process than previously.”

That, in turn, positions educational institutions for faster growth, both locally and overseas. Japan’s Hokkaido University, for example, transformed its e-learning platform with Azure by automating the process of transcribing, translating, captioning and encoding its lessons. The result was course preparation time being slashed from two weeks to two hours while attracting more students from outside Japan. And with Australia’s higher-education exports hitting record levels in 2016, similar AI-based models could help local universities and colleges take their reputation further with less risks to quality-control.

“We’ve only scratched the surface of what we can do with AI on a technical level, but even at its current stage it offers educators enough to transform both classrooms and entire schools,” says Tigwell. “These intelligent algorithms enable educational institutions to form real-time collaborations, reduce operational costs, and support worldwide scalability all at the same time. You could say they’re the most intelligent choice for schools looking to grow–both in quantity and quality.”

Watch Mark Tigwell answer your questions about AI on Microsoft Azure, and how it can help alleviate the challenges of Australia’s schools, on our YouTube channel.

Get started on Microsoft Azure, and learn how the Microsoft Imagine Academy and Microsoft Virtual Academy can help your students prepare for their future. Learn more about Microsoft Azure’s security policies here.

 

Our mission at Microsoft is to equip and empower educators to shape and assure the success of every student. Any teacher can join our community and effort with free Office 365 Education, find affordable Windows devices and connect with others on the Educator Community for free training and classroom resources. Follow us on Facebook and Twitter for our latest updates.

 

 

 

Enabling “Transfer-encoding: chunked” in the response header with IIS


While assisting a customer with configuring ARR (Application Request Routing), we found that chunked transfer encoding was not working properly. After a bit of troubleshooting, we were able to fix it. Here is a summary of the troubleshooting session.

Quick Note: "Chunked" is a type of transfer encoding by which the message body is transmitted to the client as chunks that are stamped with the size of the chunks (see section 14.40 of RFC 2068). With chunked transfer encoding, the client can make sure that it has received all of the data that the server sends. Chunked transfer encoding is similar to MIME encoding in relation to Internet mail (see RFC 822). The specific differences between MIME encoding and chunked transfer encoding are discussed in section 19.4 of RFC 2068.

To enable chunked transfer encoding, set the value for AspEnableChunkedEncoding to True for the site, the server, or the virtual directory that you want to enable chunked transfer encoding for:

  • Open a command prompt.
  • Change to the Inetpub\Adminscripts folder.
  • Run the following:
    cscript adsutil.vbs set /W3SVC/AspEnableChunkedEncoding "TRUE"
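
Note: adsutil.vbs works against the IIS 6 metabase (or IIS 7+ with metabase compatibility installed). On IIS 7 and later, the same setting is exposed as the enableChunkedEncoding attribute of the system.webServer/asp configuration section; as far as I'm aware it can be set with appcmd, for example:

    %windir%\system32\inetsrv\appcmd.exe set config -section:system.webServer/asp /enableChunkedEncoding:"True" /commit:apphost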

Looking inside a .NET core process using windbg


In this blog post, we are going to look inside a .NET Core process using windbg. The .NET Core process footprint is kept as minimal as possible, so we are going to look at a simple .NET Core console app (the simplest console app with the bare minimum of managed code) while it runs and check:

  • How many .NET objects it needs to run a console application
  • Which .NET dlls (managed) are loaded for a simple console application
  • What threads are running (managed, and unmanaged or native)
  • What the call stack looks like for .NET threads
  • How to look for a particular .NET object and dump its details

First, let's start a simple .NET Core console application. I already have the .NET Core SDK installed. Now create a simple console app and run it:

D:\PROJECTS\dotnet>dotnet new console
The template "Console Application" was created successfully.

Processing post-creation actions...
Running 'dotnet restore' on D:\PROJECTS\dotnet\dotnet.csproj...
 Restoring packages for D:\PROJECTS\dotnet\dotnet.csproj...
 Generating MSBuild file D:\PROJECTS\dotnet\obj\dotnet.csproj.nuget.g.props.
 Generating MSBuild file D:\PROJECTS\dotnet\obj\dotnet.csproj.nuget.g.targets.
 Restore completed in 704.29 ms for D:\PROJECTS\dotnet\dotnet.csproj.

Restore succeeded.

After this, we can do a build and run:

D:\PROJECTS\dotnet>dotnet build
Microsoft (R) Build Engine version 15.5.180.51428 for .NET Core
Copyright (C) Microsoft Corporation. All rights reserved.

Restore completed in 57.84 ms for D:\PROJECTS\dotnet\dotnet.csproj.
 dotnet -> D:\PROJECTS\dotnet\bin\Debug\netcoreapp2.0\dotnet.dll

Build succeeded.
 0 Warning(s)
 0 Error(s)

Time Elapsed 00:00:06.06

Finally, run the app:

D:\PROJECTS\dotnet>dotnet run
Hello World!

We are going to use windbg to do this; you can install Debugging Tools for Windows to get windbg. If you are doing a fresh installation, make sure to install it as a standalone tool set.

If you want to download only Debugging Tools for Windows, install the Windows SDK, and, during the installation, select the Debugging Tools for Windows box and clear all the other boxes.

Once you have installed the debugging tools, run windbg (make sure to run the 64-bit version). To inspect the dotnet process, we will make a small change to the code and add a Console.ReadLine(). This is done to make sure the process does not terminate as soon as it finishes:

using System;

namespace dotnet
{
    class Program
    {
        static void Main(string[] args)
        {
            Console.WriteLine("Hello World!");
            Console.ReadLine();
        }
    }
}
  1. Run the console app using the dotnet run command
  2. When it runs successfully, it will launch the console app and will not exit, because of the Console.ReadLine() we added
  3. Now, launch windbg and attach to the dotnet.exe process as shown below

Make sure to launch windbg with the correct bitness (32-bit or 64-bit). You have to match the bitness of the process you are going to debug with the bitness of windbg.

Now, from the windbg menu, choose Attach to a Process (press F6) and find the running dotnet.exe.

Once you have attached, you should see the following:

Microsoft (R) Windows Debugger Version 6.12.0002.633 AMD64
 Copyright (c) Microsoft Corporation. All rights reserved.

*** wait with pending attach
 Symbol search path is: srv*c:\symcache*http://msdl.microsoft.com/download/symbols
 Executable search path is:
 ModLoad: 00000001`3fdc0000 00000001`3fde7000 C:\Program Files\dotnet\dotnet.exe
 ModLoad: 00000000`76e80000 00000000`7702a000 C:\windows\SYSTEM32\ntdll.dll
 ModLoad: 00000000`76c60000 00000000`76d7f000 C:\windows\system32\kernel32.dll
 ModLoad: 000007fe`fcbf0000 000007fe`fcc5a000 C:\windows\system32\KERNELBASE.dll
 ModLoad: 00000000`74860000 00000000`748ee000 C:\windows\System32\SYSFER.DLL
 ModLoad: 000007fe`fdb20000 000007fe`fdbfb000 C:\windows\system32\ADVAPI32.dll
 ModLoad: 000007fe`fea80000 000007fe`feb1f000 C:\windows\system32\msvcrt.dll
 ModLoad: 000007fe`fd310000 000007fe`fd32f000 C:\windows\SYSTEM32\sechost.dll
 ModLoad: 000007fe`feff0000 000007fe`ff11d000 C:\windows\system32\RPCRT4.dll
 ModLoad: 000007fe`f5430000 000007fe`f5434000 C:\windows\system32\api-ms-win-crt-runtime-l1-1-0.dll
 ModLoad: 000007fe`dee30000 000007fe`def24000 C:\windows\system32\ucrtbase.DLL
 ModLoad: 000007fe`f5420000 000007fe`f5423000 C:\windows\system32\api-ms-win-core-timezone-l1-1-0.dll
 ModLoad: 000007fe`f4410000 000007fe`f4413000 C:\windows\system32\api-ms-win-core-file-l2-1-0.dll
 ModLoad: 000007fe`f4400000 000007fe`f4403000 C:\windows\system32\api-ms-win-core-localization-l1-2-0.dll
 ModLoad: 000007fe`fba40000 000007fe`fba43000 C:\windows\system32\api-ms-win-core-synch-l1-2-0.dll
 ModLoad: 000007fe`f4360000 000007fe`f4363000 C:\windows\system32\api-ms-win-core-processthreads-l1-1-1.dll
 ModLoad: 000007fe`f4350000 000007fe`f4353000 C:\windows\system32\api-ms-win-core-file-l1-2-0.dll
 ModLoad: 000007fe`ea900000 000007fe`ea905000 C:\windows\system32\api-ms-win-crt-math-l1-1-0.dll
 ModLoad: 000007fe`f4100000 000007fe`f4103000 C:\windows\system32\api-ms-win-crt-heap-l1-1-0.dll
 ModLoad: 000007fe`ee870000 000007fe`ee874000 C:\windows\system32\api-ms-win-crt-convert-l1-1-0.dll
 ModLoad: 000007fe`f40f0000 000007fe`f40f4000 C:\windows\system32\api-ms-win-crt-stdio-l1-1-0.dll
 ModLoad: 000007fe`f42f0000 000007fe`f42f4000 C:\windows\system32\api-ms-win-crt-string-l1-1-0.dll
 ModLoad: 000007fe`ee620000 000007fe`ee623000 C:\windows\system32\api-ms-win-crt-locale-l1-1-0.dll
 ModLoad: 000007fe`ea8f0000 000007fe`ea8f5000 C:\windows\system32\api-ms-win-crt-multibyte-l1-1-0.dll
 ModLoad: 000007fe`d6b00000 000007fe`d6b51000 C:\Program Files\dotnet\host\fxr\2.0.5\hostfxr.dll
 ModLoad: 000007fe`e77e0000 000007fe`e77e3000 C:\windows\system32\api-ms-win-crt-filesystem-l1-1-0.dll
 ModLoad: 000007fe`d6a70000 000007fe`d6af9000 C:\Program Files\dotnet\shared\Microsoft.NETCore.App\2.0.5\hostpolicy.dll
 ModLoad: 000007fe`d1070000 000007fe`d15ba000 C:\Program Files\dotnet\shared\Microsoft.NETCore.App\2.0.5\coreclr.dll
 ModLoad: 000007fe`fd640000 000007fe`fd83c000 C:\windows\system32\ole32.dll
 ModLoad: 000007fe`ff120000 000007fe`ff187000 C:\windows\system32\GDI32.dll
 ModLoad: 00000000`76d80000 00000000`76e7a000 C:\windows\system32\USER32.dll
 ModLoad: 000007fe`fd3d0000 000007fe`fd3de000 C:\windows\system32\LPK.dll
 ModLoad: 000007fe`fd3e0000 000007fe`fd4ab000 C:\windows\system32\USP10.dll
 ModLoad: 000007fe`fe990000 000007fe`fea6a000 C:\windows\system32\OLEAUT32.dll
 ModLoad: 000007fe`fbc20000 000007fe`fbc2c000 C:\windows\system32\VERSION.dll
 ModLoad: 000007fe`fd290000 000007fe`fd301000 C:\windows\system32\SHLWAPI.dll
 ModLoad: 000007fe`fc4b0000 000007fe`fc4d2000 C:\windows\system32\bcrypt.dll
 ModLoad: 000007fe`de290000 000007fe`de293000 C:\Program Files\dotnet\shared\Microsoft.NETCore.App\2.0.5\api-ms-win-crt-utility-l1-1-0.dll
 ModLoad: 000007fe`ddb80000 000007fe`ddb83000 C:\Program Files\dotnet\shared\Microsoft.NETCore.App\2.0.5\api-ms-win-crt-time-l1-1-0.dll
 ModLoad: 000007fe`fd260000 000007fe`fd28e000 C:\windows\system32\IMM32.DLL
 ModLoad: 000007fe`fd4b0000 000007fe`fd5b9000 C:\windows\system32\MSCTF.dll
 ModLoad: 000007fe`c5330000 000007fe`c5e84000 C:\Program Files\dotnet\shared\Microsoft.NETCore.App\2.0.5\System.Private.CoreLib.dll
 ModLoad: 000007fe`f3ae0000 000007fe`f3b4f000 C:\windows\SYSTEM32\MSCOREE.DLL
 ModLoad: 00000000`00300000 00000000`00308000 D:\PROJECTS\dotnet\bin\Debug\netcoreapp2.0\dotnet.dll
 ModLoad: 000007fe`ddb70000 000007fe`ddb7d000 C:\Program Files\dotnet\shared\Microsoft.NETCore.App\2.0.5\System.Runtime.dll
 ModLoad: 000007fe`d1bb0000 000007fe`d1cbb000 C:\Program Files\dotnet\shared\Microsoft.NETCore.App\2.0.5\clrjit.dll
 ModLoad: 000007fe`d0e60000 000007fe`d0e87000 C:\Program Files\dotnet\shared\Microsoft.NETCore.App\2.0.5\System.Console.dll
 ModLoad: 000007fe`dc4e0000 000007fe`dc4f3000 C:\Program Files\dotnet\shared\Microsoft.NETCore.App\2.0.5\System.Threading.dll
 ModLoad: 000007fe`d1cc0000 000007fe`d1d34000 C:\Program Files\dotnet\shared\Microsoft.NETCore.App\2.0.5\System.Runtime.Extensions.dll
 (6018.7540): Break instruction exception - code 80000003 (first chance)

This means that windbg is successfully attached to our dotnet core process.

Now we can run some commands in windbg to get some internal details about

Getting the active threads running

For this run a simple command ~ in windbg

0:000> ~
 . 0 Id: 6018.1244 Suspend: 1 Teb: 000007ff`fffde000 Unfrozen
 1 Id: 6018.67e8 Suspend: 1 Teb: 000007ff`fffdc000 Unfrozen
 2 Id: 6018.550 Suspend: 1 Teb: 000007ff`fffda000 Unfrozen
 # 3 Id: 6018.7540 Suspend: 1 Teb: 000007ff`fffd8000 Unfrozen

Tilde (~) simply lists all the threads running, with their thread IDs. There are many variations you can use:

  • ~* - shows a little more information, like the top method currently executing (* means all threads)
  • ~*k - shows all the threads along with their stacks
  • ~<threadnum>s - switches to the thread number you specify
    • e.g. 0:000> ~1s
       ntdll!ZwWaitForMultipleObjects+0xa:
       00000000`76ecc2ea c3 ret
       0:001>
0:001> ~*k

0 Id: 6018.1244 Suspend: 1 Teb: 000007ff`fffde000 Unfrozen
 Child-SP RetAddr Call Site
 00000000`001adc48 00000000`76c818e8 ntdll!NtRequestWaitReplyPort+0xa
 00000000`001adc50 00000000`76cb57f1 kernel32!ConsoleClientCallServer+0x54
 00000000`001adc80 00000000`76cca9f2 kernel32!ReadConsoleInternal+0x1f1
 00000000`001addd0 00000000`76c97e64 kernel32!ReadConsoleA+0xb2
 00000000`001adeb0 000007fe`716d147f kernel32!TlsGetValue+0x81fe
 00000000`001adef0 000007fe`d0e78f65 0x7fe`716d147f
 00000000`001adfb0 000007fe`d0e78db3 System_Console+0x18f65
 00000000`001ae010 000007fe`d1d0dc6d System_Console+0x18db3
 00000000`001ae080 000007fe`d1d0e04a System_Runtime_Extensions+0x4dc6d
 00000000`001ae0d0 000007fe`d0e7d517 System_Runtime_Extensions+0x4e04a
 00000000`001ae120 000007fe`d0e752fa System_Console+0x1d517
 00000000`001ae170 000007fe`716d04b6 System_Console+0x152fa
 00000000`001ae1a0 000007fe`d11a35d3 0x7fe`716d04b6
 00000000`001ae1e0 000007fe`d10cd9bf coreclr!CallDescrWorkerInternal+0x83 [E:A_work1791ssrcvmamd64CallDescrWorkerAMD64.asm @ 101]
 00000000`001ae220 000007fe`d1193ef7 coreclr!MethodDescCallSite::CallTargetWorker+0x17b [e:a_work1791ssrcvmcallhelpers.cpp @ 653]
 00000000`001ae370 000007fe`d108b195 coreclr!RunMain+0x17f [e:a_work1791ssrcvmassembly.cpp @ 1849]
 00000000`001ae5d0 000007fe`d112ba29 coreclr!Assembly::ExecuteMainMethod+0xb5 [e:a_work1791ssrcvmassembly.cpp @ 1944]
 00000000`001ae890 000007fe`d112d9ce coreclr!CorHost2::ExecuteAssembly+0x149 [e:a_work1791ssrcvmcorhost.cpp @ 502]
 00000000`001ae960 000007fe`d6a8e8b9 coreclr!coreclr_execute_assembly+0xde [e:a_work1791ssrcdllsmscoreeunixinterface.cpp @ 407]
 00000000`001ae9f0 000007fe`d6a8ee44 hostpolicy!run+0xdb9
 00000000`001af0c0 000007fe`d6b19b05 hostpolicy!corehost_main+0x164
 00000000`001af240 000007fe`d6b1f42b hostfxr!execute_app+0x1f5
 00000000`001af310 000007fe`d6b1e819 hostfxr!fx_muxer_t::read_config_and_execute+0x94b
 00000000`001af9b0 000007fe`d6b1cc8d hostfxr!fx_muxer_t::parse_args_and_execute+0x409
 00000000`001afb40 00000001`3fdc9abc hostfxr!fx_muxer_t::execute+0x22d
 00000000`001afcd0 00000001`3fdce099 dotnet!wmain+0x46c
 00000000`001afde0 00000000`76c759cd dotnet!__scrt_common_main_seh+0x11d [f:ddvctoolscrtvcstartupsrcstartupexe_common.inl @ 253]
 00000000`001afe20 00000000`76eaa561 kernel32!BaseThreadInitThunk+0xd
 00000000`001afe50 00000000`00000000 ntdll!RtlUserThreadStart+0x1d

1 Id: 6018.67e8 Suspend: 1 Teb: 000007ff`fffdc000 Unfrozen
 Child-SP RetAddr Call Site
 00000000`0232f648 000007fe`fcbf1430 ntdll!ZwWaitForMultipleObjects+0xa
 00000000`0232f650 00000000`76c816e3 KERNELBASE!WaitForMultipleObjectsEx+0xe8
 00000000`0232f750 000007fe`d118b36a kernel32!WaitForMultipleObjectsExImplementation+0xb3
 00000000`0232f7e0 000007fe`d118b44e coreclr!DebuggerRCThread::MainLoop+0xce [e:a_work1791ssrcdebugeercthread.cpp @ 1241]
 00000000`0232f8a0 000007fe`d118ae8a coreclr!DebuggerRCThread::ThreadProc+0xd2 [e:a_work1791ssrcdebugeercthread.cpp @ 1042]
 00000000`0232f8f0 00000000`76c759cd coreclr!DebuggerRCThread::ThreadProcStatic+0x1a [e:a_work1791ssrcdebugeercthread.cpp @ 1642]
 00000000`0232f920 00000000`76eaa561 kernel32!BaseThreadInitThunk+0xd
 00000000`0232f950 00000000`00000000 ntdll!RtlUserThreadStart+0x1d

2 Id: 6018.550 Suspend: 1 Teb: 000007ff`fffda000 Unfrozen
 Child-SP RetAddr Call Site
 00000000`1a94f5f8 000007fe`fcbf1430 ntdll!ZwWaitForMultipleObjects+0xa
 00000000`1a94f600 00000000`76c816e3 KERNELBASE!WaitForMultipleObjectsEx+0xe8
 00000000`1a94f700 000007fe`d1176361 kernel32!WaitForMultipleObjectsExImplementation+0xb3
 00000000`1a94f790 000007fe`d1175de2 coreclr!FinalizerThread::WaitForFinalizerEvent+0x85 [e:a_work1791ssrcvmfinalizerthread.cpp @ 469]
 00000000`1a94f7d0 000007fe`d10cd66b coreclr!FinalizerThread::FinalizerThreadWorker+0x62 [e:a_work1791ssrcvmfinalizerthread.cpp @ 587]
 00000000`1a94f830 000007fe`d10cd586 coreclr!ManagedThreadBase_DispatchInner+0x43 [e:a_work1791ssrcvmthreads.cpp @ 9204]
 00000000`1a94f870 000007fe`d10cd498 coreclr!ManagedThreadBase_DispatchMiddle+0x82 [e:a_work1791ssrcvmthreads.cpp @ 9253]
 00000000`1a94f9d0 000007fe`d117587c coreclr!ManagedThreadBase_DispatchOuter+0xb4 [e:a_work1791ssrcvmthreads.cpp @ 9492]
 00000000`1a94fa80 000007fe`d11773fb coreclr!FinalizerThread::FinalizerThreadStart+0x9c [e:a_work1791ssrcvmfinalizerthread.cpp @ 774]
 00000000`1a94fb20 00000000`76c759cd coreclr!Thread::intermediateThreadProc+0x8b [e:a_work1791ssrcvmthreads.cpp @ 2594]
 00000000`1a94fbe0 00000000`76eaa561 kernel32!BaseThreadInitThunk+0xd
 00000000`1a94fc10 00000000`00000000 ntdll!RtlUserThreadStart+0x1d

# 3 Id: 6018.7540 Suspend: 1 Teb: 000007ff`fffd8000 Unfrozen
 Child-SP RetAddr Call Site
 00000000`1ac7fc28 00000000`76f72e08 ntdll!DbgBreakPoint
 00000000`1ac7fc30 00000000`76c759cd ntdll!DbgUiRemoteBreakin+0x38
 00000000`1ac7fc60 00000000`76eaa561 kernel32!BaseThreadInitThunk+0xd
 00000000`1ac7fc90 00000000`00000000 ntdll!RtlUserThreadStart+0x1d

We see four threads running in an idle state, and the call stacks of those threads. But these windbg commands only show native stacks; they do not show managed threads or stacks. To make windbg understand the CLR and managed threads, we have to load a windbg debugging extension DLL. In this case we are going to use a DLL called SOS.

There is an SOS.dll for every version and bitness of the .NET Framework (.NET 1.1, 2.0, 4.0, etc.). So for .NET Core process debugging we need to use .NET Core's sos.dll. The good part is that sos.dll ships with dotnet and you will find it at:

64-bit: C:\Program Files\dotnet\shared\Microsoft.NETCore.App\2.0.5\sos.dll 

32-bit: C:\Program Files (x86)\dotnet\shared\Microsoft.NETCore.App\2.0.5\sos.dll

Now, to load any extension into windbg, we have to use the .load command:

.load C:\Program Files\dotnet\shared\Microsoft.NETCore.App\2.0.5\sos

You can also use the alternative command .loadby:

0:001> .loadby sos coreclr

What this does is automatically find the sos.dll path next to the coreclr dll that is already loaded into the dotnet.exe process, and load it from there. Once it is loaded, you can list all the commands using !help:

0:001> !help
 -------------------------------------------------------------------------------
 SOS is a debugger extension DLL designed to aid in the debugging of managed
 programs. Functions are listed by category, then roughly in order of
 importance. Shortcut names for popular functions are listed in parenthesis.
 Type "!help <functionname>" for detailed info on that function.

Object Inspection Examining code and stacks
 ----------------------------- -----------------------------
 DumpObj (do) Threads
 DumpArray (da) ThreadState
 DumpStackObjects (dso) IP2MD
 DumpHeap U
 DumpVC DumpStack
 GCRoot EEStack
 ObjSize CLRStack
 FinalizeQueue GCInfo
 PrintException (pe) EHInfo
 TraverseHeap BPMD
 COMState

Examining CLR data structures Diagnostic Utilities
 ----------------------------- -----------------------------
 DumpDomain VerifyHeap
 EEHeap VerifyObj
 Name2EE FindRoots
 SyncBlk HeapStat
 DumpMT GCWhere
 DumpClass ListNearObj (lno)
 DumpMD GCHandles
 Token2EE GCHandleLeaks
 EEVersion FinalizeQueue (fq)
 DumpModule FindAppDomain
 ThreadPool SaveModule
 DumpAssembly ProcInfo
 DumpSigElem StopOnException (soe)
 DumpRuntimeTypes DumpLog
 DumpSig VMMap
 RCWCleanupList VMStat
 DumpIL MinidumpMode
 DumpRCW AnalyzeOOM (ao)
 DumpCCW

Examining the GC history Other
 ----------------------------- -----------------------------
 HistInit FAQ
 HistRoot
 HistObj
 HistObjFind
 HistClear

Don't worry about all these commands; we are going to use only a handful of them, mainly !threads, !clrstack and !dumpheap.

Please note that all extension commands start with '!'. So for the commands inside sos.dll we use !threads, !clrstack, etc.

 

Getting managed threads and stacks

0:001> !threads
 ThreadCount: 2
 UnstartedThread: 0
 BackgroundThread: 1
 PendingThread: 0
 DeadThread: 0
 Hosted Runtime: no
 Lock
 ID OSID ThreadOBJ State GC Mode GC Alloc Context Domain Count Apt Exception
 0 1 1244 00000000005cb900 20020 Preemptive 0000000002355278:00000000023561C0 0000000000433400 1 Ukn
 2 2 550 00000000005f1fb0 21220 Preemptive 0000000000000000:0000000000000000 0000000000433400 0 Ukn (Finalizer)

Once you see the managed threads, you can switch to a thread with ~<threadnumber>s.

To switch to the first thread, use ~0s:

0:001> ~0s
 ntdll!NtRequestWaitReplyPort+0xa:
 00000000`76ecbf5a c3 ret
 0:000> !clrstack
 OS Thread Id: 0x1244 (0)
 Child SP IP Call Site
 00000000001adf20 0000000076ecbf5a [InlinedCallFrame: 00000000001adf20] Interop+Kernel32.ReadFile(IntPtr, Byte*, Int32, Int32 ByRef, IntPtr)
 00000000001adf20 000007fe716d147f [InlinedCallFrame: 00000000001adf20] Interop+Kernel32.ReadFile(IntPtr, Byte*, Int32, Int32 ByRef, IntPtr)
 00000000001adef0 000007fe716d147f DomainBoundILStubClass.IL_STUB_PInvoke(IntPtr, Byte*, Int32, Int32 ByRef, IntPtr)
 00000000001adfb0 000007fed0e78f65 System.ConsolePal+WindowsConsoleStream.ReadFileNative(IntPtr, Byte[], Int32, Int32, Boolean, Int32 ByRef, Boolean) [E:A_work1439scorefxsrcSystem.ConsolesrcSystemConsolePal.Windows.cs @ 1170]
 00000000001ae010 000007fed0e78db3 System.ConsolePal+WindowsConsoleStream.Read(Byte[], Int32, Int32) [E:A_work1439scorefxsrcSystem.ConsolesrcSystemConsolePal.Windows.cs @ 1121]
 00000000001ae080 000007fed1d0dc6d System.IO.StreamReader.ReadBuffer() [E:A_work1439scorefxsrcSystem.Runtime.ExtensionssrcSystemIOStreamReader.cs @ 627]
 00000000001ae0d0 000007fed1d0e04a System.IO.StreamReader.ReadLine() [E:A_work1439scorefxsrcSystem.Runtime.ExtensionssrcSystemIOStreamReader.cs @ 802]
 00000000001ae120 000007fed0e7d517 System.IO.SyncTextReader.ReadLine() [E:A_work1439scorefxsrcSystem.ConsolesrcSystemIOSyncTextReader.cs @ 78]
 00000000001ae170 000007fed0e752fa System.Console.ReadLine() [E:A_work1439scorefxsrcSystem.ConsolesrcSystemConsole.cs @ 474]
 00000000001ae1a0 000007fe716d04b6 dotnet.Program.Main(System.String[]) [D:PROJECTSdotnetProgram.cs @ 10]
 00000000001ae418 000007fed11a35d3 [GCFrame: 00000000001ae418]
 00000000001ae8f8 000007fed11a35d3 [GCFrame: 00000000001ae8f8]

Now we can check what the native stack looks like by running k:

0:000> k
Child-SP RetAddr Call Site
00000000`001adc48 00000000`76c818e8 ntdll!NtRequestWaitReplyPort+0xa
00000000`001adc50 00000000`76cb57f1 kernel32!ConsoleClientCallServer+0x54
00000000`001adc80 00000000`76cca9f2 kernel32!ReadConsoleInternal+0x1f1
00000000`001addd0 00000000`76c97e64 kernel32!ReadConsoleA+0xb2
00000000`001adeb0 000007fe`716d147f kernel32!TlsGetValue+0x81fe
00000000`001adef0 000007fe`d0e78f65 0x7fe`716d147f
00000000`001adfb0 000007fe`d0e78db3 System_Console+0x18f65
00000000`001ae010 000007fe`d1d0dc6d System_Console+0x18db3
00000000`001ae080 000007fe`d1d0e04a System_Runtime_Extensions+0x4dc6d
00000000`001ae0d0 000007fe`d0e7d517 System_Runtime_Extensions+0x4e04a
00000000`001ae120 000007fe`d0e752fa System_Console+0x1d517
00000000`001ae170 000007fe`716d04b6 System_Console+0x152fa
00000000`001ae1a0 000007fe`d11a35d3 0x7fe`716d04b6
00000000`001ae1e0 000007fe`d10cd9bf coreclr!CallDescrWorkerInternal+0x83 [E:A_work1791ssrcvmamd64CallDescrWorkerAMD64.asm @ 101]
00000000`001ae220 000007fe`d1193ef7 coreclr!MethodDescCallSite::CallTargetWorker+0x17b [e:a_work1791ssrcvmcallhelpers.cpp @ 653]
00000000`001ae370 000007fe`d108b195 coreclr!RunMain+0x17f [e:a_work1791ssrcvmassembly.cpp @ 1849]
00000000`001ae5d0 000007fe`d112ba29 coreclr!Assembly::ExecuteMainMethod+0xb5 [e:a_work1791ssrcvmassembly.cpp @ 1944]
00000000`001ae890 000007fe`d112d9ce coreclr!CorHost2::ExecuteAssembly+0x149 [e:a_work1791ssrcvmcorhost.cpp @ 502]
00000000`001ae960 000007fe`d6a8e8b9 coreclr!coreclr_execute_assembly+0xde [e:a_work1791ssrcdllsmscoreeunixinterface.cpp @ 407]
00000000`001ae9f0 000007fe`d6a8ee44 hostpolicy!run+0xdb9
00000000`001af0c0 000007fe`d6b19b05 hostpolicy!corehost_main+0x164
00000000`001af240 000007fe`d6b1f42b hostfxr!execute_app+0x1f5
00000000`001af310 000007fe`d6b1e819 hostfxr!fx_muxer_t::read_config_and_execute+0x94b
00000000`001af9b0 000007fe`d6b1cc8d hostfxr!fx_muxer_t::parse_args_and_execute+0x409
00000000`001afb40 00000001`3fdc9abc hostfxr!fx_muxer_t::execute+0x22d
00000000`001afcd0 00000001`3fdce099 dotnet!wmain+0x46c
00000000`001afde0 00000000`76c759cd dotnet!__scrt_common_main_seh+0x11d [f:ddvctoolscrtvcstartupsrcstartupexe_common.inl @ 253]
00000000`001afe20 00000000`76eaa561 kernel32!BaseThreadInitThunk+0xd
00000000`001afe50 00000000`00000000 ntdll!RtlUserThreadStart+0x1d

 

As you can see, the managed stack and the native stack look different; this is because the CLR abstracts away many details of the actual machine-level execution.

The hostfxr!execute_app method executes your .NET code. Once it loads coreclr, everything that happens is on .NET terms: coreclr loads the .NET dlls and all the referenced dlls and executes your code.

As you can see, there are two managed threads running (from the output of the !threads command); let's see what the other thread is doing.

0:000> ~2s
 ntdll!ZwWaitForMultipleObjects+0xa:
 00000000`76ecc2ea c3 ret
 0:002> !clrstack
 OS Thread Id: 0x550 (2)
 Child SP IP Call Site
 000000001a94fa00 0000000076ecc2ea [DebuggerU2MCatchHandlerFrame: 000000001a94fa00]

We do not see any managed call stack even though it is a managed thread. So let's see what it actually is by looking at the native call stack:

0:002> k
 Child-SP RetAddr Call Site
 00000000`1a94f5f8 000007fe`fcbf1430 ntdll!ZwWaitForMultipleObjects+0xa
 00000000`1a94f600 00000000`76c816e3 KERNELBASE!WaitForMultipleObjectsEx+0xe8
 00000000`1a94f700 000007fe`d1176361 kernel32!WaitForMultipleObjectsExImplementation+0xb3
 00000000`1a94f790 000007fe`d1175de2 coreclr!FinalizerThread::WaitForFinalizerEvent+0x85 [e:a_work1791ssrcvmfinalizerthread.cpp @ 469]
 00000000`1a94f7d0 000007fe`d10cd66b coreclr!FinalizerThread::FinalizerThreadWorker+0x62 [e:a_work1791ssrcvmfinalizerthread.cpp @ 587]
 00000000`1a94f830 000007fe`d10cd586 coreclr!ManagedThreadBase_DispatchInner+0x43 [e:a_work1791ssrcvmthreads.cpp @ 9204]
 00000000`1a94f870 000007fe`d10cd498 coreclr!ManagedThreadBase_DispatchMiddle+0x82 [e:a_work1791ssrcvmthreads.cpp @ 9253]
 00000000`1a94f9d0 000007fe`d117587c coreclr!ManagedThreadBase_DispatchOuter+0xb4 [e:a_work1791ssrcvmthreads.cpp @ 9492]
 00000000`1a94fa80 000007fe`d11773fb coreclr!FinalizerThread::FinalizerThreadStart+0x9c [e:a_work1791ssrcvmfinalizerthread.cpp @ 774]
 00000000`1a94fb20 00000000`76c759cd coreclr!Thread::intermediateThreadProc+0x8b [e:a_work1791ssrcvmthreads.cpp @ 2594]
 00000000`1a94fbe0 00000000`76eaa561 kernel32!BaseThreadInitThunk+0xd
 00000000`1a94fc10 00000000`00000000 ntdll!RtlUserThreadStart+0x1d

As you can see from the stack, it is the finalizer thread.
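As a reminder of the kind of work this thread performs: any type with a finalizer gets queued for it after a garbage collection. A minimal sketch (illustrative only, not code from this walkthrough):

using System;

class FinalizerDemo
{
    static void Main()
    {
        CreateAndDrop();
        GC.Collect();                  // make the object eligible for finalization
        GC.WaitForPendingFinalizers(); // the finalizer thread runs ~NativeHandleWrapper here
        Console.WriteLine("finalized");
    }

    static void CreateAndDrop()
    {
        var wrapper = new NativeHandleWrapper();
        // wrapper becomes unreachable when this method returns
    }
}

class NativeHandleWrapper
{
    private IntPtr _handle = new IntPtr(42);

    // Runs on the dedicated finalizer thread shown in the stack above
    // (FinalizerThread::FinalizerThreadWorker).
    ~NativeHandleWrapper()
    {
        _handle = IntPtr.Zero;   // release the native resource here
    }
}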

Now let's move on to the .NET objects in the heap. For this, we will use !dumpheap.

0:002> !dumpheap
 Statistics:
 MT Count TotalSize Class Name
 000007fec5cd6c68 1 24 System.Collections.Generic.GenericEqualityComparer`1[[System.Char, System.Private.CoreLib]]
 000007fec5ccb060 1 24 System.Environment+<>c
 000007fec5cc5130 1 24 System.Collections.Generic.GenericEqualityComparer`1[[System.String, System.Private.CoreLib]]
 000007fec5cbe1e0 1 24 System.Reflection.Missing
 000007fec5cb8c58 1 24 System.Security.Policy.ApplicationTrust
 000007fec5cb6e98 1 24 System.Diagnostics.Tracing.EtwEventProvider
 000007fec5cb4168 1 24 System.OrdinalIgnoreCaseComparer
 000007fec5cb4068 1 24 System.OrdinalCaseSensitiveComparer
 000007fec5cb1bd8 1 24 System.SharedStatics
 000007fec5ca00f8 1 24 System.WeakReference
 000007fec5c91228 1 24 System.Collections.Generic.NonRandomizedStringEqualityComparer
 000007fec5c88388 1 24 System.Boolean
 000007fec5c53388 1 24 System.Attribute[]
 000007fec537f038 1 24 System.Collections.Generic.Dictionary`2+KeyCollection[[System.String, System.Private.CoreLib],[System.Object, System.Private.CoreLib]]
 000007fe7157cdb8 1 24 System.IO.SyncTextReader
 000007fe715790d8 1 24 System.Console+<>c
 000007fe71566fe0 1 24 System.Collections.Generic.ObjectEqualityComparer`1[[System.RuntimeType, System.Private.CoreLib]]
 000007fec5c62708 1 26 System.Globalization.CalendarId[]
 000007fec5c53260 1 31 System.Boolean[]
 000007fec5ce2358 1 32 System.Buffers.TlsOverPerCoreLockedStacksArrayPool`1[[System.Char, System.Private.CoreLib]]
 000007fec5ccc430 1 32 System.IO.Stream+NullStream
 000007fec5c9d0a8 1 32 System.Diagnostics.Tracing.ActivityTracker
 000007fec5c8da20 1 32 System.Guid
 000007fec5c59028 1 32 System.Reflection.RuntimePropertyInfo[]
 000007fec5c98ef0 1 40 System.Collections.Generic.List`1+Enumerator[[System.String, System.Private.CoreLib]]
 000007fec5c54658 1 40 System.WeakReference[]
 000007fec5378e28 1 40 System.Collections.Generic.List`1[[System.WeakReference, System.Private.CoreLib]]
 000007fec5377168 1 40 System.Collections.Generic.List`1[[System.String, System.Private.CoreLib]]
 000007fe7157bc50 1 40 System.IO.TextWriter+NullTextWriter
 000007fe715799e0 1 40 Interop+InputRecord
 000007fe71567680 1 40 System.Reflection.CerHashtable`2+Table[[System.String, System.Private.CoreLib],[System.Reflection.RuntimePropertyInfo[], System.Private.CoreLib]]
 000007fe715668c0 1 40 System.Collections.Generic.Dictionary`2+KeyCollection+Enumerator[[System.String, System.Private.CoreLib],[System.Object, System.Private.CoreLib]]
 000007fec5cce310 1 48 System.Text.Encoding+DefaultDecoder
 000007fec5c59768 2 48 System.Reflection.ParameterInfo[]
 000007fe7157c0f8 1 48 System.IO.SyncTextWriter
 000007fe7157be68 1 48 System.Text.OSEncoder
 000007fec5c92158 1 56 System.RuntimeType+RuntimeTypeCache+MemberInfoCache`1[[System.Reflection.RuntimePropertyInfo, System.Private.CoreLib]]
 000007fec5c91d58 1 56 System.RuntimeType+RuntimeTypeCache+MemberInfoCache`1[[System.Reflection.RuntimeMethodInfo, System.Private.CoreLib]]
 000007fec5c8f3e8 1 56 System.Reflection.RuntimeAssembly
 000007fec5c88d10 1 56 System.Globalization.CompareInfo
 000007fec5cb3f58 2 64 System.CultureAwareComparer
 000007fec5ca0e78 2 64 System.LazyHelper
 000007fec5c963c8 1 64 System.Reflection.RuntimeModule
 000007fec5c49988 1 64 Microsoft.Win32.UnsafeNativeMethods+ManifestEtw+EtwEnableCallback
 000007fe7157c6e0 1 64 System.Func`1[[System.IO.TextReader, System.Runtime.Extensions]]
 000007fe71579188 1 64 System.Func`1[[System.IO.TextWriter, System.Runtime.Extensions]]
 000007fe71567188 2 64 System.Diagnostics.Tracing.EventSourceAttribute[]
 000007fec5ca64e0 3 72 System.IntPtr
 000007fec5cb7070 2 80 System.Diagnostics.Tracing.EventSourceAttribute
 000007fec5c9d478 2 80 System.Lazy`1[[System.Boolean, System.Private.CoreLib]]
 000007fec537a308 1 80 System.Collections.Generic.Dictionary`2[[System.String, System.Private.CoreLib],[System.Globalization.CultureData, System.Private.CoreLib]]
 000007fec53799f0 1 80 System.Collections.Generic.Dictionary`2[[System.RuntimeType, System.Private.CoreLib],[System.RuntimeType, System.Private.CoreLib]]
 000007fe715677c8 1 80 System.Reflection.RuntimePropertyInfo[][]
 000007fec5cce920 2 96 System.Text.UTF8Encoding+UTF8EncodingSealed
 000007fec5ccb5e8 1 96 System.Diagnostics.Tracing.EventSource+OverideEventProvider
 000007fec5c587e0 2 96 System.Reflection.CustomAttributeRecord[]
 000007fec5c54280 3 96 System.IntPtr[]
 000007fe7157cbc0 1 96 System.IO.StreamReader
 000007fe715662b8 1 96 System.Collections.Generic.Dictionary`2+Entry[[System.String, System.Private.CoreLib],[System.Globalization.CultureData, System.Private.CoreLib]][]
 000007fec5c92b58 1 104 System.Reflection.RuntimePropertyInfo
 000007fe7157b9d0 1 104 System.IO.StreamWriter
 000007fec5cc38f0 2 112 System.Text.UnicodeEncoding
 000007fe7157b4b8 2 112 System.Text.ConsoleEncoding
 000007fe71579e10 2 112 System.ConsolePal+WindowsConsoleStream
 000007fec5c590b8 3 120 System.Reflection.RuntimeMethodInfo[]
 000007fec5c9ba08 4 128 System.Text.DecoderReplacementFallback
 000007fec5c9b9a8 4 128 System.Text.EncoderReplacementFallback
 000007fec5c8b930 2 128 System.Globalization.TextInfo
 000007fec5c44c10 2 128 System.Func`1[[System.Boolean, System.Private.CoreLib]]
 000007fe7157b0a0 2 128 System.Text.OSEncoding
 000007fe71567c28 2 128 System.Func`1[[System.Text.Encoding, System.Private.CoreLib]]
 000007fec5cb81b0 1 152 System.Buffers.ArrayPoolEventSource
 000007fec5cb3db8 1 152 System.StackOverflowException
 000007fec5cb2f68 1 152 System.ExecutionEngineException
 000007fec5ca5ba0 1 152 System.OutOfMemoryException
 000007fec5c8a718 1 152 System.Exception
 000007fec5c88340 1 152 System.AppDomain
 000007fec5cc2cb8 4 160 System.Text.InternalEncoderBestFitFallback
 000007fec5c90d98 1 160 System.Globalization.CalendarData
 000007fec5c6d230 1 160 System.Char[][]
 000007fec5c55b50 1 160 System.Buffers.TlsOverPerCoreLockedStacksArrayPool`1+PerCoreLockedStacks[[System.Char, System.Private.CoreLib]][]
 000007fec5c8af30 7 168 System.Object
 000007fec5cb19e8 2 176 System.RuntimeMethodInfoStub
 0000000000483380 7 186 Free
 000007fec5cc2770 4 192 System.Text.InternalDecoderBestFitFallback
 000007fec5c8b5a0 4 192 System.Text.StringBuilder
 000007fec5c2fba8 3 192 System.Reflection.MemberFilter
 000007fec5c923c0 2 208 System.Reflection.RuntimeMethodInfo
 000007fec5c61260 1 208 System.Globalization.CalendarData[]
 000007fec5c54ec8 7 216 System.Type[]
 000007fec5c96bd8 3 240 System.Signature
 000007fe71566b68 1 288 System.Collections.Generic.Dictionary`2+Entry[[System.RuntimeType, System.Private.CoreLib],[System.RuntimeType, System.Private.CoreLib]][]
 000007fec5cb9c80 2 304 System.Threading.ThreadAbortException
 000007fec5c91cb8 2 304 System.RuntimeType+RuntimeTypeCache
 000007fec5c9c190 3 312 System.AppDomainSetup
 000007fec5379db0 4 320 System.Collections.Generic.Dictionary`2[[System.String, System.Private.CoreLib],[System.Object, System.Private.CoreLib]]
 000007fec5c88be8 3 336 System.Globalization.CultureInfo
 000007fec5c546f8 1 364 System.UInt32[]
 000007fec5c53e28 10 384 System.RuntimeType[]
 000007fec5c54158 13 940 System.Int32[]
 000007fec5c8b498 2 944 System.Globalization.CultureData
 000007fe71566648 6 1008 System.Collections.Generic.Dictionary`2+Entry[[System.String, System.Private.CoreLib],[System.Object, System.Private.CoreLib]][]
 000007fec5c8dd70 36 1440 System.RuntimeType
 000007fec5c528e8 24 1728 System.String[]
 000007fec5c53050 7 4148 System.Byte[]
 000007fec5c52ca8 10 17776 System.Object[]
 000007fec5c567e0 14 31538 System.Char[]
 000007fec5c87be8 320 94632 System.String

 

We see that there are around 600 objects. We can get more details about !dumpheap by asking for its help text with the !help command.

0:002> !help dumpheap
-------------------------------------------------------------------------------
!DumpHeap [-stat]
          [-strings]
          [-short]
          [-min <size>]
          [-max <size>]
          [-live]
          [-dead]
          [-thinlock]
          [-startAtLowerBound]
          [-mt <MethodTable address>]
          [-type <partial type name>]
          [start [end]]
!DumpHeap is a powerful command that traverses the garbage collected heap, collecting statistics about objects. With its various options, it can look for particular types, restrict to a range, or look for ThinLocks (see the !SyncBlk documentation). Finally, it will provide a warning if it detects excessive fragmentation in the GC heap.
When called without options, the output is first a list of objects in the heap, followed by a report listing all the types found, their size and number:
0:000> !dumpheap
 Address       MT     Size
 00a71000 0015cde8       12 Free
 00a7100c 0015cde8       12 Free
 00a71018 0015cde8       12 Free
 00a71024 5ba58328       68
 00a71068 5ba58380       68
 00a710ac 5ba58430       68
 00a710f0 5ba5dba4       68
 ...
 total 619 objects
 Statistics:
       MT    Count TotalSize Class Name
 5ba7607c        1        12 System.Security.Permissions.HostProtectionResource
 5ba75d54        1        12 System.Security.Permissions.SecurityPermissionFlag
 5ba61f18        1        12 System.Collections.CaseInsensitiveComparer
 ...
 0015cde8        6     10260      Free
 5ba57bf8      318     18136 System.String
 ...
"Free" objects are simply regions of space the garbage collector can use later.If 30% or more of the heap contains "Free" objects, the process may suffer fromheap fragmentation. This is usually caused by pinning objects for a long time combined with a high rate of allocation. Here is example output where !DumpHeapprovides a warning about fragmentation:
<After the Statistics section>
 Fragmented blocks larger than 1MB:
     Addr     Size Followed by
 00a780c0    1.5MB    00bec800 System.Byte[]
 00da4e38    1.2MB    00ed2c00 System.Byte[]
 00f16df0    1.2MB    01044338 System.Byte[]
The arguments in detail:
-stat     Restrict the output to the statistical type summary
-strings  Restrict the output to a statistical string value summary
-short    Limits output to just the address of each object. This allows you
          to easily pipe output from the command to another debugger
          command for automation.
-min      Ignore objects less than the size given in bytes
-max      Ignore objects larger than the size given in bytes
-live     Only print live objects
-dead     Only print dead objects (objects which will be collected in the
          next full GC)
-thinlock Report on any ThinLocks (an efficient locking scheme, see !SyncBlk
          documentation for more info)
-startAtLowerBound
          Force heap walk to begin at lower bound of a supplied address range.
          (During plan phase, the heap is often not walkable because objects
          are being moved. In this case, DumpHeap may report spurious errors,
          in particular bad objects. It may be possible to traverse more of
          the heap after the reported bad object. Even if you specify an
          address range, !DumpHeap will start its walk from the beginning of
          the heap by default. If it finds a bad object before the specified
          range, it will stop before displaying the part of the heap in which
          you are interested. This switch will force !DumpHeap to begin its
          walk at the specified lower bound. You must supply the address of a
          good object as the lower bound for this to work. Display memory at
          the address of the bad object to manually find the next method
          table (use !dumpmt to verify). If the GC is currently in a call to
          memcopy, you may also be able to find the next object's address by
          adding the size to the start address given as parameters.)
-mt       List only those objects with the MethodTable given
-type     List only those objects whose type name is a substring match of the
          string provided.
start     Begin listing from this address
end       Stop listing at this address
A special note about -type: Often, you'd like to find not only Strings, but System.Object arrays that are constrained to contain Strings. ("new String[100]" actually creates a System.Object array, but it can only hold System.String object pointers.) You can use -type in a special way to find these arrays. Just pass "-type System.String[]" and those Object arrays will be returned. More generally, "-type <Substring of interesting type>[]".
The start/end parameters can be obtained from the output of !EEHeap -gc. For example, if you only want to list objects in the large heap segment:
0:000> !eeheap -gc
 Number of GC Heaps: 1
 generation 0 starts at 0x00c32754
 generation 1 starts at 0x00c32748
 generation 2 starts at 0x00a71000
 segment    begin allocated     size
 00a70000 00a71000  010443a8 005d33a8(6108072)
 Large object heap starts at 0x01a71000
 segment    begin allocated     size
 01a70000 01a71000  01a75000 0x00004000(16384)
 Total Size  0x5d73a8(6124456)
 ------------------------------
 GC Heap Size  0x5d73a8(6124456)

 0:000> !dumpheap 1a71000 1a75000
 Address       MT     Size
 01a71000 5ba88bd8     2064
 01a71810 0019fe48     2032 Free
 01a72000 5ba88bd8     4096
 01a73000 0019fe48     4096 Free
 01a74000 5ba88bd8     4096
 total 5 objects
 Statistics:
       MT    Count TotalSize Class Name
 0019fe48        2      6128      Free
 5ba88bd8        3     10256 System.Object[]
 Total 5 objects
Finally, if GC heap corruption is present, you may see an error like this:
0:000> !dumpheap -stat
 object 00a73d24: does not have valid MT
 curr_object : 00a73d24
 Last good object: 00a73d14
 ----------------
That indicates a serious problem. See the help for !VerifyHeap for more information on diagnosing the cause.

 

We can use the !dumpheap command to look for memory-leak issues in our application.
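For instance, a common leak pattern that !dumpheap makes obvious is a static collection that only ever grows; an ever-increasing object count for one type across successive dumps is the telltale sign. A minimal sketch (not code from this walkthrough; the Order type is hypothetical):

using System;
using System.Collections.Generic;

class LeakDemo
{
    // Everything added here stays reachable for the lifetime of the process,
    // so repeated dumps show an ever-growing count for the Order type.
    private static readonly List<Order> _cache = new List<Order>();

    static void Main()
    {
        for (int i = 0; i < 100000; i++)
        {
            _cache.Add(new Order { Id = i });   // added but never removed
        }
        Console.ReadLine();   // keep the process alive so a dump can be taken
    }
}

class Order
{
    public int Id { get; set; }
}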


 

The following are different use cases of the !dumpheap command.

  • To get all the strings loaded into our application
0:002> !dumpheap -strings
 00000000023313f0 000007fec5c87be8 26
 00000000023314c0 000007fec5c87be8 42
 0000000002331600 000007fec5c87be8 94
 0000000002331680 000007fec5c87be8 46
 00000000023316b0 000007fec5c87be8 74
 0000000002331700 000007fec5c87be8 40
 00000000023317e8 000007fec5c87be8 80
 0000000002331838 000007fec5c87be8 27448
 0000000002338370 000007fec5c87be8 84
 00000000023383c8 000007fec5c87be8 146
 0000000002338460 000007fec5c87be8 72
 00000000023384a8 000007fec5c87be8 68
 00000000023384f0 000007fec5c87be8 98
 0000000002338558 000007fec5c87be8 78
 00000000023385a8 000007fec5c87be8 112
 0000000002338618 000007fec5c87be8 70

===============trimmed=====================

42 1 HH:mm:ss
 42 1 November
 42 1 Saturday
 42 1 Thursday
 42 1 encoding
 42 1 hh:mm tt
 42 1 December
 42 1 February
 42 1 Infinity
 42 1 Internet
 42 1 JIT_PATH
 44 1 September
 44 1 FullTrust
 44 1 yyyy MMMM
 44 1 -Infinity
 44 1 APP_PATHS
 44 1 Wednesday
 46 1 MM/dd/yyyy
 46 1 yyyy-MM-dd
 48 1 MultiDomain
 50 1 NotSpecified
 50 1 FX_DEPS_FILE
 50 1 APP_NI_PATHS
 50 1 SingleDomain
 50 1 Hello World!
 56 1 MultiDomainHost
 60 1 Invariant Country
 62 1 Gregorian Calendar
 62 1 Invariant Language
 62 1 dddd, dd MMMM yyyy
 64 1 PROBING_DIRECTORIES
 64 1 LOADER_OPTIMIZATION
 66 1 ArrayPoolEventSource
 68 1 !x-sys-default-locale
 68 2 Name
 68 1 APP_LOCAL_WINMETADATA
 68 1 RFLCT_InvalidPropFail
 70 1 RFLCT_InvalidFieldFail
 70 1 APP_CONTEXT_DEPS_FILES
 72 2 en-us
 72 2 bytes
 72 2 chars
 78 1 APP_CONTEXT_BASE_DIRECTORY
 80 1 International Monetary Fund
 86 1 System.Globalization.Invariant
 88 2 charCount
 88 2 byteCount
 88 2 charIndex
 90 1 UseRandomizedStringHashAlgorithm
 92 2 dotnet.exe
 96 1 SYSTEM.BUFFERS.ARRAYPOOLEVENTSOURCE
 98 1 UseLatestBehaviorWhenTFMNotSpecified
 102 1 Invariant Language (Invariant Country)
 108 3 en-US
 112 1 D:PROJECTSdotnetbinDebugnetcoreapp2.0
 122 1 System.Diagnostics.Eventing.FrameworkEventSource
 130 1 ERROR: Exception during construction of EventSource
 136 2 AppDomainCompatSwitch
 144 2 PLATFORM_RESOURCE_ROOTS
 146 1 C:Program FilesdotnetsharedMicrosoft.NETCore.App2.0.5;
 148 2 C:Program Filesdotnet
 160 2 TRUSTED_PLATFORM_ASSEMBLIES
 164 1 C:Program FilesdotnetsharedMicrosoft.NETCore.App2.0.5clrj
 168 1 C:Program FilesdotnetsharedMicrosoft.NETCore.App2.0.5msco
 168 2 NATIVE_DLL_SEARCH_DIRECTORIES
 174 1 C:Program FilesdotnetsharedMicrosoft.NETCore.App2.0.5Wind
 174 1 C:Program FilesdotnetsharedMicrosoft.NETCore.App2.0.5SOS.
 174 1 C:Program FilesdotnetsharedMicrosoft.NETCore.App2.0.5nets
 188 2 C:Program Filesdotnetdotnet.exe
 192 2 System.Buffers.ArrayPoolEventSource
 264 2 D:PROJECTSdotnetbinDebugnetcoreapp2.0dotnet.dll
 298 1 C:Program Filesdotnetstorex64netcoreapp2.0;C:Usersrkolak
 326 1 D:PROJECTSdotnetbinDebugnetcoreapp2.0dotnet.deps.json;C:
 988 5 C:Program FilesdotnetsharedMicrosoft.NETCore.App2.0.5Micr
 29538 146 C:Program FilesdotnetsharedMicrosoft.NETCore.App2.0.5Syst
 54894 2 D:PROJECTSdotnetbinDebugnetcoreapp2.0dotnet.dll;C:Progra

You can get all the strings from the process. For example, if your password is stored as plain text in a config file and you load it into memory, you will be able to see it here.
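A minimal sketch of that situation (hypothetical connection string, not from this walkthrough):

using System;

class ConfigDemo
{
    static void Main()
    {
        // Any secret read into a System.String stays on the managed heap until
        // it is collected, so "!dumpheap -strings" can reveal it in a dump.
        string connectionString =
            "Server=myserver;Database=mydb;User Id=app;Password=P@ssw0rd!";

        Console.WriteLine(connectionString.Length);
        Console.ReadLine();   // keep the process alive so a dump can be taken
    }
}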

  • To check the objects in the LOH (Large Object Heap)

Any object larger than 85,000 bytes is stored in the Large Object Heap (LOH), and having a lot of objects in the LOH can cause memory issues in your application.
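A minimal sketch (assumed sizes, not from this walkthrough) of an allocation that would show up under !dumpheap -min 85000:

using System;

class LohDemo
{
    static void Main()
    {
        // Arrays of 85,000 bytes or more are allocated directly on the LOH,
        // so this buffer would be reported by "!dumpheap -min 85000".
        byte[] largeBuffer = new byte[100000];

        // Small allocations stay on the small object heap.
        byte[] smallBuffer = new byte[1000];

        Console.WriteLine(largeBuffer.Length + smallBuffer.Length);
        Console.ReadLine();   // keep the process alive so a dump can be taken
    }
}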

0:002> !dumpheap -min 85000
Address MT Size

Statistics:
MT Count TotalSize Class Name
Total 0 objects

We did not get any large objects.

  • To find how many objects are locked using lock statements (a sketch of code that produces such a lock follows the output below)
0:002> !dumpheap -thinlock
 Address MT Size
 0000000002355260 000007fe7157cdb8 24 ThinLock owner 1 (00000000005cb900) Recursive 0
 Found 1 objects.
0:002> !do 0000000002355260
 Name: System.IO.SyncTextReader
 MethodTable: 000007fe7157cdb8
 EEClass: 000007fe716cbc98
 Size: 24(0x18) bytes
 File: C:Program FilesdotnetsharedMicrosoft.NETCore.App2.0.5System.Console.dll
 Fields:
 MT Field Offset Type VT Attr Value Name
 000007fe7157c520 40001a5 d0 System.IO.TextReader 0 static 0000000000000000 Null
 000007fe7157c520 400011c 8 System.IO.TextReader 0 instance 00000000023548a0 _in
 ThinLock owner 1 (00000000005cb900), Recursive 0
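For reference, a minimal sketch (not from this walkthrough) of the kind of code that produces a thin lock; while a thread holds the lock, the object shows up in !dumpheap -thinlock together with its owner:

using System;
using System.Threading;

class ThinLockDemo
{
    private static readonly object _gate = new object();

    static void Main()
    {
        lock (_gate)   // the CLR uses a thin lock for this uncontended Monitor
        {
            // While this block runs, "!dumpheap -thinlock" reports _gate
            // together with the id of the owning thread.
            Thread.Sleep(60000);
        }
        Console.WriteLine("done");
    }
}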
  • To dump all the objects of a particular type
0:002> !dumpheap -type Console
 Address MT Size
 00000000023530e0 000007fe715790d8 24
 0000000002353178 000007fe71579e10 56
 0000000002353330 000007fe7157b4b8 56
 0000000002354700 000007fe71579e10 56
 0000000002354810 000007fe7157b4b8 56

Statistics:
 MT Count TotalSize Class Name
 000007fe715790d8 1 24 System.Console+<>c
 000007fe7157b4b8 2 112 System.Text.ConsoleEncoding
 000007fe71579e10 2 112 System.ConsolePal+WindowsConsoleStream
 Total 5 objects

Dumping an object from the process and looking at all its properties

To get the details of any object, we use the !DumpObject command, passing it the object address. Normally we use DumpObject together with another command (dumpheap): we first use dumpheap to find the address of an object of a particular type and then use DumpObject (!do is the alias) to drill down further.

0:002> !dumpheap -type Console
 Address MT Size
 00000000023530e0 000007fe715790d8 24
 0000000002353178 000007fe71579e10 56
 0000000002353330 000007fe7157b4b8 56
 0000000002354700 000007fe71579e10 56
 0000000002354810 000007fe7157b4b8 56

Statistics:
 MT Count TotalSize Class Name
 000007fe715790d8 1 24 System.Console+<>c
 000007fe7157b4b8 2 112 System.Text.ConsoleEncoding
 000007fe71579e10 2 112 System.ConsolePal+WindowsConsoleStream
 Total 5 objects
 0:002> !do 00000000023530e0
 Name: System.Console+<>c
 MethodTable: 000007fe715790d8
 EEClass: 000007fe716cadc0
 Size: 24(0x18) bytes
 File: C:Program FilesdotnetsharedMicrosoft.NETCore.App2.0.5System.Console.dll
 Fields:
 MT Field Offset Type VT Attr Value Name
 000007fe715790d8 4000044 60 System.Console+<>c 0 static 00000000023530e0 <>9
 000007fe7157c6e0 4000045 68 ...time.Extensions]] 0 static 00000000023546c0 <>9__13_0
 000007fe71567c28 4000046 70 ...Private.CoreLib]] 0 static 0000000002354738 <>9__15_0
 000007fe71567c28 4000047 78 ...Private.CoreLib]] 0 static 00000000023531b0 <>9__18_0
 000007fe71579188 4000048 80 ...time.Extensions]] 0 static 00000000023530f8 <>9__25_0
 000007fe71579188 4000049 88 ...time.Extensions]] 0 static 0000000000000000 <>9__27_0
 0000000000000000 400004a 90 0 static 0000000000000000 <>9__33_0
 0000000000000000 400004b 98 0 static 0000000000000000 <>9__35_0
 0000000000000000 400004c a0 0 static 0000000000000000 <>9__37_0
 000007fec5c701d8 400004d a8 ...Private.CoreLib]] 0 static 0000000000000000 <>9__151_0

 

In our next post we will explore looking inside an ASP.NET Core process. We will also explore in more detail the commands that can help you find a memory leak inside an ASP.NET Core process.

Deployment Groups is now generally available: sharing of targets and more…


We are excited to announce that Deployment Groups is out of preview and is now generally available. Deployment Groups is a robust out-of-the-box multi-machine deployment feature of Release Management in VSTS/TFS. 

What are Deployment Groups?

With Deployment Groups, you can orchestrate deployments across multiple servers and perform rolling updates, while ensuring high availability of your application throughout. You can also deploy to servers on-premises or virtual machines on Azure or any cloud, plus have end-to-end traceability of deployed artifact versions down to the server level.

Agent-based deployment relies on the same agents your builds and releases use, which means you can use the full task catalog on your target machines. From an extensibility perspective, you can also use the REST APIs for deployment groups and targets for programmatic access.

Customer adoption:

Since we announced public preview back in May 2017, we have seen many customers find innovative uses for deployment groups and we’ve been blown away by all the great feedback from customers and the community.

For example, we have customers,

  • Deploying to multiple geographic regions or multiple data-centers.
  • Delivering application updates to multiple end-customers (multi-tenanted deployments).
  • Managing transient workloads in the cloud by dynamically spinning up deployment targets and tearing them down, or by using auto-scaled targets (for example, Azure VMSS).
  • Performing blue/green deployments with a traffic manager or load balancer.
  • Scaling deployments from a single server to 600+ servers in a single deployment group.

What's new?

Shared deployment targets:

If you are using the same server to host multiple applications, managed by multiple teams, you can now share the deployment targets across team projects using deployment pools. Check out the blog for more details.

New templates:

Deploying to multiple targets is now a breeze with the new release definition templates. You will find templates for IIS web site and IIS web site with database deploy, plus multiple deployment templates for SQL DB offline, partially online, and fully online database upgrades.

Provisioning VMs

Use the enhanced Azure Resource Group task to dynamically bootstrap agents on the newly provisioned or pre-existing Virtual Machines on Azure.

You can also bootstrap Azure Virtual Machines with deployment agent from the portal or using Azure Resource Manager extension. Check out the new portal experience that is rolling out soon.

Refreshed experience:

When we launched deployment groups last May, we shipped a simple UX that worked well for the few scenarios we supported. However, as we expanded the service with these new investments, we enhanced the UX so that it more closely matches the rest of Team Services.

Get started:

Report any problems on Developer Community, make suggestions on UserVoice, get advice on Stack Overflow, and get support via our Support page. For product news, follow @VSTS.

eSIM still shows a warning on Surface Pro with LTE Advanced after applying the driver pack


Hello, this is Oki from the Surface commercial support team.
We publish the latest firmware and driver packs for Surface devices on the following site.

Download the latest firmware and drivers for Surface devices
https://docs.microsoft.com/en-us/surface/deploy-the-latest-firmware-and-drivers-for-surface-devices

Even when the currently latest driver pack is applied to the Surface Pro with LTE Advanced model, the eSIM is shown with a warning in Device Manager, so this post introduces how to address that.

Surface Pro LTE Drivers and Firmware
https://www.microsoft.com/en-us/download/details.aspx?id=56278

 

[Symptom]
Even when the currently latest driver pack is applied to a Surface Pro with LTE Advanced, the eSIM is shown with a warning in Device Manager.
As described in the public article below, this occurs when the runtime used by the Surface services (the Visual C++ redistributable) is not installed.

Issues after deploying a custom image or new installation of Windows to Surface when Visual C++ redistributable is not included
https://support.microsoft.com/en-us/help/4090656/issue-deploy-custom-image-or-new-windows-to-surface-without-vc-redist

The OEM image already includes the runtime required for the Surface services to function, but the Volume License installation media does not, so the issue occurs when the Volume License edition of Windows has been installed.

// Device Manager

 

// Properties of Gemalto eSIM Firmware Update

 

[Resolution]
Download and install the Visual C++ Redistributable for Visual Studio 2015 from the site below; once it is installed, the warning is no longer displayed.

Visual C++ Redistributable for Visual Studio 2015
https://www.microsoft.com/ja-jp/download/details.aspx?id=48145

// Device Manager

// Properties of Gemalto eSIM Firmware Update


About the Long Term Support (LTS) branch of the Azure IoT SDKs


Hello, this is S.M. from the Azure IoT development support team.

 

The Azure IoT SDKs are updated roughly once or twice a month, but a Long Term Support (LTS) branch that receives long-term servicing has been released. For details, please see the announcement of February 12, 2018. The original article and its Japanese translation are available at the sites below.

 

Azure IoT SDKs released new Long-Term Support branch

https://azure.microsoft.com/en-us/blog/iot-sdk-lts-branch/

 

Azure IoT SDKs released a new Long Term Support branch (Japanese translation)

https://blogs.technet.microsoft.com/jpitpro/2018/02/15/iot-sdk-lts-branch/ 

 

LTS versions are released every six months and are available from package managers such as NuGet, PyPI, apt-get, Maven, and NPM.

 

If frequent SDK version updates are a concern for you, please consider using the LTS releases.

SharePoint 2016: The CU upgrade fails during PSConfig execution


Symptom:

The SharePoint Products Configuration Wizard fails with the following message after the installation of any CU, language pack, security patch, etc.:

Failed to upgrade SharePoint Products.

This is a critical task. You have to fix the failures before you can continue. Follow this link for more information about how to troubleshoot upgrade failures: http://go.microsoft.com/fwlink/?LinkId=259653

An exception of type Microsoft.SharePoint.PostSetupConfiguration.PostSetupConfigurationTaskException was thrown. Additional exception information:

Feature upgrade action 'CustomUpgradeAction.AddSwitchField' threw an exception upgrading Feature 'CustomTiles' (Id: 15/'68642d38-a556-4384-888c-082844fbf224') in WebApplication 'SharePoint - 80: List |0

Feature upgrade incomplete for Feature 'CustomTiles' (Id: 15/'68642d38-a556-4384-888c-082844fbf224') in WebApplication 'SharePoint - 80. Exception: List |0

Upgrade completed with errors. Review the upgrade log file located in C:\Program Files\Common Files\Microsoft Shared\Web Server Extensions\16\LOGS\Upgrade-20180220-162221-353-19435b7e3e40479183e3ca8d9f3155fa.log. The number of errors and warnings is listed at the end of the upgrade log file.

To diagnose the problem, review the application event log and the configuration log file located at: C:\Program Files\Common Files\Microsoft Shared\Web Server Extensions\16\LOGS\PSCDiagnostics_2_20_2018_16_22_10_998_970437191.log

It has been observed that the issue occurs only when we try to upgrade the farm with a patch and the web application was created while the farm was at build level 16.0.4456.1000.

Resolution: 

The corrective action is to enable the Custom Tiles feature, which forces the creation of the Custom Tiles list and lets PSConfig move past this roadblock:

Enable-SPFeature -Url http://webappurl -Identity "68642d38-a556-4384-888c-082844fbf224"

 

POST BY : Shiva Prasad [MSFT]

Debugging an embedded ARM device in Visual Studio


 

[Original post] Debugging an embedded ARM device in Visual Studio

[Original author] Marc Goodner - MSFT

[Originally published] 2018/1/10

We introduced support for ARM GCC cross-compilation in Visual Studio 2017 version 15.5. In 15.6 Preview 2 we added support for debugging. This overview of the debugging capability picks up from the setup described in our getting-started post on ARM cross-compilation and builds on it.

First, it is important to make sure your output has debug symbols. GCC projects exported from the ARM online compiler do not. To add them, edit the makefile under the tools and flags section and add the -g flag to the GCC and G++ commands, as shown below.

          CC      = 'arm-none-eabi-gcc' '-g' ...

          CPP     = 'arm-none-eabi-g++' '-g' ...

Now, after building your binary and flashing the device, right-click the binary output and choose Debug and Launch Settings.

 

In the dialog that appears, select C/C++ Debug microcontroller (gdbserver).

 

This creates a launch.vs.json with many options relevant to embedded debugging. There are many ways to debug these devices, so what you fill in here will be specific to your board, your hardware debugger, and the software that provides its gdbserver interface. We provide as many defaults and hints as we can to help you. In this preview, some of the emitted environment variables do not work yet, and you will need to replace them with the required values.

  • ${workspaceRootFolderName}, your folder name
  • ${env.gccpath}, your Visual Studio installation path followed by Linux\gcc_arm\bin
  • ${debugInfo.linuxNatvisPath}, the path to a Natvis file if you have one. This one is fine to delete, since it only applies to specific situations.

I will walk through configuring an ST Nucleo-F411RE using OpenOCD. The process is similar for most boards.

First, change the program name in the output to point to your .elf file.

         "program": "${workspaceRoot}\BUILD\Nucleo_blink_led.elf",

Change miDebuggerPath to point to the full path of arm-none-eabi-gdb.exe.

         "miDebuggerPath": "C:\Program Files (x86)\Microsoft Visual                 Studio\Internal\Enterprise\Linux\gcc_arm\bin\arm-none-eabi-gdb.exe",

In "setupCommands", you can remove the documentation link section (or leave it unchanged). Change the symbol load command to point to your .elf file.

         "text": "-file-exec-and-symbols Nucleo_blink_led.elf",

You can run additional commands to get your board into a debuggable state; typically you need to halt and then reset the board when the session starts. To do so, add those commands as part of the setup commands array, as shown below.

         {

           "text": "monitor reset halt",

           "ignoreFailures": true

         },

        {

          "text": "monitor reset init",

          "ignoreFailures": true

        }

Make sure miDebuggerServerAddress matches the address your hardware debugger will provide. This is the default for OpenOCD.

        "miDebuggerServerAddress": "localhost:3333",

If you want to launch your hardware interface manually, you can omit these lines. If you want Visual Studio to launch the software that talks to your hardware debugger, here is an example that launches OpenOCD with a config file. You should check this command manually to make sure the right config file is used, and that the text used to verify the server has started matches your output.

         "debugServerPath": "D:\openocd-0.10.0\bin-x64\openocd.exe",

         "debugServerArgs": "-f d:/openocd-0.10.0/scripts/board/st_nucleo_f4.cfg",

         "serverStarted": "Info : stm32f4x.cpu: hardware has 6 breakpoints, 4 watchpoints",

Now, with our configuration in place, press F5 in Visual Studio to start debugging the embedded ARM application on the device.

 

What's next? Download the Visual Studio 2017 Preview, install the Linux C++ workload, select the Embedded and IoT development option, and try it with your projects.

We are actively working on additional support for embedded scenarios. Your feedback is very important to us. We look forward to hearing from you and seeing what you build.

The best way to reach us is through our GitHub-hosted issues list, directly via email at vcpplinux-support@microsoft.com, or by finding me on Twitter @robotdad.

Upcoming Events and News from @MicrosoftEduUK



Microsoft UK are dedicated to ensuring educators are up to date with the latest and greatest in Microsoft technologies and solutions for Education. Here are some of the latest news announcements and upcoming events happening this month!


'What's New in Edu' Episode 15


Immersive Reader Updates

If you are already a user of the Immersive Reader then you will absolutely love the release of the latest and greatest updates last week. If you're not then you should definitely check them out! Either way read the blog post below for all the details of the new updates coming soon.

Picture Dictionary, Custom Parts of Speech Colors and Roaming Settings come to Immersive Reader


Learn Teams Conference

This will be an online-only event all about Microsoft Teams, running April 3rd-7th, 2018, to get you and your team collaborating, impacting and doing more. Learn from 25+ Microsoft MVPs, MIEEs and experts in 5+ days of online videos, chat and community. Register here: www.learnteamsconference.com.

Transformational Technologies for Learning and Teaching with Microsoft Surface and Office 365

             

Wednesday 28th March, 10.00 AM - 4.15 PM

University of Central Lancashire,
Preston Campus, Harrington Building
Fylde Road
Preston, PR1 2HE

Microsoft Education UK and the University of Central Lancashire (UCLan) are thrilled to invite you to join us for our Transformational Technologies for Learning and Teaching event. This will be a premier event for educators and IT personnel within Higher Education, Further Education and Schools that aims to inspire, engage and connect learning practitioners and technologists alike with the latest in transformative ideas, opportunities and solutions to support your communities of learning.
Microsoft Surface, together with Windows 10 and Office 365, continues to be a critical enabler for the University of Central Lancashire’s strategic learning and teaching ambitions. UCLan and Microsoft are celebrating our recent initiatives in learning environment enhancement and academic development, which have culminated in the roll-out of Surface devices to all 1500 members of the University’s academic teaching community. We want to share our experiences of this exciting journey, share best practice, and explore ideas for future enhancements in technology enabled learning and teaching.
This is a day you don't want to miss! It will consist of inspirational keynotes, practical seminars, and further opportunities to see how the latest Microsoft technologies can support and embrace digital transformation in education.

Space is limited so please register now.


Microsoft UK Roadshow is back! 

More dates have now been released for our 2018 #MicrosoftEdu Roadshows in April.

17/04/2018 Surrey: St Hilary's School, Holloway Hill, Godalming. GU7 1RZ SIGN UP HERE

18/04/2018 Fort William: Lochaber High School, Camaghael SIGN UP HERE

19/04/2018 Hertfordshire: Jupiter Community Free School, Jupiter Drive, Hemel Hempstead, HP2 5NT SIGN UP HERE

19/04/2018 Inverness: STEM HUB, University of the Highlands and Islands, An Lochran, 10 Inverness Campus IV2 5NA SIGN UP HERE

25/04/2018 Oxfordshire: Manor School, 28 Lydalls Cl, Didcot SIGN UP HERE

27/04/2018 Weston: Weston College, Winter Gardens (Italian Gardens Entrance), Royal Parade, Weston-Super-Mare  SIGN UP HERE

*Please note: dates are regularly added so check back for updates.

Look out for more updates about the Roadshow event series on Twitter by following the hashtag #MicrosoftEdu or @microsofteduk. Also visit the Microsoft Educator Community UK Roadshow page to find out about the events near you and sign up.

Our aim is to reach every corner of the UK, so if you are able to host a Roadshow in your locality then please contact us on the e-mail: Eduroadshow@microsoft.com.


Twitter feed Updates for Microsoft Edu UK
Check out what is happening in Microsoft in Education here in the UK by viewing the Microsoft Education Twitter updates below.


So that wraps up this week's What's New in Edu UK and Upcoming Events. Remember to follow @Microsofteduk for all our latest updates daily!

Guest Post Chris Macaulay, Advvy: How Microsoft Inspire Changed Everything!



Chris Macaulay
CEO
Advvy

 

The back story

Advvy was a company created with a strong focus on solving some of the biggest challenges facing the advertising industry. My Co-Founder Tristan Ozinga and I started the company to lead the digital transformation of workflow within the media and advertising industry.

Our platform is designed for media agencies to take that vast deluge of information around how they work everyday, and connect it all together in a streamlined work flow. Tristan and I had both been working in an environment where media agencies had sophisticated systems but everything was operating in silos.

There were tools for the planning team, tools for the strategy team and tools for the buying team, but none of these systems spoke to each other. People working in the various departments communicated internally by emailing documents to each other and uploading their work to various file servers; there was no visibility across 70% of the media agency. So we built a cloud-based platform on top of Dynamics 365 to streamline operations and give management full visibility into what is actually happening in the business, while driving efficiencies.

We had built our platform on Microsoft technology but our relationship with them was mainly conducted through the Venture Capital firm that backs us and so we made the decision to attend Inspire primarily to develop that direct relationship with Microsoft.

Arriving at Microsoft Inspire

When we got to Microsoft Inspire in 2017 we didn’t know many people but our ISV partner manager had lined up various different meetings with most of the Australian Microsoft executives relevant to us, as well as some of the global executives.

There was a big change happening within Microsoft at the time, during which they shifted their whole business model to make working with partners so much easier and we were awestruck from the moment we arrived. It was a huge eye opener for us, we went to a lot of the sessions and keynotes and during that time we really got an understanding of what was possible for our business.

Microsoft Inspire opened up doors for us. In fact it not only opened doors, but it directed us to the right path and we were invited to join a new partnership initiative driven by the One Commercial Partner model.

A massive return on investment

It was at Microsoft Inspire that Microsoft started to talk about investing to fast-track our product. The conversation developed and before long we were talking about gaining access to co-marketing support and looking at how we could potentially get in front of Microsoft’s sales people. Since that time we have signed a global partnership with Microsoft and a whole new world of opportunities have presented themselves to us.

Microsoft is funding two programs for Advvy, one around Artificial Intelligence and building image recognition software so we can optimise how agencies engage in certain processes; and the other around scalability where we are trying to give the agency the power to control and configure the system themselves – reducing the cost of deployment. From a co-marketing perspective Microsoft have agreed to fund 50% of all of our marketing campaigns going forward.

On the back of signing the partnership agreement with Microsoft we’ve had investors reach out to us enabling us to bring on capital for the business to help us expand and grow the team. The Microsoft partnership has been a game changer for us in every way possible, and I don’t know that any of this would have happened had we not been at Inspire.

The rapid change benefitting Microsoft partners.

On a personal level I was blown away when I got to Microsoft Inspire and discovered that a company as large as Microsoft had made the decision to change their entire business model, which impacted several thousand employees, to more effectively serve their partners. Even with all my experience with tech start-ups, and I have been heavily involved with them for the last six years, I have never seen a company undergo so much change so fast. I am truly inspired by what is possible, and watching this unfold and witnessing the opportunities open up to all Microsoft partners has been remarkable.

It has been amazing to see the rise of Microsoft over the past five years and how they have gone from the uncool kid to become leaders in so many areas. Not just leaders in technology but leaders with heart and humanity who are looking to change the way we do business and to make the world a better place.

I can’t wait to attend Microsoft Inspire 2018 and see what Microsoft have in store for the next year.


Don't miss out: register for Microsoft Inspire today!

ATTENTION GOLD CLOUD COMPETENCY PARTNERS

By attaining Gold status in a cloud competency, you've shown your commitment to providing your customers with the best solutions built on Microsoft cloud technology. We'd like to thank you for your achievements by extending to you the discounted price of USD 1,995 for a Microsoft Inspire All Access pass when you register before March 31, 2018. This is a savings of USD 300 off the current price, only for partners with a gold cloud competency like you. If you have not received your Gold Cloud Competency discount code via email, drop me an email at sarahar@microsoft.com.

Lesson Learned #38: Which is the impact using connection pooling in my application


In this video, available in Spanish and English, we show an example of the impact of using (or not using) connection pooling in a C# application.

As you probably know, connection pooling is a special connection cache that is enabled by default in ADO.NET, with a default maximum capacity of 100 concurrent connections.

Using connection pooling gives us an improvement in the time spent on every connection attempt made to our Azure SQL Database.
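As a rough illustration of what the video demonstrates, here is a minimal sketch (with placeholder server, database and credentials; not the exact code from the video) that times repeated connection opens with pooling enabled and then disabled:

using System;
using System.Data.SqlClient;
using System.Diagnostics;

class PoolingDemo
{
    static void Main()
    {
        // Placeholder connection string: replace the server, database and credentials.
        const string baseConnString =
            "Server=tcp:yourserver.database.windows.net,1433;Database=yourdb;" +
            "User ID=youruser;Password=yourpassword;Encrypt=True;";

        Console.WriteLine("Pooling=true : " + TimeOpens(baseConnString + "Pooling=true;") + " ms");
        Console.WriteLine("Pooling=false: " + TimeOpens(baseConnString + "Pooling=false;") + " ms");
    }

    static long TimeOpens(string connectionString)
    {
        var watch = Stopwatch.StartNew();
        for (int i = 0; i < 50; i++)
        {
            using (var connection = new SqlConnection(connectionString))
            {
                connection.Open();   // with pooling, only the first open pays the full handshake cost
            }
        }
        watch.Stop();
        return watch.ElapsedMilliseconds;
    }
}

With pooling enabled, closing the connection simply returns it to the pool, so subsequent opens are nearly free; with Pooling=false, every iteration pays for a brand new connection to the server.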

Enjoy!

Office 365 Developer: Office Web Add-in dialog box cannot be displayed and throws error


I came across this issue while working with developer customers who were building an Office web add-in (Outlook/OWA specific).

- They designed an Outlook web add-in in which the user is asked to allow a dialog box to be displayed.
- When the user chooses Allow, the add-in throws an error message (in both IE and Edge).
- The error message states, "The security settings in your browser prevent us from creating a dialog box. Try a different browser, or configure your browser so that [URL] and the domain shown in your address bar are in the same security zone."

To overcome the issue, we tried adding the domain of the add-in to the list of trusted sites in Internet Explorer, and it worked. Just make sure it's a trusted add-in!

Hope this helps!!


Representation of Math Accents


The post Math Accents discusses how accent usage in math zones differs from that in ordinary text, notably in the occurrence of multicharacter bases. Even with single character bases, the accents may vary in width while in ordinary text the accent widths are the same for all letters. The present post continues the discussion by describing the large number of accents available for math in Unicode and in Microsoft Office math zones and how they are represented in MathML, RTF, OMML, LaTeX, and UnicodeMath.

Unicode math accents

As noted in Section 3.10 Accent Operators of the UnicodeMath specification, the most common math accents are (along with their TeX names)
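A representative sample, applied to the letter a (an illustrative list; the post's original table may include more):

% Common math accents and their TeX control words, applied to the letter a
$\hat{a}$     % circumflex / hat
$\check{a}$   % caron / check
$\tilde{a}$   % tilde
$\acute{a}$   % acute
$\grave{a}$   % grave
$\dot{a}$     % dot above
$\ddot{a}$    % double dot above
$\breve{a}$   % breve
$\bar{a}$     % bar / macron
$\vec{a}$     % right arrow above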

These and more accents are described in Section 2.6 Accented Characters and 3.2.7 Combining Marks in Unicode Technical Report #25, Unicode Support For Mathematics. More generally, the Unicode ranges U+0300..U+036F and U+20D0..U+20EF have these and other accents that can be used for math.

The Windows Character Map program shows that the Cambria Math font has all combining marks in the range 0300..036F as well as 20D0..20DF, 20E1, 20E5, 20E6, 20E8..20EA. The range 0300..036F used as math accents in Word looks like

Except for the horizontal overstrikes and the double-character accents shown in red, all these work as math accents in Microsoft Office apps, although many aren’t used in math. In keeping with the Unicode Standard, UnicodeMath represents an accent by its Unicode character, placing the accent immediately after the base character. There’s no need for double-character accents in Microsoft Office math since the corresponding “single” character accents expand to fit their bases as in

In UnicodeMath, this is given by (a+b)~, where the ~ can be entered using the TeX control word \tilde. This is simpler than TeX, which uses \widetilde{a+b} for automatically sized tildes rather than \tilde{a+b}.

Several combining marks in the range 20D0..20EF also work as accent objects in Office math zones. You can test accents that don't have TeX control words by inserting a math zone (type Alt+=), then typing a non-hex letter followed by the Unicode value, Alt+x, and a space. For example, Alt+=, z, 36F, Alt+x, space gives

Accents in MathML

MathML 1 was released as a W3C recommendation in April 1998 as the first XML language to be recommended by the W3C. At that time, Unicode was just starting to take hold as Microsoft Word 97 and Excel 97 had switched to Unicode. [La]TeX was developed before Unicode 1.0, so it relied on control words. Accordingly, it was common practice in 1998 to use control words or common spacing accents to represent accents instead of the Unicode combining marks even though many accents didn’t have a unified standardized representation. Unicode standardized virtually all math accents by using combining marks. One problem with using the combining marks in file formats is that they, well, combine! So, it may be difficult to see them as separate entities unless you insert a no-break space (U+00A0) or space (U+0020) in front of them. UnicodeMath allows a no-break space to appear between the base and accent since UnicodeMath is used as an input format as well as in files. Only programmers need to look at most file formats (HTML, MathML, OMML, RTF), so a reliable standard is more important for file formats than user-friendly presentation.

MathML 3’s operator dictionary defines most horizontal arrows with the “accent” property. In addition, it defines the following accents

02C6      ˆ              modifier letter circumflex accent

02C7      ˇ              caron

02C9      ˉ              modifier letter macron

02CA     ˊ              modifier letter acute accent

02CB     ˋ              modifier letter grave accent

02CD     ˍ              modifier letter low macron

02D8     ˘              breve

02D9     ˙              dot above

02DA     ˚              ring above

02DC     ˜             small tilde

02DD     ˝             double acute accent

02F7      ˷             modifier letter low tilde

0302        ̂             combining circumflex accent

0311        ̑             combining inverted breve

Presumably the operator dictionary should be extended to include more math combining marks and their equivalents, if they exist, with the spacing diacritics in the range U+02C6..U+02DD.

Here’s the MathML for the math object 𝑎̂.

<mml:mover accent="true">
  <mml:mi>a</mml:mi>
  <mml:mo>^</mml:mo>
</mml:mover>

 

Accents in OMML

"Office MathML" (OMML) is the XML used in Microsoft Office file formats to represent most math. It is an XML version of the in-memory math object model, which differs from MathML. The math accent object 𝑎̂ has the following OMML:

<m:acc>
  <m:accPr>
    <m:chr m:val=" ̂"/>
    <m:ctrlPr/>
  </m:accPr>
  <m:e>
    <m:r>
      <m:t>𝑎</m:t>
    </m:r>
  </m:e>
</m:acc>

The Rich Text Format (RTF) represents math zones essentially as OMML written in RTF syntax. Regular RTF uses the \uN notation for Unicode characters not in the current code page. The math accent object 𝑎̂ has the RTF

{\macc{\maccPr{\mctrlPr\i\f0\fs20 }{\mchr \u770? }}{\me\i\u-10187?\u-9138?}}

Unicode RTF is easier to read since characters are written in Unicode

{\macc{\maccPr{\mctrlPr\i\f0\fs20 }{\mchr  ̂}}{\me\i 𝑎}}

But none of these is as simple as the UnicodeMath 𝑎 ̂ ☺.

 

 

 

AI Infused Apps- Access to Source Code & Demos

[Azure HPC] Intro to HPC and steps to setup CycleCloud in Azure


As part of the Microsoft internal MOOC course "Big Compute: Uncovering and Landing Hyperscale Solutions in Azure", I was introduced to CycleCloud and learned how to set up CycleCloud in my Azure subscription. I would like to blog about some of my HPC learning plus the steps I followed to set one up.

What is HPC? High Performance Computing (HPC) is a parallel processing technique for solving complex computational problems. HPC applications can scale to thousands of compute cores. We can run these workloads on premises by setting up clusters, burst the excess volume to the cloud, or run them as a 100% cloud-native solution.

image

Where is Big Compute used, and what are the use cases? Usually, compute-intensive operations are best suited for this kind of workload.

image

How can HPC be achieved in Microsoft Azure?

1) Azure Batch –> managed service, "cluster" as a service for running jobs; developers can write applications that submit jobs using the SDK; cloud native, HPC as a service, pay-as-you-go billing

2) CycleCloud –> acquired by Microsoft, "cluster" management (orchestration) software; supports hybrid clusters and multi-cloud; manages and runs clusters; one-time license; you have complete control of the cluster and nodes

3) Cray –> partnership with Cray, famous for weather forecasting services

4) HPC Pack on Azure infrastructure –> Marketplace offerings {HPC applications, HPC VM images, HPC storage}

Azure Batch needs no introduction as it has been around for quite some time, and setting it up is very easy. Tools like BatchLabs help us monitor and control Batch jobs effortlessly. The Batch SDK makes it easy to integrate with an existing legacy application so that it can submit jobs or manage the entire Batch operation from custom-developed code; end users do not need to log in to the Azure portal to submit jobs.
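A minimal sketch of submitting work through the Batch .NET SDK (assuming an existing pool named "demo-pool" and placeholder account credentials; not code from the course):

using Microsoft.Azure.Batch;
using Microsoft.Azure.Batch.Auth;

class BatchJobDemo
{
    static void Main()
    {
        // Placeholder values: use your own Batch account URL, name, and key.
        var credentials = new BatchSharedKeyCredentials(
            "https://yourbatchaccount.yourregion.batch.azure.com",
            "yourbatchaccount",
            "yourAccountKey");

        using (BatchClient client = BatchClient.Open(credentials))
        {
            // Create a job that runs on an existing pool...
            CloudJob job = client.JobOperations.CreateJob(
                "demo-job", new PoolInformation { PoolId = "demo-pool" });
            job.Commit();

            // ...and add a task; the Batch service schedules it on a compute node.
            var task = new CloudTask("task1", "cmd /c echo Hello from Azure Batch");
            client.JobOperations.AddTask("demo-job", task);
        }
    }
}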

What is CycleCloud? CycleCloud provides a simple, secure, and scalable way to manage compute and storage resources for HPC and Big Compute/Data workloads in the cloud. CycleCloud enables users to create environments in Azure. It supports distributed jobs as well as tightly-coupled parallel workloads such as MPI jobs on InfiniBand/RDMA. By managing resource provisioning, configuration, and monitoring, CycleCloud allows users and IT staff to focus on business needs instead of infrastructure.

image

image

How to set it up in Azure? The steps are already documented here; I am laying out the same steps with screenshots for easy reference.

1) Download the JSON files to your local drive, say c:\temp

2) Generate the service principal

3) Generate an SSH public/private key pair

4) Clone the repo to your local drive, say c:\temp

git clone https://github.com/azurebigcompute/Labs.git 

5) Edit the vms-params.json file to specify the rsaPublicKey parameter generated in Step 3. The cycleDownloadUri and cycleLicenseSas parameters have been pre-configured, but if you procure a license then you need to update these two parameters as well. For now, I am leaving them as they are.

image

6) Now log in with the Azure CLI, create the resource group and storage account, create the VNET deployment, and finally create the VMs.

C:\temp>az login

C:\temp>az group create --name "cycle-rg" --location "southeastasia"

C:\temp>az storage account create --name "mikkyccstorage" --resource-group "cycle-rg" --location "southeastasia" --sku "Standard_LRS"

C:\temp>az group deployment create --name "vnet_deployment" --resource-group "cycle-rg" --template-uri https://raw.githubusercontent.com/azurebigcompute/Labs/master/CycleCloud/deploy-vnet.json --parameters vnet-params.json

C:\temp>az group deployment create --name "vms_deployment" --resource-group "cycle-rg" --template-uri https://raw.githubusercontent.com/azurebigcompute/Labs/master/CycleCloud/deploy-vms.json --parameters vms-params.json

cyc3


7) After the deployment, you will find the above set of resources created in our resource group, "cycle-rg". Select the CycleServer VM and copy its IP address to check that you can browse to the CycleCloud setup page.

image

8) Please note that the installation uses a self-signed SSL certificate, which may show up with a warning in your browser. It is safe to ignore the warning and add an exception; after setting up the cluster you should get a page like the one below (refer to the "Configure CycleCloud Server" section of this page). If you get the page below after all the setup, then we are ready to create a new cluster and submit jobs.

image

9) Follow the section "Creating a Grid Engine Cluster" (5.1) as-is from here.

10) After the cluster is created, we need to start it and confirm that it is running, as below.

image

11) Now our Grid Engine cluster is ready for job submission. For security reasons, the CycleCloud VM (CycleServer) is behind a jump box/bastion host. To access CycleServer, we must first log onto the jump box and then ssh onto the CycleServer instance. To do this, we'll add a second host to jump through to the ssh command.

From the Azure portal, retrieve the jump box DNS name and construct the SSH command as in the screenshot. The idea is to "ssh -J" to our CycleServer through the CycleAdmin box; for security, one cannot ssh directly into CycleServer.

$ ssh -J cycleadmin@{JUMPBOX PUBLIC HOSTNAME} cycleadmin@cycleserver -i {SSH PRIVATE KEY}

cyc

12) Once we get into cycleadmin@cycleserver, first switch to the root user and run the CycleCloud initialize command. You need to enter the username and password for that machine.

image

13) Connect to the Grid Engine master as follows:

[root@cycleserver ~]$ cyclecloud connect master -c <clustername>

image

14) Now we are ready to submit our first job. qstat queries the status of Grid Engine jobs and queues, and qsub submits batch jobs.

image

15) On successful submission, we should see the job start executing on our nodes.

read16

The master takes the batch job, and it gets executed on 3 nodes spun up under the execute node template.

image

image

By the way, if we log in to the Azure portal and navigate to the resource group, we can see that a VM scale set (VMSS) was created for the execute worker nodes.

image

image

We could also enable the autoscaling feature from the CycleCloud cluster settings, so that the Azure VMs come and go once the jobs are completed. We submitted 100 tasks with our command, so it will request up to 100 cores. Based on the cluster core limit, it decides how far to scale; say, if we have set 100 cores as the cluster scale limit, then we would see many more VMs being created to complete the tasks in parallel.

[cyclecloud@ip-0A000404 ~]$ qsub -t 1:100 -V -b y -cwd hostname

Once the job is completed, we can terminate the cluster and also delete the resource group if you don't want to retain it, which is our last step. I know it's a bit of learning and a little confusing the first time, but once you are hands-on it is easy to set up whenever you need it and dispose of it after completing your jobs.

Happy learning !

Experiencing Data Access Issue in Azure Portal for Many Data Types – 03/31 – Resolved

Final Update: Saturday, 31 March 2018 05:48 UTC

We've confirmed that all systems are back to normal with no customer impact as of 03/31, 05:25 UTC. Our logs show the incident started on 03/31, 04:45 UTC and that during the 40 minutes that it took to resolve the issue, 5% of customers would have experienced data access issues in South Central.
  • Root Cause: The failure was due to issues with the back-end storage service in the South Central region.
  • Incident Timeline: 40 minutes - 03/31, 04:45 UTC through 03/31, 05:25 UTC

We understand that customers rely on Application Insights as a critical service and apologize for any impact this incident caused.

-Mohini


Initial Update: Saturday, 31 March 2018 05:12 UTC

We are aware of issues within Application Insights and are actively investigating. Some customers may experience Data Gaps. The following data types are affected: Availability,Customer Event,Dependency,Exception,Metric,Page Load,Page View,Performance Counter,Request,Trace.
  • Work Around: None
  • Next Update: Before 03/31 07:30 UTC

We are working hard to resolve this issue and apologize for any inconvenience.
-Mohini

Granular VSTS/TFS Deployment Privileges using Service Principals


Visual Studio Team Services (VSTS) and Team Foundation Server (TFS) integrate smoothly with Azure App Services for Continuous Deployment. It is even possible to set up Continuous Deployment to an isolated App Service Environment (ASE). The easiest way to get started with this is when you log into VSTS with the same credentials as you use for your Azure subscription. However, this generally exposes access to all the resources you have access to in Azure. Another approach is to establish a service connection (using an Azure Active Directory Service Principal) to Azure. The standard dialog for establishing this connection looks like this:

As you can see, this will establish a new service principal (an identity) with privileges on the entire subscription or a specific resource group. This may be too much access to grant, and moreover, the VSTS/TFS user may well not be the person responsible for access policy. In this blog, I will briefly describe how to create a service principal with no access and then grant it access granularly (e.g., on a resource-by-resource basis).

In the example below, I will be creating a service principal in Azure Government and connecting it to a VSTS project. In the Azure portal, do the following 5 steps:

 

1. Create a new App Registration in Azure Active Directory:

 

 

2. Make a note of the "Application ID" (a.k.a. Service Principal ID):

 

3. Click the "Keys" pane and create a new key by giving it a name and duration and clicking save.

The key will only be visible right after you click save, so make a note of it:

4. Make a note of your "Subscription ID" and "Subscription Name":

5. Finally make a note of your Azure Active Directory Tenant ID:

After this you should have 5 pieces of information:

Subscription ID: SUBSCRIPTION GUID
Subscription Name: NAME OF SUBSCRIPTION
Service Principal ID: SERVICE PRINCIPAL GUID
Service Principal Key: SOME-LONG-KEY
Tenant ID: TENANT GUID

You can think of these as a set of user credentials, which currently have no privileges in your subscription. But you can now selectively add this "user" to resources where you want to grant privileges. Let's suppose we want to use this service principal to publish into one specific Web App and nothing else. Find the Web App and bring up the IAM (Identity and Access Management) pane:

Hit "+ Add" and search for the App Registration you created. In this example we are granting it contributor rights to a single Web App:

 

Now let's move to VSTS and add a Service connection using the Service Principal. Instead of using the dialog above, hit the "use the full version of the endpoint dialog." link, which will take you to this dialog:

After you have filled in all the details, make sure to hit "Verify connection" and you should get a green check mark. If you do not get a green check mark, it could be because one of the fields has been filled in incorrectly, but it will also happen if you have created a service principal without assigning it privileges to any resources yet. In that case, please follow the steps above.

In the example above, we have named the connection "CICD-MAG" and this name will show up as a subscription when we set up deployments. For example, to deploy into a Web App:

 

And then choose the right "Subscription":

You should notice that you will only be able to pick a single "App service name", since we only granted privileges on a single Web App. You can of course use the Service Principal to grant access to any number of resources and even to entire resource groups. Unlike the simplified dialog discussed in the beginning, this gives you granular access.

And that's it, you now have the tools to delegate just enough control to specific credentials. Let me know if you have questions/comments/suggestions.
