
Kudu for Azure Websites updated with ‘Process Explorer’ tab

I’m sure everyone appreciates the pace at which the Azure Websites team is releasing cool features. Azure Websites was all over the announcements at the recent //build. The team has updated the Kudu console with a new tab named ‘Process Explorer’; you will see it in the list of options available for the site. To access the Kudu console, go to https://yourwebsite.scm.azurewebsites.net (note the https, and the .scm in the URL).

[screenshot]

If you have used the Kudu console before, you would have seen that REST APIs are available for a lot of things, including “Processes and mini-dumps”. Used in Google Chrome with a JSON viewer extension, those APIs made it easy to get mini dumps of the w3wp.exe process, or a gcdump of the process. This new “Process Explorer” tab gives you a cool UI for doing the same.

[screenshot]

It lists all the processes running under your site’s context – including the w3wp.exe that serves your main website (as well as the one for this Kudu site, or any other site extensions like the Monaco editor), any WebJobs your site might have, and even the cmd.exe/powershell.exe that gets launched when I open the debug console. You can easily see things like the memory usage of a process, how many threads it has, the handles it holds, and more.

[screenshot]

Getting memory dumps of the worker process is one of the main post-mortem debugging techniques we often use in Microsoft Support while helping customers with common issues like hangs, slow responses, and memory leaks. It’s good to see such an easy way to get dumps from the Kudu console.
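And if you prefer scripting over clicking through the UI, the same “Processes and mini-dumps” REST API can be driven from code. Here is a minimal C# sketch – the site name and credentials are placeholders, /api/processes is the endpoint the Kudu REST API list points to, and I believe a dump of a given process can then be fetched from /api/processes/{id}/dump:

using System;
using System.Net.Http;
using System.Net.Http.Headers;
using System.Text;

class KuduProcessApiSketch
{
    static void Main()
    {
        // Placeholders: your SCM site and your deployment credentials
        const string scmBase = "https://yourwebsite.scm.azurewebsites.net";
        var auth = Convert.ToBase64String(Encoding.ASCII.GetBytes("deployUser:password"));

        using (var client = new HttpClient())
        {
            client.DefaultRequestHeaders.Authorization = new AuthenticationHeaderValue("Basic", auth);

            // Returns the processes running under your site's context as JSON
            string processes = client.GetStringAsync(scmBase + "/api/processes").Result;
            Console.WriteLine(processes);
        }
    }
}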

Happy Debugging!

Site Extension Gallery in Windows Azure Websites

I’m no ScottGu, or Scott Hanselman, but I had my own privilege of introducing the Site Extensions gallery to the world during my talk ‘Deep dive into Windows Azure Websites’, along with Puneet, at India’s first ever Windows Azure Conference in Bangalore, March 20-21. David Ebbo was kind enough to let me do this; watch his blog/Twitter for more updates on this in the coming days. This gallery is part of the Kudu console of your Windows Azure Website. It’s like NuGet for Site Extensions, and it should provide an amazing opportunity for people with diagnostics products or helper consoles for websites to get them used with Azure Websites. Needless to say, you will hear more about this soon, but for now, here is how it appears.

[screenshot]

There are only a few available right now, but I’m sure this list is going to grow. By the way, this feature is live! So, log in to the SCM site of your Windows Azure Website, and you will see a new option in the top bar, ‘Site Extensions’. You can install a Site Extension listed there by just clicking “Install”. Once you have installed it, you have to restart the site to make it functional – there is a “Restart Site” button at the top. You will also see the one you installed in the “Installed” tab; from there, you can click the “Launch” button to see your new Site Extension in action. The Azure Storage Explorer site extension is really cool.

I’m sure I will write more about this soon. Come back again later. You can follow me on Twitter for quick updates.

WAWS - WebJob to upload FREB files to Azure Storage using the WebJobs SDK

After writing my earlier post on creating a simple WebJob to upload the Failed Request Tracing logs automatically to a Windows Azure Blob Storage account, I was discussing it with the awesome development team of the WebJobs SDK, Amit Apple and Mike Stall. The outcome: my sample got modified to use the WebJobs SDK, which eases a lot of tasks. And there is more to it – a cool Azure Jobs dashboard for your Windows Azure Web Sites, giving you a view of your WebJobs messages getting processed.

With the WebJobs SDK, there is an automatic way of calling certain functions. You can check Scott’s blog, where he uses a function that just monitors his Azure Blob Storage account for a new blob to be created, processes that image, and pushes the result back to the same storage account. The code he wrote is minimal – just one function, wrapped inside an application with the WebJobs SDK. If you notice his function, he used the attributes [BlobInput] and [BlobOutput]. However, for the example I was trying – pushing files from the file system to Azure Blob Storage – I would need something like [FileInput], which isn’t available. The WebJobs SDK does, though, give you a way to call a function manually and still get the Azure Blob Storage binding. I’ve modified my upload function as below; I also call this Upload function from another function.

// Invoked via JobHost.Call; the WebJobs SDK binds 'output' to a blob
// named after the 'name' parameter inside the "freblogs" container.
public static void Upload(string name, string path, // local file
    [BlobOutput("freblogs/{name}")] Stream output,
    bool deleteAfterUpload)
{
    using (var fileStream = System.IO.File.OpenRead(path))
    {
        fileStream.CopyTo(output);
    }

    if (deleteAfterUpload)
    {
        File.Delete(path);
    }
}

// _host (a WebJobs SDK JobHost) and deleteAfterUpload are fields of this class.
public void UploadFileToBlob(string name, string path)
{
    var method = typeof(AzureStorageHelper).GetMethod("Upload");
    _host.Call(method, new { name = name, path = path, deleteAfterUpload = deleteAfterUpload });
}

If you notice, in UploadFileToBlob() – which I call from my FileSystemWatcher callback – I’m not writing a single bit of code to upload the blob to Azure Blob Storage. I just call the other function via _host (of type JobHost), passing the name, the path, and the boolean; the WebJobs SDK automatically fills in the Stream output, which gets created as a blob under the “freblogs” container with the same name that I pass in. You just need to configure a connection string with the name “AzureJobsData” in the application’s app.config file. Pretty awesome, isn’t it? The WebJobs SDK is in alpha, I’m told, so I’m really waiting to see the new features in the final version. The team has surely set a high bar for themselves :)
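For reference, here is a minimal sketch of the relevant app.config section – the account name and key are placeholders, and “AzureJobsRuntime” is the second connection string used for the dashboard logging described below:

<?xml version="1.0" encoding="utf-8"?>
<configuration>
  <connectionStrings>
    <!-- Storage account the SDK binds [BlobOutput] streams against -->
    <add name="AzureJobsData"
         connectionString="DefaultEndpointsProtocol=https;AccountName=youraccount;AccountKey=yourkey" />
    <!-- Storage account where the SDK writes its invocation logs -->
    <add name="AzureJobsRuntime"
         connectionString="DefaultEndpointsProtocol=https;AccountName=youraccount;AccountKey=yourkey" />
  </connectionStrings>
</configuration>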

If this isn’t enough, there is an awesome site extension for your Azure Websites that shows all these WebJobs operations against Azure Blob Storage. I was trying to understand how this works: the WebJobs SDK creates another container, called “azure-jobs-invoke-log”, in your blob storage account and stores its logs inside, which are then fetched by the AzureJobs site extension and shown to you. Here is what my storage account shows: the “freblogs” container holding all the FREB files, and the “azure-jobs-invoke-log” container holding all the log messages of the WebJobs SDK.

[screenshot]
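If you would rather verify those containers from code than from the portal, a quick sketch using the same storage SDK (the connection string is a placeholder) could look like this:

using System;
using Microsoft.WindowsAzure.Storage;

class ListContainersSketch
{
    static void Main()
    {
        // Placeholder connection string; use your storage account's
        var account = CloudStorageAccount.Parse(
            "DefaultEndpointsProtocol=https;AccountName=youraccount;AccountKey=yourkey");
        var blobClient = account.CreateCloudBlobClient();

        // Both "freblogs" and "azure-jobs-invoke-log" should show up here
        foreach (var container in blobClient.ListContainers())
            Console.WriteLine(container.Name);
    }
}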

And, to enable the site extension, you first need to configure a connection string named “AzureJobsRuntime”, pointing to the same blob storage account.

[screenshot]

After saving this connection string, go to https://<yoursitename>.scm.azurewebsites.net/AzureJobs. This page shows the details of your WebJobs recorded in the blob storage account you configured. You can click on each invocation to see its details. Remember, in my case it is a custom function that I’m invoking, so it shows all the details about the parameters that were passed in, and the result of the call. It also has a link to the file that we just uploaded – you can download that file by clicking the output hyperlink in the second screenshot below.

[screenshot]

[screenshot]

Pretty awesome! Don’t wait – add more power and background processing to your Azure Websites using the new WebJobs SDK.

Windows Azure Websites - WebJob to upload FREB logs to Azure Blob Storage

Currently in Windows Azure Web sites, there is an option to store your website logs in Azure Blob Storage; however, the FREB logs – Failed Request Tracing logs – can only be stored on the file system. You would then grab them via your favorite FTP tool and do the analysis. One of my co-workers asked whether we can store the Failed Request Tracing logs in Azure Blob Storage, and Richard Marr gave the interesting idea of using WebJobs to move the files there. I tried this quickly, and have a beta version of such a WebJob ready for you if you want to try it. I might revisit the code sometime later when I find time, but feel free to use the code that I’ve posted in this GitHub repo.

If you notice, I’ve not used the WebJobs SDK as such; it’s a normal C# program that monitors the folders and uploads files created in them to Azure Blob Storage. It creates a container called “freblogs” if it doesn’t exist, and stores the files there. You can modify the code to cater to your needs.

Entire code of my small C# application:

using System;
using System.IO;
using System.Configuration;
using Microsoft.WindowsAzure.Storage;
using Microsoft.WindowsAzure.Storage.Blob;

namespace PushToStorage
{
    class AzureStorageHelper
    {
        CloudStorageAccount storageAccount;
        CloudBlobClient blobClient;
        CloudBlobContainer container;
        CloudBlockBlob blockBlob;
        bool deleteAfterUpload = false;

        public AzureStorageHelper()
        {
            // Retrieve storage account from connection string.
            storageAccount = CloudStorageAccount.Parse(
                ConfigurationManager.ConnectionStrings["StorageConnectionString"].ConnectionString);

            // Create the blob client.
            blobClient = storageAccount.CreateCloudBlobClient();

            // Retrieve a reference to a container.
            container = blobClient.GetContainerReference("freblogs");

            // Create the container if it doesn't already exist.
            container.CreateIfNotExists();

            // Optional setting: delete the local file once it has been uploaded.
            string tmp = ConfigurationManager.AppSettings["DeleteAfterUpload"];
            if (tmp != null)
            {
                deleteAfterUpload = Int32.Parse(tmp) == 1;
            }
        }

        public void UploadFileToBlob(string name, string path)
        {
            try
            {
                Console.WriteLine("Starting uploading " + name);
                using (var fileStream = File.OpenRead(path))
                {
                    blockBlob = container.GetBlockBlobReference(name);
                    blockBlob.UploadFromStream(fileStream);
                    Console.WriteLine(name + " successfully uploaded!");
                }
                if (deleteAfterUpload)
                {
                    File.Delete(path);
                    Console.WriteLine(path + " deleted!");
                }
            }
            catch (Exception ee)
            {
                Console.WriteLine(ee.Message);
            }
        }

        // Returns true once the file can be opened exclusively and has content,
        // i.e. FREB has finished writing it.
        public bool IsFileReady(string fileName)
        {
            try
            {
                using (FileStream fileStream = File.Open(fileName, FileMode.Open, FileAccess.Read, FileShare.None))
                {
                    return fileStream.Length > 0;
                }
            }
            catch (Exception)
            {
                return false;
            }
        }
    }

    class Program
    {
        static AzureStorageHelper azStorageHelper;

        static void Main(string[] args)
        {
            Console.WriteLine("Initializing AzureStorageHelper!");
            azStorageHelper = new AzureStorageHelper();

            string path = ConfigurationManager.AppSettings["directory"];

            // FREB files land in per-site W3SVC* folders under the log directory.
            string[] directories = Directory.GetDirectories(path, "*W3SVC*");

            FileSystemWatcher[] fsw = new FileSystemWatcher[directories.Length];
            Console.WriteLine(path + " " + fsw.Length);
            for (int i = 0; i < directories.Length; i++)
            {
                fsw[i] = new FileSystemWatcher(directories[i]);
                fsw[i].NotifyFilter = NotifyFilters.LastAccess | NotifyFilters.LastWrite |
                                      NotifyFilters.FileName | NotifyFilters.DirectoryName | NotifyFilters.Size;
                fsw[i].Created += fsw_Created;
                fsw[i].IncludeSubdirectories = true;
                fsw[i].EnableRaisingEvents = true;
                Console.WriteLine(String.Format("{0} Started watching directory {1} for files!",
                    DateTime.Now.ToString(), directories[i]));
            }

            Console.ReadLine();
            Console.WriteLine(DateTime.Now.ToString() + " Stopping!");
        }

        static void fsw_Created(object sender, FileSystemEventArgs e)
        {
            // Wait until FREB has finished writing the file before uploading it.
            while (!azStorageHelper.IsFileReady(e.FullPath))
                System.Threading.Thread.Sleep(1000);

            Console.WriteLine(" Created " + e.Name + " " + e.FullPath);
            azStorageHelper.UploadFileToBlob(e.Name, e.FullPath);
        }
    }
}

As you see, the code reads the Azure Blob Storage connection string, as well as two other settings – the folder to watch, and whether to delete a file after uploading it – from the application’s config file (the .exe.config of this console app). They are self-explanatory, so you can use this to move any files from your folders to Azure Blob Storage. You need to install the Windows Azure Storage NuGet package. Once you have the standalone application built, create a compressed zip file consisting of your .exe, .exe.config, and the referenced Windows Azure Storage DLLs (since these are not part of a WAWS instance by default), and follow the steps in this article to create a WebJob. You can choose the type of job as per your need.
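For clarity, here is a sketch of what the matching .exe.config might look like – the connection string is a placeholder, and d:\home\LogFiles is my assumption for where the W3SVC* FREB folders live on a WAWS instance:

<?xml version="1.0" encoding="utf-8"?>
<configuration>
  <connectionStrings>
    <add name="StorageConnectionString"
         connectionString="DefaultEndpointsProtocol=https;AccountName=youraccount;AccountKey=yourkey" />
  </connectionStrings>
  <appSettings>
    <!-- Folder whose *W3SVC* sub-folders are watched for FREB files -->
    <add key="directory" value="d:\home\LogFiles" />
    <!-- 1 = delete the local file after a successful upload, 0 = keep it -->
    <add key="DeleteAfterUpload" value="1" />
  </appSettings>
</configuration>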

The articles below helped me write this small tool:

How to use the Windows Azure Blob Storage Service in .NET
http://www.windowsazure.com/en-us/documentation/articles/storage-dotnet-how-to-use-blobs-20/#upload-blob

Using the WebJobs feature of Windows Azure Web Sites
http://curah.microsoft.com/52143/using-the-webjobs-feature-of-windows-azure-web-sites

Hope this helps!

Do you know that swapping between a staging environment and the production environment is just the push of a button in Windows Azure Websites?

Yes, you are reading it right. It’s just a matter of one click to promote your staging website to be the production site. Gone are the hassles you faced when publishing a new deployment to the live website – half-baked pages and assemblies, and needless to say much pain, downtime, and a high probability of errors.

Windows Azure Web Sites is here to ease a lot of the tasks you typically do with your websites. Now you can create a new staging environment for your existing website in a matter of seconds, start testing the new bits, and push a button to make them live.

[screenshot]

Read more about this feature in ScottGu’s blog post.

Windows Azure Web Sites (WAWS) - Collecting dumps of the worker process (w3wp.exe) automatically whenever a request takes a long time

A slow website is perhaps the most common problem every website administrator and developer runs into. If they are extremely unlucky, they see this problem only in their production environment. Many troubleshooting techniques and best practices are available for this scenario; I will try to cover them in a different post as part of my ASP.NET troubleshooting series some other time. Meanwhile, you can look at this post of mine, which might help you.

For now, let’s focus on Windows Azure Web Sites. As you know, this is a closed (well, not completely) hosting environment, and still there are a few things you can do for this problem – for example, you can collect FREB traces for a long-running request and see where it is stuck. FREB shows ASP.NET ETW events as well, but only the page lifecycle events. For example, it will tell you that the problem is in Page_Load, but not where inside Page_Load. To find out more, you either have to profile your application, or collect a memory dump of the process serving the request and see what the request has been doing for so long.

Below are the steps to enable automatic collection of a memory dump whenever processing a request exceeds ‘x’ seconds. This uses the same customAction support for FREB which I’ve detailed in this old post of mine. In WAWS, the customActionsEnabled attribute for the website is set to “true” by default, so you just have to put in the web.config below. In this example, I’m going to use Windows Sysinternals procdump.exe to take a dump of our process (w3wp.exe). Here are the steps:

Enable ‘Failed Request Tracing’ from the Portal

First, you need to turn on FREB from your management portal. This article has brief steps on how to view those logs from Visual Studio, and even how to configure the setting from there. From the portal, for your website, under the configure tab -> site diagnostics, set the below to On.

[screenshot]

Transfer Procdump.exe to your deployment folder using FTP

Second, you need to put procdump.exe in your website’s deployment folder. Download it to your local machine from here. You can create a new folder and place it there, and let that folder be the path where the dumps are stored as well. In my example, I’ve created a folder called ‘Diagnostics’ under the root, and placed procdump.exe in there. A screenshot of my FileZilla:

[screenshot]

Configure the web.config with configuration to collect dump

Lastly, place the below configuration in your web.config file so that procdump.exe is spawned with certain parameters whenever a request takes more than 15 seconds (in this example):

<?xml version="1.0" encoding="UTF-8"?>
<configuration>
  <system.webServer>
    <tracing>
      <traceFailedRequests>
        <remove path="*" />
        <add path="*" customActionExe="d:\home\Diagnostics\procdump.exe" customActionParams="-accepteula w3wp d:\home\Diagnostics\w3wp_PID_%1%_" customActionTriggerLimit="5">
          <traceAreas>
            <add provider="ASP" verbosity="Verbose" />
            <add provider="ASPNET" areas="Infrastructure,Module,Page,AppServices" verbosity="Verbose" />
            <add provider="ISAPI Extension" verbosity="Verbose" />
            <add provider="WWW Server" areas="Authentication,Security,Filter,StaticFile,CGI,Compression,Cache,RequestNotifications,Module,FastCGI" verbosity="Verbose" />
          </traceAreas>
          <failureDefinitions timeTaken="00:00:15" />
        </add>
      </traceFailedRequests>
    </tracing>
  </system.webServer>
</configuration>

 

The above configuration takes a mini dump of the w3wp.exe serving your WAWS site and puts it in the folder d:\home\Diagnostics, with the dump name containing the process’s PID. If you want a full dump, add the -ma parameter, for example customActionParams="-accepteula -ma w3wp d:\home\Diagnostics\w3wp_PID_%1%_".

You can use any of the additional switches that you typically use with ProcDump. For a slow-running page scenario, I might collect dumps at regular intervals – 3 dumps, 5 seconds apart – so that we can check what the request is doing across those times. For that, set customActionParams to “-accepteula -s 5 -n 3 w3wp d:\home\Diagnostics\w3wp_PID_%1%_”.

Hope this helps!

Quick ways to edit your files hosted in Windows Azure Web Sites (WAWS), other than re-deploying

Editing a small piece of code or changing a configuration file is perhaps the most common thing for a developer to do while testing a site, or even when the site is live with production traffic. With some hosting providers, you most often end up re-deploying the whole package, transferring the file over FTP, or using other deployment methods. In this blog, I’m going to cover a few other methods that let you edit the files of a website hosted in Windows Azure Web Sites.

[Monaco] Visual Studio Online

If you have a very quick edit to source code or configuration, editing it over this shiny new Visual Studio Online experience for WAWS is perhaps the easiest of all. I have already written about this feature some time back in this blog. Here are the quick steps:

  1. Enable the ‘Edit in Visual Studio Online’ option for the website. You can find this option in the Portal –> Websites –> Your Website –> Configure tab. Set it to ON, and click Save.
  2. Click on the ‘Edit in Visual Studio Online’ option in the Dashboard. Once you have enabled and saved the option in step 1, you should see this new option listed in the Dashboard. Click on it.
  3. Enter your deployment user credentials when asked.
  4. Choose the file you want to edit, and start editing your site live. There is no save button – it’s live as you edit.

Here is a screenshot of my test config file with Monaco:

[screenshot]

Kudu Console of your website

If you have a site in WAWS, you should definitely check out its Kudu console. Along with cool stuff like an interactive console, it also offers a way to edit files. Here are the quick steps to edit your web.config file using the Kudu console:

  1. Browse to https://<yoursite>.scm.azurewebsites.net, and enter your deployment credentials to access the Kudu console.
  2. You are greeted with a lot of options; for our task, choose the “Debug console” from the top menu.
  3. You are presented with a somewhat familiar screen showing a command prompt window, with a file explorer at the top.
  4. Select the folder. In our case, web.config is inside /site/wwwroot.
  5. You should see something like below:

     [screenshot]

  6. Click on the ‘Edit’ icon for the web.config file. You will then see something like below, where you can edit the content of the file and hit the ‘Save’ button.

 

[screenshot]

Hope this helps!

Removing the X-Powered-By response header from Windows Azure Web Sites

People want to remove this header as part of security audits, which flag it because it tells anyone the server software running the site, and that knowledge could help an attacker craft attacks known to work against that server version. If you are on the latest version of your server-side framework, you should be fine, but some think it is always a good idea to remove the header anyway.

In PHP, you have to set the expose_php setting to Off to hide the PHP version information from the response headers. In Windows Azure Web Sites, you can have an optional .user.ini file where some of the PHP settings can be overridden – you can look at the steps mentioned in this article, or at this blog by one of my colleagues about increasing the upload limit for files. However, there are a few core PHP settings (expose_php is one of them) that cannot be overridden from this .user.ini file. Don’t worry – WAWS gives you an option to host your custom PHP runtime, and this article has the steps for the same.

For the X-Powered-By header itself, you can use an IIS URL Rewrite outbound rule in your web.config to blank out its value:

<?xml version="1.0" encoding="UTF-8"?>
<configuration>
  <system.webServer>
    <rewrite>
      <outboundRules>
        <rule name="Remove PoweredBy Header" preCondition="IsHtml">
          <match serverVariable="RESPONSE_X_Powered_By" pattern="(.+)" />
          <action type="Rewrite" value="" />
        </rule>
        <preConditions>
          <preCondition name="IsHtml">
            <add input="{RESPONSE_CONTENT_TYPE}" pattern="^text/html" />
          </preCondition>
        </preConditions>
      </outboundRules>
    </rewrite>
  </system.webServer>
</configuration>

 

Hope this helps!

Tweaking the queueLength for PHP handler - Windows Azure Web Sites

The number of users moving to Windows Azure Web Sites (WAWS) is increasing day by day, and I’m happy to see many PHP websites being hosted on WAWS. If you are hosting a high-traffic website on WAWS, consider increasing the queueLength property of the FastCGI handler for PHP that handles your requests. By default, queueLength is 1000, which means at most 1000 requests can be queued for processing at a time. For many high-traffic websites this is a low number, and you will start seeing 503 errors in your instance logs once the queue is full.
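For context, the PHP handler registration in applicationHost.config looks roughly like the sketch below (illustrative only; the real entry carries more attributes); queueLength is the attribute we are going to raise:

<fastCgi>
  <!-- other attributes omitted; queueLength defaults to 1000 -->
  <application fullPath="D:\Program Files (x86)\PHP\v5.4\php-cgi.exe"
               queueLength="1000" />
</fastCgi>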

With help from David Ebbo of the WAWS product team here at Microsoft, I was able to tweak this number with the steps below.

Steps to increase the FastCGI queueLength for PHP on Windows Azure Web Sites

1. Create a file named applicationHost.xdt under /SiteExtensions/<YourFolder>

Log in to your website root using FTP, and create a folder /SiteExtensions. Create another folder inside it with some name – say, PHPQueueLength; the name is not important. Now create a file inside that folder named applicationHost.xdt, with the content below. This specifically targets PHP 5.4; if you are using a different version, change the path appropriately. Take help from the steps mentioned on this page, which will let you download a copy of applicationHost.config to confirm the path.

 

<?xml version="1.0"?>
<configuration xmlns:xdt="http://schemas.microsoft.com/XML-Document-Transform">
  <system.webServer>
    <fastCgi>
      <application xdt:Locator="Match(fullPath)" xdt:Transform="SetAttributes(queueLength)"
                   fullPath="D:\Program Files (x86)\PHP\v5.4\php-cgi.exe" queueLength="5000" />
    </fastCgi>
  </system.webServer>
</configuration>

 

2. Create an app setting named WEBSITE_PRIVATE_EXTENSIONS for the website, with a value of 1.

Log in to your Azure management portal. You can find the app settings for your website under the ‘CONFIGURE’ tab. Add a new setting with the name WEBSITE_PRIVATE_EXTENSIONS and the value 1.

[screenshot]

 

3. Restart the site from your management portal.

[screenshot]

You can verify that the transform was applied by following the steps on this page, under the ‘Debugging private Extensions’ section. This configuration will help you get rid of those 503 server errors from PHP request queuing. However, you should still investigate whether any requests are taking too long to get processed, and debug those as well.

Here are a few blog articles that can help you debug slow-running PHP pages in WAWS, if you aren’t aware of them already.

http://blogs.msdn.com/b/asiatech/archive/2013/11/15/azure-websites-find-php-performance-bottleneck.aspx

http://ruslany.net/2013/01/php-troubleshooting-in-windows-azure-web-sites

 

Happy hosting with WAWS!

Editing Windows Azure Web Sites online with the new shiny Monaco

Yeah, this is the one feature everyone has wanted for a long time: an online editing option, with all (okay, almost all) the goodness of Visual Studio. Here is my step-by-step guide. I’m assuming you have already created a Windows Azure Web Site; if you haven’t, you can read about that here.

Step 1: Enable the ‘Edit in Visual Studio Online’ option for the website.

You can find this option in the Portal –> Websites –> Your Website –> Configure tab. Set it to ON. Click on Save.

[screenshot]

Wait until you see the setting saved. You should see the below message:

[screenshot]

Step 2: Click on the ‘Edit in Visual Studio Online’ option in the Dashboard.

Once you have enabled this option in step 1 and saved it successfully, you should see a new option listed in the Dashboard, ‘Edit in Visual Studio Online’. Click on it.

[screenshot]

Step 3: Enter your deployment user credentials when asked.

This is the user account you use to log in to your FTP folder. If you aren’t sure, you can find the username on the Dashboard page. If you have forgotten the password, select the ‘Reset your deployment credentials’ option on the Dashboard, and use the new credentials.

Step 4: Edit your site live. There is no save button – it’s live as you edit.

[screenshot]

Note the HTML tag highlighting – this is just brilliant! If you want to quickly edit something on the site, say a config setting, you don’t need to redeploy the web.config using one of your favorite deployment methods; editing the file is just a click away.

Don’t miss the Channel 9 series on Monaco.