
Upcoming Windows Azure Events June 2012

June 3, 2012 Michael Washam
We have a LOT of exciting news and events coming soon with Windows Azure. The Meet Windows Azure event will kick things off in San Francisco and will be immediately followed by Learn Windows Azure @TechEd. From there, most everyone on my team is on a world tour to talk about what’s coming up. I will be speaking at TechEd North America, in London on 6/22, and at TechEd Europe.

Very exciting times to be working with Windows Azure!

Check out the list below for upcoming events near you!


Meet Windows Azure – June 7th 2012

Learn Windows Azure @TechEd – June 11th 2012

Windows Azure Dev Camps – Various Dates and Locations
Categories: Windows Azure Tags: Windows Azure Events

Automated Global Deployments with Traffic Manager and PowerShell

April 24, 2012 Michael Washam

On my recent appearance on Cloud Cover my demo failed.

Sad to say, but it happens, especially when you don’t test your demo after rebuilding your laptop. That’s OK though. The beauty of the Internet is you can always make up for it later by posting a working example.

So here it is in its entirety: a script that deploys a Windows Azure solution to two data centers and configures Traffic Manager to balance traffic between the endpoints.


# Select subscription first
Select-Subscription azurepub

$certpath = 'C:\managementCert.pfx'
$mcpwd = 'certPassword!'

# Deployment package and configuration 
$PackageFile = 'C:\Deployment\MortgageApp.cspkg'
$ServiceConfig = 'C:\Deployment\ServiceConfiguration.Cloud.cscfg'

# Set up our variables for each data center 
$NCService = 'WoodGroveNC'
$EUService = 'WoodGroveEU'
$NCStorage = 'demoncstorage1'
$EUStorage = 'demoeustorage1'
$NCDC = 'North Central US'
$EUDC = 'North Europe'
$TMProfileName = 'WoodGroveGlobalTM'



# Create hosted service in North Central US
New-HostedService -ServiceName $NCService -Location $NCDC | 
	Get-OperationStatus -WaitToComplete


# Create hosted service in North Europe
New-HostedService -ServiceName $EUService -Location $EUDC | 
	Get-OperationStatus -WaitToComplete



# Get a reference to the newly created services
$hostedServiceNC = Get-HostedService -ServiceName $NCService
$hostedServiceEU = Get-HostedService -ServiceName $EUService
	


# Add the management certificate to each hosted service 
$hostedServiceNC | Add-Certificate -CertToDeploy $certpath -Password $mcpwd |
	Get-OperationStatus -WaitToComplete
	
$hostedServiceEU | Add-Certificate -CertToDeploy $certpath -Password $mcpwd | 
	Get-OperationStatus -WaitToComplete


# Create a new storage account in each data center
New-StorageAccount -StorageAccountName $NCStorage -Location $NCDC | 
	Get-OperationStatus -WaitToComplete    
	
New-StorageAccount -StorageAccountName $EUStorage -Location $EUDC | 
	Get-OperationStatus -WaitToComplete    

# Deploy the packaged solution to each data center 
$hostedServiceNC | New-Deployment -Name $NCService -StorageAccountName $NCStorage `
				   -Package $PackageFile -Configuration $ServiceConfig | 
				   Get-OperationStatus -WaitToComplete

$hostedServiceEU | New-Deployment -Name $EUService -StorageAccountName $EUStorage `
				   -Package $PackageFile -Configuration $ServiceConfig | 
				   Get-OperationStatus -WaitToComplete



# Start each service 
$hostedServiceNC | Get-Deployment -Slot Production | Set-DeploymentStatus -Status Running `
				 | Get-OperationStatus -WaitToComplete
				 
$hostedServiceEU | Get-Deployment -Slot Production | Set-DeploymentStatus -Status Running `
			     | Get-OperationStatus -WaitToComplete


# Configure Traffic Manager to enable Performance Based Load Balancing and Failover
$profile = New-TrafficManagerProfile -ProfileName $TMProfileName `
		  -DomainName 'woodgroveglobal.trafficmanager.net'

$endpoints = @()
$endpoints += New-TrafficManagerEndpoint -DomainName 'WoodGroveNC.cloudapp.net'
$endpoints += New-TrafficManagerEndpoint -DomainName 'WoodGroveEU.cloudapp.net'


# Configure the endpoint Traffic Manager will monitor for service health 
$monitors = @()
$monitors += New-TrafficManagerMonitor -Port 80 -Protocol HTTP -RelativePath /



# Create new definition
$createdDefinition = New-TrafficManagerDefinition -ProfileName $TMProfileName -TimeToLiveInSeconds 300 `
			-LoadBalancingMethod Performance -Monitors $monitors -Endpoints $endpoints -Status Enabled
			

# Enable the profile with the newly created traffic manager definition 
Set-TrafficManagerProfile -ProfileName $TMProfileName -Enable -DefinitionVersion $createdDefinition.Version

Categories: PowerShell, Traffic Manager, Windows Azure

Windows Azure PowerShell Cmdlets 2.2.2 Release

March 12, 2012 Michael Washam


Windows Azure PowerShell Cmdlets (v2.2.2)

We have a brand new release of the Windows Azure PowerShell cmdlets that we hope
will make getting started and scripting with the cmdlets a much easier task.

The new release can be downloaded from its CodePlex project site, wappowershell.codeplex.com.

Getting Started Improvements

In 2.2.2 we have added a Start Menu link that starts a PowerShell session with
the Windows Azure cmdlets already loaded. We have also added a Start Here link
that shows how to complete the setup, along with a short tour of the capabilities
of the Windows Azure PowerShell cmdlets and the release changes.


Subscription Management Improvements

We have taken the subscription management improvements from the 2.2 release and
made them much better. Specifically, we have added the ability to persist your
subscription settings into your user profile. This functionality allows you to
set the subscription data once and then, in new scripts or PowerShell sessions,
just select the subscription you want to use without needing to specify the
subscription ID, certificate, and storage accounts each time.

Code Snippet One: Setting Subscription Data

     $subid = "{subscription id}"
     $cert = Get-Item cert:\CurrentUser\My\CERTTHUMBPRINTUPPERCASE

     # Persisting subscription settings
     Set-Subscription -SubscriptionName org-sub1 -Certificate $cert -SubscriptionId $subid

     # Setting the current subscription to use 
     Select-Subscription -SubscriptionName org-sub1 

Call the Set-Subscription cmdlet with your certificate and subscription ID.
Set-Subscription will persist the certificate thumbprint and subscription ID,
associated with the subscription name, to C:\Users\{username}\AppData\Roaming\Windows Azure
PowerShell Cmdlets\DefaultSubscriptionData.xml.

This functionality supports adding multiple subscriptions to your configuration
so you can manage each individually within the same script simply by calling
Select-Subscription with the subscription name.
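As a hypothetical sketch (the subscription names, service names, and `$cert`/`$subid` variables below are placeholders, not from the release notes), a script working across two persisted subscriptions might look like this:

```powershell
# Persist each subscription once (assumes $cert1/$cert2 and $subid1/$subid2 are already set)
Set-Subscription -SubscriptionName org-sub1 -Certificate $cert1 -SubscriptionId $subid1
Set-Subscription -SubscriptionName org-sub2 -Certificate $cert2 -SubscriptionId $subid2

# In any later script or session, switch between them by name alone
Select-Subscription -SubscriptionName org-sub1
Get-HostedService -ServiceName "serviceInSub1"

Select-Subscription -SubscriptionName org-sub2
Get-HostedService -ServiceName "serviceInSub2"
```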

Code Snippet Two: Setting the Default Subscription

        Set-Subscription -DefaultSubscription org-sub1
    

Snippet two demonstrates setting the default subscription to use if you do not set one with Select-Subscription.

Code Snippet Three: Associating Storage Accounts with your Subscription

     # Associate two storage accounts with the subscription
     Set-Subscription -SubscriptionName org-sub1 -StorageAccountName mystoragename1 -StorageAccountKey mystoragekey1
     Set-Subscription -SubscriptionName org-sub1 -StorageAccountName mystoragename2 -StorageAccountKey mystoragekey2

     # Specify the default storage account to use for the subscription
     Set-Subscription -SubscriptionName org-sub1 -DefaultStorageAccount mystoragename1
    

Snippet three shows that you can associate multiple storage accounts with a single subscription. All it takes to use the correct storage account is to set the default before calling a cmdlet that requires a storage account.
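As a sketch of that flow (reusing the storage account names from snippet three), switching the default between calls redirects any storage-dependent cmdlets that follow:

```powershell
# Storage-dependent cmdlets will now use mystoragename1
Set-Subscription -SubscriptionName org-sub1 -DefaultStorageAccount mystoragename1
# ... cmdlets that require a storage account here ...

# Point subsequent storage-dependent cmdlets at mystoragename2 instead
Set-Subscription -SubscriptionName org-sub1 -DefaultStorageAccount mystoragename2
```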

Code Snippet Four: Specifying the Subscription Data File Location

       # Overriding the default location to save subscription settings
       Set-Subscription -SubscriptionName org-sub1 -Certificate $cert -SubscriptionId $subid -SubscriptionDataFile c:\mysubs.xml

       # Retrieving a list of subscriptions from an alternate location
       Get-Subscription -SubscriptionDataFile c:\mysubs.xml
    

Each of the subscription cmdlets takes a -SubscriptionDataFile parameter that allows you to specify which XML file to use for operations.

Code Snippet Five: MISC Subscription Management

        # Returns all persisted settings 
       Get-Subscription

       # Removes mysub2 from persisted settings
       Remove-Subscription -SubscriptionName org-sub2

       # Removing a storage account from your persisted subscription settings
       Set-Subscription -SubscriptionName org-sub1 -RemoveStorageAccount mystoragename1
    

Other Usability Improvements

We have made many of the cmdlets simpler to use by allowing more parameters to
be optional with default values.

  • -Label parameter is now optional in New-AffinityGroup, Set-AffinityGroup, New-HostedService, New-StorageAccount, New-Deployment and Update-Deployment.
  • -Slot parameter is now optional in New-Deployment and Update-Deployment (Production slot is used by default).
  • -Name parameter is now optional in New-Deployment (a Globally Unique Identifier value is used by default).
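Taken together, those defaults shrink a deployment call considerably. A sketch (the service name and file paths are placeholders), relying on the Production slot and generated-GUID name defaults:

```powershell
# No -Slot (Production is assumed) and no -Name (a GUID is generated)
Get-HostedService -ServiceName "myService" |
    New-Deployment -StorageAccountName "mystoragename1" `
                   -Package "C:\Deployment\MyApp.cspkg" `
                   -Configuration "C:\Deployment\ServiceConfiguration.Cloud.cscfg" |
    Get-OperationStatus -WaitToComplete
```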

In addition to the defaults we provided some needed fixes to unblock certain scenarios.

  • Get-Deployment now returns $null if no deployment was found in the specified slot (an error was thrown in previous versions).
  • -Package and -Configuration parameters now accept UNC paths in New-Deployment and Update-Deployment.
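The Get-Deployment change means a script can now probe an empty slot safely instead of wrapping the call in error handling. A sketch (the service name is a placeholder):

```powershell
# Returns $null instead of throwing when the Staging slot is empty
$deployment = Get-HostedService -ServiceName "myService" | Get-Deployment -Slot Staging
if ($deployment -eq $null) {
    Write-Host "Staging slot is empty; safe to deploy."
}
```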

Breaking Changes

With improvements like these we did have to make some sacrifices.
Before you download the latest build please review the list below because we have a few breaking changes.

  • -DefaultStorageAccountName and -DefaultStorageAccountKey parameters were removed from Set-Subscription. Instead, when adding multiple accounts to a subscription, each one needs to be added with -StorageAccountName and -StorageAccountKey or -ConnectionString. To set a default storage account, use Set-Subscription –DefaultStorageAccount {account name}.
  • -SubscriptionName is now mandatory in Set-Subscription.
  • In previous releases, the subscription data was not persisted between PowerShell sessions. When importing subscription settings from a publishsettings file downloaded from the management portal, the Import-Subscription cmdlet optionally saved the subscription information to a file that could then be restored using Set-Subscription thereafter. This behavior has changed. Now, imported subscription data is always persisted to the subscription data file and is immediately available in subsequent sessions. Set-Subscription can be used to update these subscription settings or to create additional subscription data sets.
  • Renamed -CertificateToDeploy parameter to -CertToDeploy in Add-Certificate.
  • Renamed -ServiceName parameter to -StorageAccountName in all Storage Service cmdlets (added “ServiceName” as a parameter alias for backward compatibility).

Summary

In the 2.2.2 release we have made a number of fixes such as accepting UNC paths and fixing Get-Deployment to not throw an error on empty slots. We have also substantially improved the getting started experience and how you can manage your Windows Azure subscriptions from PowerShell.

The new release can be downloaded from wappowershell.codeplex.com.

Categories: PowerShell, Windows Azure Tags: Windows Azure PowerShell CmdLets

Windows Azure PowerShell Cmdlets 2.2 Released

January 16, 2012 Michael Washam


Windows Azure PowerShell Cmdlets (v2.2)

Download the Latest Release

We have a brand new release of the Windows Azure PowerShell cmdlets that we hope will open up new opportunities to automate and manage your Windows Azure deployments along with just making your life easier as a PowerShell user. So what is new? Let’s take a look!

Scripting Usability Improvements

If you have used the cmdlets for any amount of time, one thing that has likely annoyed you is the requirement to pass -SubscriptionID, -Certificate, -StorageAccountName and -StorageAccountKey to almost every cmdlet. This design made the cmdlets almost impossible to use from a command shell and only lent itself to use from a script.

We have introduced three new cmdlets to make your life easier in this respect:

  • Import-Subscription
  • Set-Subscription
  • Select-Subscription

Set-Subscription and Select-Subscription

Set-Subscription and Select-Subscription allow you to specify the SubscriptionID, Certificate, DefaultStorageAccountName and DefaultStorageAccountKey and save them in session state. Once you call these cmdlets you no longer need to pass those arguments to every cmdlet; they will just use the data from session state and save you a ton of typing. These cmdlets do support multiple subscriptions: just call Set-Subscription once for each subscription you need to use and then call Select-Subscription to set the current subscription.

One important note: using Set/Select-Subscription and passing the same data as parameters should be treated as mutually exclusive. In some cases mixing them may work fine, and in others you may get strange errors about -SubscriptionID or -Certificate not being known parameters.

Example:

    $cert = Get-Item cert:\CurrentUser\My\{your cert thumbprint}
    Set-Subscription -SubscriptionName "mysub" -SubscriptionId {your subscription id} `
                     -Certificate $cert `
                     -DefaultStorageAccountName "{your storage account}" `
                     -DefaultStorageAccountKey "{your storage key}"

    Select-Subscription -SubscriptionName "mysub"
    Get-HostedService -ServiceName "myService"
    

Import-Subscription

Import-Subscription allows you to import a .publishingsettings file that was previously downloaded from: https://windows.azure.com/download/publishprofile.aspx?wa=wsignin1.0.

This cmdlet adds the embedded management certificate into your local certificate store and saves an XML file that the cmdlets can use to automatically import the subscription information into your PowerShell session.

Used in conjunction with the new Set-Subscription and Select-Subscription cmdlets, it makes for an easy way to get set up for the first time without having to deal with manually creating/importing the management certificates.
An example:

     Import-Subscription -PublishSettingsFile "C:\WindowsAzure\Field_ mwasham-1-15-2012-credentials.publishsettings" `
                        -SubscriptionDataFile "c:\WindowsAzure\mysub.xml"

    Set-Subscription -SubscriptionName "mysub" -SubscriptionDataFile "c:\WindowsAzure\mysub.xml" `
                     -DefaultStorageAccountName "{your storage account}" `
                     -DefaultStorageAccountKey "{your storage key}"

    Select-Subscription -SubscriptionName "mysub"
    

Windows Azure Traffic Manager Support

We’ve added the ability to fully manage and customize your deployments that use Windows Azure Traffic Manager.

Windows Azure Traffic Manager Cmdlets

  • New-TrafficManagerProfile
  • Get-TrafficManagerProfile
  • Remove-TrafficManagerProfile
  • Set-TrafficManagerProfile
  • Get-TrafficManagerDefinition
  • New-TrafficManagerDefinition
  • Add-TrafficManagerEndpoint
  • New-TrafficManagerEndpoint
  • Set-TrafficManagerEndpoint
  • Remove-TrafficManagerEndpoint
  • New-TrafficManagerMonitor

Here is an example of how you can use PowerShell to create a new profile and definition:

    # Create new Traffic Manager Profile
    New-TrafficManagerProfile -DomainName "woodgrove.trafficmanager.net" `
                              -ProfileName "ProfileFromPS"

    # Specify the monitor settings
    $monitors = @()
    $monitor = New-TrafficManagerMonitor -RelativePath "/" -Port 80 -Protocol http
    $monitors += $monitor

    # Create an array to hold the Traffic Manager Endpoints
    $endpoints = @()

    # Specify the endpoint for our North Central US application
    $endpoint1 = New-TrafficManagerEndpoint -DomainName "WoodGroveUS.cloudapp.net"
    $endpoints += $endpoint1

    # Specify the endpoint for our North Europe application
    $endpoint2 = New-TrafficManagerEndpoint -DomainName "WoodGroveEU.cloudapp.net"
    $endpoints += $endpoint2

    # Create the definition by passing in the monitor, endpoints and other settings.
    # -Status enabled automatically enables the profile with the new definition as the active one.
    $newDef = New-TrafficManagerDefinition -ProfileName "ProfilefromPS" `
					    -TimeToLiveInSeconds 300 -LoadBalancingMethod Performance `
					    -Monitors $monitors -Endpoints $endpoints -Status Enabled
    # Set the active profile version & enable
    Set-TrafficManagerProfile -ProfileName "ProfilefromPS" -Enable `
   		           -DefinitionVersion $newDef.Version
    

New-SqlAzureFirewallRule -UseIpAddressDetection

This cmdlet is not new however the -UseIpAddressDetection parameter is. It was actually released in a 2.1 release that wasn’t highly publicized. The -UseIpAddressDetection parameter allows you to add a firewall rule whether you know your external IP address or not. Perfect for getting up and running quickly in a new environment.

Here is an example of using -UseIpAddressDetection:

    New-SqlAzureFirewallRule -ServerName "{server name}" `
			    -UseIpAddressDetection `
			    -RuleName "testautodetect"
    

Set-RoleInstanceCount

This cmdlet is new and allows you to specify the number of instances for a given role. We’ve always had the ability to increment or decrement the number of instances but it wasn’t nearly as easy.

Here is an example that sets the instance count for myWebRole to 4.

    Get-HostedService -ServiceName "WoodGroveUS" | `
				  Get-Deployment -Slot Production | `
				  Set-RoleInstanceCount -Count 4 -RoleName "myWebRole"
    

New-PerformanceCounter

This cmdlet is new and simply wraps the creation of perfmon counters for configuring diagnostics. It makes your life easier when dealing with logging.

Here is an example of adding some perfmon counters:

    function GetDiagRoles {
         Get-HostedService -ServiceName $serviceName | `
                         Get-Deployment -Slot $deploymentslot | `
                         Get-DiagnosticAwareRoles
    }
    $counters = @()
    $counters += New-PerformanceCounter -SampleRateSeconds 10 `
			    -CounterSpecifier "\Processor(_Total)\% Processor Time"

    $counters += New-PerformanceCounter -SampleRateSeconds 10 `
			    -CounterSpecifier "\Memory\Available MBytes"

    GetDiagRoles | foreach {
	    $_ | Set-PerformanceCounter -PerformanceCounters $counters -TransferPeriod 15
    }
    

Get-PerfmonLog <breaking change>

This cmdlet was introduced in the 2.0 release of the cmdlets. Some of our awesome field folks determined that it was not capturing perfmon counters correctly when there were multiple instances of a role being monitored. We have fixed this bug and it now appears to be capturing all of the relevant data. The problem is this is a breaking change. So if you are currently using this cmdlet your script will need to be updated. Thankfully, the script change is a minor one. Instead of taking a file name for -LocalPath it now takes a directory. Each instance being monitored will now get its own .blg or .csv file.

Example that uses a helper function called GetDiagRoles to download the perfmon logs.

    function GetDiagRoles {
         Get-HostedService -ServiceName $serviceName | `
                         Get-Deployment -Slot $deploymentslot | `
                         Get-DiagnosticAwareRoles
    }
    GetDiagRoles | foreach {
	$_ | Get-PerfmonLog -LocalPath "c:\DiagData" -Format CSV
    }
    

In addition to these improvements we have made quite a few other, more minor improvements. One key request was to provide a binary installation so you aren’t required to build the cmdlets before using them. That was a key priority for us and we made it happen in 2.2. Additionally, take a look at the Readme.docx in the new release for further details about other improvements.

Categories: Diagnostics, PowerShell, SQL Azure, Traffic Manager, Windows Azure Tags: Windows Azure PowerShell CmdLets

Handy Library for Dealing with Transient Connectivity in the Cloud

November 14, 2011 Michael Washam

The Windows Azure CAT team has built a library (available via NuGet) that provides a lot of functionality for handling transient connection problems across the Windows Azure Platform, with SQL Azure, Service Bus, Cache, Configuration and Storage supported. (The library has been around a while, but this is the first chance I’ve had to try it out.)

To use it, add the TransientFaultHandlingFx reference from NuGet.

In addition to adding the assemblies it also adds a .chm to your project with full documentation on how to use the library.

There are numerous ways to actually use the library. For SQL Azure I would recommend reading the whitepaper the Windows Azure CAT team published.

The method I chose was to configure a retry policy in my web/app.config:

<configSections>
  <section name="RetryPolicyConfiguration" type="Microsoft.AzureCAT.Samples.TransientFaultHandling.Configuration.RetryPolicyConfigurationSettings, Microsoft.AzureCAT.Samples.TransientFaultHandling" />
</configSections>

<RetryPolicyConfiguration defaultPolicy="FixedIntervalDefault" defaultSqlConnectionPolicy="FixedIntervalDefault" defaultSqlCommandPolicy="FixedIntervalDefault" defaultStoragePolicy="IncrementalIntervalDefault" defaultCommunicationPolicy="IncrementalIntervalDefault">
  <add name="FixedIntervalDefault" maxRetryCount="10" retryInterval="100" />
  <add name="IncrementalIntervalDefault" maxRetryCount="10" retryInterval="100" retryIncrement="50" />
  <add name="ExponentialIntervalDefault" maxRetryCount="10" minBackoff="100" maxBackoff="1000" deltaBackoff="100" />
</RetryPolicyConfiguration>

From there it’s simple to create a RetryPolicy object from the configuration:

public static RetryPolicy GetRetryPolicy()
{
    // Retrieve the retry policy settings from the application configuration file.
    RetryPolicyConfigurationSettings retryPolicySettings = ApplicationConfiguration.Current.GetConfigurationSection<RetryPolicyConfigurationSettings>(RetryPolicyConfigurationSettings.SectionName);

    // Retrieve the required retry policy definition by its friendly name.
    RetryPolicyInfo retryPolicyInfo = retryPolicySettings.Policies.Get("FixedIntervalDefault");

    // Create an instance of the respective retry policy using the transient error detection strategy for SQL Azure.
    RetryPolicy sqlAzureRetryPolicy = retryPolicyInfo.CreatePolicy<SqlAzureTransientErrorDetectionStrategy>();

    return sqlAzureRetryPolicy;
}

You can pass the RetryPolicy object to the extension methods (for ADO.NET in my example):

sqlCon.OpenWithRetry(rp); // for SqlConnection
object rValue = sqlCmd.ExecuteScalarWithRetry(rp); // for SqlCommand

There is functionality for LINQ as well.

This library not only makes your code robust but can save you a massive amount of time too, since the team has already put the resources into testing and debugging it.

Categories: Cloud Computing, SQL Azure, Windows Azure

Windows Azure Storage Analytics with PowerShell

September 28, 2011 Michael Washam

Windows Azure Storage Analytics allows you to log very detailed information about how a storage account is being used. Each service (Blob/Table/Queue) has independent settings, giving you granular control over what data is collected. To enable or disable each setting, just add or omit the corresponding argument (-LoggingDelete, for example). One of the great things about this service is the ability to set a retention policy so the data is automatically deleted after a set number of days.

Enabling Storage Analytics per Service

Set-StorageServicePropertiesForAnalytics -ServiceName "Table" `
		-StorageAccountName $storageAccount -StorageAccountKey $storagekey `
		-LoggingDelete -LoggingRead -LoggingWrite -MetricsEnabled -MetricsIncludeApis `
		-MetricsRetentionPolicyDays 5 -LoggingRetentionPolicyEnabled -LoggingRetentionPolicyDays 5 `
		-MetricsRetentionPolicyEnabled 
									
Set-StorageServicePropertiesForAnalytics -ServiceName "Queue" `
		-StorageAccountName $storageAccount -StorageAccountKey $storagekey `
		-LoggingDelete -LoggingRead -LoggingWrite -MetricsEnabled -MetricsIncludeApis `
		-MetricsRetentionPolicyDays 5 -LoggingRetentionPolicyEnabled -LoggingRetentionPolicyDays 5 `
		-MetricsRetentionPolicyEnabled 
									
Set-StorageServicePropertiesForAnalytics -ServiceName "Blob" `
		-StorageAccountName $storageAccount -StorageAccountKey $storagekey `
		-LoggingDelete -LoggingRead -LoggingWrite -MetricsEnabled -MetricsIncludeApis `
		-MetricsRetentionPolicyDays 5 -LoggingRetentionPolicyEnabled -LoggingRetentionPolicyDays 5 `
		-MetricsRetentionPolicyEnabled 		

Retrieving the Current Storage Analytics Settings

	Get-StorageServicePropertiesForAnalytics -ServiceName "Table" `
		-StorageAccountName $storageAccount -StorageAccountKey $storagekey | Format-List
	
	Get-StorageServicePropertiesForAnalytics -ServiceName "Blob" `
		-StorageAccountName $storageAccount -StorageAccountKey $storagekey | Format-List
		
	Get-StorageServicePropertiesForAnalytics -ServiceName "Queue" `
		-StorageAccountName $storageAccount -StorageAccountKey $storagekey | Format-List

Downloading the storage analytics data requires a bit more explanation. Each service has two fundamental types of data (except blob storage, which has three). The first is log data, which contains all of the requests for that service (depending on which settings you have enabled); the second is transaction data, which contains numerous metrics that give you a deep understanding of how the storage service is performing. Metrics such as % Success or Avg E2E Latency are extremely useful for understanding your application. Blob storage also has a “Capacity” data set that tells you how much storage space blob storage is using, broken down by analytics and application data.

Downloading Storage Analytics Data

Get-StorageAnalyticsLogs -ServiceName "Blob" `
	-LocalPath "c:\DiagData\SALogsBlob.log"  `
	-StorageAccountName $storageAccount -StorageAccountKey $storagekey  

Get-StorageAnalyticsMetrics -DataType "Capacity" -ServiceName "Blob" `
    -LocalPath "c:\DiagData\SAMetricsBlob-Capacity.log" `
    -StorageAccountName $storageAccount -StorageAccountKey $storagekey 
						 
Get-StorageAnalyticsMetrics -DataType "Transactions" -ServiceName "Blob" `
	-LocalPath "c:\DiagData\SAMetricsBlob-Transactions.log" `
	-StorageAccountName $storageAccount -StorageAccountKey $storagekey 

Get-StorageAnalyticsLogs -ServiceName "Table" `
	-LocalPath "c:\DiagData\SALogsTable.log"  `
	-StorageAccountName $storageAccount -StorageAccountKey $storagekey  
						 
Get-StorageAnalyticsMetrics -DataType "Transactions" -ServiceName "Table" `
	-LocalPath "c:\DiagData\SAMetricsTable-Transactions.log" `
	-StorageAccountName $storageAccount -StorageAccountKey $storagekey 

Get-StorageAnalyticsLogs -ServiceName "Queue" `
	-LocalPath "c:\DiagData\SALogsQueue.log" `
	-StorageAccountName $storageAccount -StorageAccountKey $storagekey  
						 
Get-StorageAnalyticsMetrics -DataType "Transactions" -ServiceName "Queue" `
	-LocalPath "c:\DiagData\SAMetricsQueue-Transactions.log" `
	-StorageAccountName $storageAccount -StorageAccountKey $storagekey 

For more information on storage analytics and details on understanding the metrics you can use see the following:
msdn.microsoft.com/en-us/library/windowsazure/hh343268.aspx

Categories: PowerShell,