XLSX Template with EPPlus and Web API, save as XLSX or PDF

Nov 7

Posted by Zach in C# Development, Information Technology, Web Development | No Comments

I've been using NPOI for years to create/edit XLS documents via C#, and it works perfectly. If you want to work with XLS on your server without installing Office and using COM, check out NPOI and I'm sure you won't be disappointed. Since I migrated our entire company over to Office 365, everybody is now running Office 2013+ and I'm constantly being asked why my apps are spitting out XLS files that give everybody an annoying compatibility warning and limit the rows to 65K. Since I needed a mental break (more like a learning stretch) on a new project, I decided to try out EPPlus for managing XLSX files via C# without COM.

Most of the time when I develop reports/documents/etc… with Excel, I like to start with a formatted template. This makes the initial conversations with the users easy, and it abstracts the formatting from my code (insert data into pre-formatted cells). I also find that half my changes after go-live are formatting related, and by using Excel I can adjust and deploy the update in seconds. I did a bit of looking around and chose EPPlus, since the project is active and there is a NuGet package available (EPPlus, Version 4.0.4).

Here is my application UI, which needs a new Excel document:

[Screenshot: shipment scheduling UI]

The UI is pretty simple. The workflow is based on the warehouse staff adding delivery notes (warehouse orders) to a shipment until the shipping coordinator locks the shipment and prepares it for pickup (palletize, wrap, shipping labels, etc…). The UI follows the physical process. Once the shipment is locked, the warehouse staff are unable to add additional orders to the shipment. The process of locking will schedule the next pickup and display a link to download the Excel document. I've been asked to support PDF as well, as the warehouse stations are moving away from using Excel. This article follows the development of the Bill of Lading ("BOL"). The interface is 100% Ajax driven with jQuery and Web API on the back end.

Steps to Implement BOL with EPPlus & Web API

Item #1: Create Excel XLSX template with Excel 2013

Here is the Excel template I created (layout, colors, formatting, formulas for totals, etc…).

[Screenshot: Excel BOL template]

I created Warehouse_BoL.xlsx and stored it in my project in a folder called /templates/report/, which is where I keep all of my report templates.

Item #2: Populate Template and Return as XLSX/PDF

Step #1: Install EPPlus via NuGet

Package Manager Console: PM> Install-Package EPPlus

Step #2: Populate Template

Issue #1 (dynamic rows):

As I started thinking about how I was going to support 300+ rows, I came up with a VERY simple solution. I created my template with rows 12-400 allotted and formatted for data. I would then load the file with data and delete the remaining BLANK rows. Since I wanted my document to look pretty when printed on a single page (as this should be 90% of the documents printed), I decided to always keep rows through row 35 in the Excel file regardless of the number of rows of data added. This keeps the single-page print pretty.

Issue #2 (formula refresh):

When I got the export working and moved on to exporting to PDF, I found my formulas to total cartons/weight were showing "0" in the report. I'm not sure if this is a bug, but I found notes in the documentation that you can call "Calculate()" on the cell to force the formula to refresh. Worked perfectly.
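Note: beyond per-cell Calculate(), EPPlus 4 also exposes Calculate() at the worksheet and workbook level, which saves touching each formula cell individually; a minimal sketch using the package and sheet objects from the code below:

// Recalculate every formula in the workbook before saving.
package.Workbook.Calculate();

// Or limit the recalculation to the BOL sheet.
sheet.Calculate();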

using System;
using System.Data;
using System.IO;
using System.Web;
using OfficeOpenXml;

public static MemoryStream GenerateBOL(int id)
{
    try
    {
        // Template File
        string templateDocument = 
            HttpContext.Current.Server.MapPath("~/templates/report/warehouse_bol.xlsx");

        // Results Output
        MemoryStream output = new MemoryStream();

        // Read Template
        using (FileStream templateDocumentStream = File.OpenRead(templateDocument))
        {
            // Create Excel EPPlus Package based on template stream
            using (ExcelPackage package = new ExcelPackage(templateDocumentStream))
            {
                // Grab the sheet with the template, sheet name is "BOL".
                ExcelWorksheet sheet = package.Workbook.Worksheets["BOL"];

                // SubSonic 2x - Data Access, load shipment header data.
                Shipment shipment = Shipment.FetchByID(id);

                // Insert BOL header data into template
                sheet.Cells[2, 3].Value = shipment.ScheduledCarrier + " PICK-UP";
                sheet.Cells[3, 3].Value = string.Format("{0} ( {1}-{2} )",
                    shipment.CustomerName,
                    shipment.CustomerCode,
                    shipment.DelToCode);
                sheet.Cells[6, 6].Value = shipment.DestinationAddress1;
                sheet.Cells[7, 6].Value = shipment.DestinationAddress2;
                sheet.Cells[8, 6].Value = shipment.DestinationAddress3;
                sheet.Cells[9, 6].Value = shipment.DestinationAddress4;

                // Start Row for Detail Rows
                int rowIndex = 12;

                // SubSonic 2x - Data Access, load shipment details
                using (IDataReader reader = SPs.RohmPortal_GetShipmentOrderStatistics(id).GetReader())
                {
                    while (reader.Read())
                    {
                        // Cell 1, Carton Count
                        sheet.Cells[rowIndex, 1].Value = Int32.Parse(reader["Cartons"].ToString());
                        // Cell 2, Order Number (outline around columns 2-7 makes it look like 1 column)
                        sheet.Cells[rowIndex, 2].Value = reader["DNNO"].ToString().Trim();
                        // Cell 8, Weight in LBS (convert KG to LBS, and rounding to whole number)
                        sheet.Cells[rowIndex, 8].Value =
                            Math.Round(decimal.Parse(reader["GrossWeight"].ToString()) * 2.2m);

                        // Increment Row Counter
                        rowIndex++;
                    }
                }

                // Force formula to compute, or formulas are not updated when you convert XLSX / PDF.
                sheet.Cells[402, 1].Calculate();
                sheet.Cells[402, 8].Calculate();

                // Logic to keep up to row 35 when the data is short (pretty 1 page).
                if (rowIndex < 35)
                {
                    // Data is short: keep the blank rows through 35, delete rows 36-400 inclusive.
                    sheet.DeleteRow(36, 400 - 36 + 1);
                }
                else if (rowIndex < 400)
                {
                    // Delete the unused rows from the first blank row through row 400 inclusive.
                    sheet.DeleteRow(rowIndex, 400 - rowIndex + 1);
                }

                package.SaveAs(output);
            }
            return output;
        }
    }
    catch (Exception e)
    {
        // Log Exception
        return null;
    }
}

Next, I needed to create a Web API route to generate the XLSX file. Seeing how I planned to return at least 2 different formats (XLSX/PDF), I created a generic wrapper and a method to return the data as a stream.

Generic – Return Stream as File Method

using System.IO;
using System.Net;
using System.Net.Http;
using System.Net.Http.Headers;

public static HttpResponseMessage ReturnStreamAsFile(MemoryStream stream, string filename)
{
    // Set HTTP Status Code
    HttpResponseMessage result = new HttpResponseMessage(HttpStatusCode.OK);

    // Reset Stream Position
    stream.Position = 0;
    result.Content = new StreamContent(stream);

    // Generic Content Header
    result.Content.Headers.ContentType = new MediaTypeHeaderValue("application/octet-stream");
    result.Content.Headers.ContentDisposition = new ContentDispositionHeaderValue("attachment");

    //Set Filename sent to client
    result.Content.Headers.ContentDisposition.FileName = filename;

    return result;
}
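Since the wrapper is generic, everything ships as application/octet-stream. If you want the browser to see the real content type, a small helper could map the requested format to a specific MIME type. A sketch (the helper name is mine, not part of the original code):

public static string GetMimeType(string format)
{
    switch (format.ToUpperInvariant())
    {
        case "XLSX":
            // Official MIME type for Open XML spreadsheets.
            return "application/vnd.openxmlformats-officedocument.spreadsheetml.sheet";
        case "PDF":
            return "application/pdf";
        default:
            return "application/octet-stream";
    }
}

You would then pass the result to new MediaTypeHeaderValue(...) in place of the hard-coded octet-stream value.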

Generic – API Wrapper to Generate XLSX/PDF.

[Route("api/warehouse/schedule/bol/{id}/{format}")]
[HttpGet]
public HttpResponseMessage GetBillOfLading(int id, string format)
{
    MemoryStream documentStream = GenerateBOL(id);

    // Make Sure Document is Loaded
    if (documentStream != null && documentStream.Length > 0)
    {
        // Generate dynamic name for BOL document "BOL-XXXXXX.PDF or .XLSX"
        string documentName = string.Format("BOL-{0}.{1}", id, format);

        // Based on format requested in URI (PDF/XLSX), call the associated method.
        switch (format)
        {
            case "PDF":
                MemoryStream pdfMemoryStream = ConvertXLSXtoPDF(documentStream);
                return FileAsAttachment(pdfMemoryStream, documentName);
                break;

            case "XLSX":
                return FileAsAttachment(documentStream, documentName);
                break;
        }
    }

    // If something fails or somebody calls invalid URI, throw error.
    return new HttpResponseMessage(HttpStatusCode.NotFound);
}

Step #3: Install FreeSpire.XLS

Package Manager Console: PM> Install-Package FreeSpire.XLS

The free version is limited to 5 sheets per workbook and 150 rows per sheet. This limitation is enforced when reading or writing XLS or PDF files. Free Spire.XLS v7.8 has no limitation when loading and saving the .xlsx file format. When converting Excel files to PDF, you only get the first 3 pages of the PDF file.

Step #4: Create method to convert XLSX to PDF

Yep, Free Spire.XLS has a limit of 3 pages. Yuck!!! I was short on time and I'd love to find another simple open source solution, but E-ICEBLUE's Spire.XLS code really made the XLSX to PDF process very simple and I had working code in 5 minutes. The DLL is ~12MB, which is 10x larger than my entire web application, but it works!

If you know of a plug-in solution, or one that requires little tweaking, to convert XLSX documents into PDF, please let me know!!!

using System.IO;
using Spire.Xls;
using Spire.Pdf;
using Spire.Xls.Converter;

// Convert XLSX to PDF MemoryStream
public static MemoryStream ConvertXLSXtoPDF(MemoryStream stream)
{
    // Result Stream
    MemoryStream pdfMemoryStream = new MemoryStream();

    // Spire.XLS to open XLSX workbook stream created by EPPlus
    Workbook workbook = new Workbook();
    workbook.LoadFromStream(stream, ExcelVersion.Version2013);

    // Spire.PDF to convert XLSX to PDF, I read it has limited functionality (total pages, rows, etc...).
    PdfConverter pdfConverter = new PdfConverter(workbook);
    PdfConverterSettings settings = new PdfConverterSettings();
    PdfDocument pdfDocument = pdfConverter.Convert(settings);
    pdfDocument.SaveToStream(pdfMemoryStream);
            
    return pdfMemoryStream;
}

I found references saying to use "Workbook.SaveToFile()" to generate the PDF file, but this was resulting in errors (blank PDF files). Using PdfConverter() works, but you should expect a compile warning that the method is obsolete.
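If the warning bothers you, a standard C# pragma can silence it around the call (plain compiler behavior, nothing Spire-specific):

// CS0618 is the "member is obsolete" warning.
#pragma warning disable 0618
PdfConverter pdfConverter = new PdfConverter(workbook);
#pragma warning restore 0618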

THE RESULTS

XLSX Excel Document

/api/warehouse/schedule/bol/14240002/XLSX

[Screenshot: generated XLSX BOL]

PDF Document

/api/warehouse/schedule/bol/14240002/PDF

[Screenshot: generated PDF BOL]

Adding Web API 2.1 to Web Form App

Oct 20

Posted by Zach in C# Development, Web Development | No Comments

Like many others, I have a huge pile of legacy apps that I maintain that are running very smoothly with no justification for a major upgrade/update: lots of small code tweaks and simple enhancements, but nothing major. I had a new request the other day to add a basic warehouse tracking system that people could add packages to by scanning bar codes. It's a very simple process and something I could easily do with jQuery, the jQuery tmpl plugin, and a back-end service. Before running off to build a WCF service, I decided to take a peek at Web API (version 2.1, included with Visual Studio 2013) to learn something new.

The MSDN tutorials were pretty straightforward but missing a few steps that I didn't figure out until I messed things up a few times. Nothing terrible, and with Git you can just toss the changes and start over again. ;)

Here is the play-by-play to add Web API 2.1 to a Web Forms application

  1. In the root of your Web Forms application, create a folder called “api”.

    ** The name and location of this folder are the best practice for Web API controllers.

     

  2. Right click on your new “api” folder and choose Add -> New Item from the context menu.

    On the “Add New Item” menu, go to Web and choose Web API Controller Class (v2.1) and name your controller [Something]Controller.cs.

    ** The standard naming convention for Web API is to add the suffix “Controller” to your Web API class name. The word “Controller” will never be used in API calls, but it's important for the convention-based routing. In the example below, I created a Web API for my Inventory functions and named it “InventoryController.cs”.

     

    using System.Collections.Generic;
    using System.Web.Http;
    
    namespace ZachHunter.DemoProject.api
    {
        public class InventoryController : ApiController
        {
            // GET api/<controller>
            public IEnumerable<string> Get()
            {
                return new string[] { "value1", "value2" };
            }
    
            // GET api/<controller>/5
            public string Get(int id)
            {
                return "value";
            }
    
            // POST api/<controller>
            public void Post([FromBody]string value)
            {
            }
    
            // PUT api/<controller>/5
            public void Put(int id, [FromBody]string value)
            {
            }
    
            // DELETE api/<controller>/5
            public void Delete(int id)
            {
            }
        }
    }
    

     

  3. Next, go to your “App_Start” folder in the root of your application (if you don't have this folder, create it) and right click on the folder to add a new class named “WebApiConfig.cs”.

    ** This is the recommended location to store your Web API routing configuration.

    Add the following two routing rules to your WebApiConfig.cs file to enable convention-based and attribute-based routing:

     

    using System;
    using System.Collections.Generic;
    using System.Linq;
    using System.Web;
    using System.Web.Http;
    
    namespace ZachHunter.DemoProject
    {
        public static class WebApiConfig
        {
            public static void Register(HttpConfiguration config)
            {
                // Attribute Routing
                // Allows for custom routing to be set at the method, example [Route("api/warehouse/scheduling/all")]
                config.MapHttpAttributeRoutes();
    
                // Convention-Based Routing
                // Allows for standard controller based routing, example api/warehouse, api/warehouse/1
                config.Routes.MapHttpRoute(
                    name: "DefaultApi",
                    routeTemplate: "api/{controller}/{id}",
                    defaults: new { id = RouteParameter.Optional }
                );
            }
        }
    }
    

     

  4. Finally, open your Global.asax.cs and modify your Application_Start to load your Web API routing configuration.

     

            protected void Application_Start(Object sender, EventArgs e)
            {
                // These settings change the default JSON.NET formatting to indented; there are many other options to choose from.
                HttpConfiguration config = GlobalConfiguration.Configuration;
                config.Formatters.JsonFormatter.SerializerSettings.Formatting = Newtonsoft.Json.Formatting.Indented;
    
                // WebApi Routing, loads up your Web API routes stored in App_Start/WebApiConfig.cs
                GlobalConfiguration.Configure(WebApiConfig.Register);
            }
    

Automatic changes made when you added Web API 2.1 Class

References:

Newtonsoft.Json.dll
System.Net.Http.dll
System.Web.Http.WebHost.dll

Web.Config

    <handlers>
      <remove name="ExtensionlessUrlHandler-Integrated-4.0" />
      <remove name="OPTIONSVerbHandler" />
      <remove name="TRACEVerbHandler" />
      <add name="ExtensionlessUrlHandler-Integrated-4.0" path="*." verb="*" type="System.Web.Handlers.TransferRequestHandler" preCondition="integratedMode,runtimeVersionv4.0" />
    </handlers>
  ...
  <runtime>
    <assemblyBinding xmlns="urn:schemas-microsoft-com:asm.v1">
      <dependentAssembly>
        <assemblyIdentity name="Newtonsoft.Json" publicKeyToken="30ad4fe6b2a6aeed" culture="neutral" />
        <bindingRedirect oldVersion="0.0.0.0-7.0.0.0" newVersion="7.0.0.0" />
      </dependentAssembly>
    </assemblyBinding>
  </runtime>

When you're all done, you can browse to the API URL by running your project and typing in the convention-based syntax listed in the comments above the auto-generated methods (e.g. api/inventory). If you use IE, it will default to loading JSON, or you can use Firefox and it will default to XML.
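If you want JSON in every browser regardless of its Accept header, one common tweak (my addition, not part of the steps above) is to map text/html to the JSON formatter inside WebApiConfig.Register():

    // Browsers send Accept: text/html; mapping it to the JSON formatter
    // makes them receive JSON instead of XML.
    config.Formatters.JsonFormatter.SupportedMediaTypes.Add(
        new System.Net.Http.Headers.MediaTypeHeaderValue("text/html"));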

Here is a basic call in jQuery using the Web API sample above:

$(document).ready(function () {

    $.ajax({
        type: 'GET',
        url: '/api/inventory',
        contentType: 'application/json; charset=utf-8',
        dataType: 'json',
        data: '',
        success: function (results) {
            // Do Something with the Results
        },
        error: function (xhr, textStatus, error) {
            // In case of ERROR, do something else
        }
    });
});

Storage Processor Failover with an EMC Celerra NX4 / Clarion AX4

May 5

Posted by Zach in Information Technology | No Comments

Well, yesterday was the first time in many years that I had to spend an entire day in the office on a weekend due to a major hardware outage. This is never a good thing, and it really *feels* bad when your team fails to keep core operations running smoothly. As an IT Manager, I strive for 0% unscheduled downtime and 95% up-time overall (Yep, that 5% might sound high, but due to the volume of updates, backups, patches, hardware migrations, performance-improvement tasks, etc… things need to go down for scheduled work).

** I think Benjamin Franklin's quote about an apple a day can be adapted to "An update a day keeps another hacker away!".

As for my data unavailable (DU) outage, the core symptom was our SQL Server 2008 host, connected via fiber HBA to the SAN, losing all of its LUNs. When my DBA saw this, she was on the phone with me in minutes yelling that all of her tables had disappeared from every database in the system. She said that the system went down at 1:45 AM, all of our backups failed, and all of our databases were empty except for the system tables. I immediately hopped online and realized this symptom was caused by SQL Server showing unmounted databases (why not say offline or unavailable? I'll leave that question to Microsoft to explain…), but it definitely left her in a panic seeing all the data missing from every database on the server. Since I'm familiar with the issue (I think I've caused or fixed a few thousand issues over my past 10 years of managing SQL Server), I packed the car and headed into the office.

Thinking we had DR built into our very expensive EMC SAN, I was going crazy trying to figure out why we were not online. I put in a severity 1 request with EMC telling them that we had DU and to get somebody on the phone ASAP. I grabbed an SP collect and sent it off to EMC, and tech support person 1/7 found a bad fiber port on storage processor A (SP A – SFP FE 0) and said this was my issue. Since I know we have 2 storage processors, I asked him about moving the resource to B, but he said he only knew the Celerra part of our system and declined to help, saying I would need to work with the Clarion tech who would be dispatched to replace SFP FE 0 within 4 hours and be on-site with parts within 24 hours. I asked multiple times if he could directly transfer me to somebody on the Clarion team to get my system online and data available, but he kept saying NO. I finally gave up on the call, so I could log some NASTY messages in the call ticket to get somebody moving ASAP (this works very well with EMC; if you can't get the online person to solve your problem, post all your issues, comments, and concerns in the ticketing system).

Into the server room I went, one of the few times I wished the room was a bit warmer than 67 degrees… I manually moved the host fiber cable connected to the HBA over to storage processor B and saw my LUNs show up on the host in seconds, but they were all showing as unrecognized and prompting for initialization. Knowing there was something wrong, I spent about an hour looking in Navisphere, Unisphere, and the CLI (both GUIs were new to me because 38 days prior EMC did a major update to our system). After getting halfway there, I decided to be brave and re-seat storage processor A. This ended up restoring data, but it only lasted ~45 minutes before going offline again. At this point, the Clarion tech called and said he'd be on-site in 1 hour, at 2 PM, with parts. Upon his arrival with a new SFP FE 0 port, I asked him to first help me get online with storage processor B so we could get data available while working on storage processor A. He also didn't know the new interface, and we proceeded to go through 5 Clarion tech support people, 2 shift changes, and 4 more hours before ending up with somebody who knew what to do. As I had joked to the on-site tech a few times about it probably being 1 button or switch to set, the online tech showed us a single magic button to fix the problem! ARGH, 15 hours of downtime and 7 tech support people from EMC kept my data unavailable. The issue with the fail-over to storage processor B was a permission problem, with the virtual disk being assigned only to storage processor A, so we moved it from A -> B and I was finally online in seconds. Since this guy seemed to KNOW the system pretty well, I asked why automatic DR fail-over did not work and why manually assigning a virtual disk to a node was needed. He realized the question meant there was something else wrong, since he too was confused why this whole permission change was needed in the first place. Another hour of research from the online tech, and he determined I was missing EMC PowerPath software on all my hosts connected to the SAN via iSCSI and Fiber.

At this point, totally confused, I had to get a 30-minute lesson on PowerPath: how it works and where it was supposed to be installed. Finally, after hashing out the whole situation to determine the root cause with the original designer, the past employee who led the project for me, my current team, Celerra support, and the on-site Clarion support, it was determined that the automatic fail-over features of the SAN were never set up correctly and we had never had an outage to trigger this scenario. This was a *shock* to everybody involved, but it explains a few mysterious events that have happened over the past 6 months (cluster resources becoming unavailable, complete SAN failure during an upgrade, high IO on individual storage processors, etc…) that EMC support could not explain.

So, the end result and lesson learned is to always install PowerPath on all non-clustered hosts that connect to your EMC SAN if you want automatic fail-over. If you do have this setup and have a storage processor fail, follow these steps:

1. Connect to Navisphere
2. Click “Virtual Disks”
3. Choose the virtual disk to edit and click its name.
4. Click “Virtual Disk Access Details”

[Screenshot: Navisphere Virtual Disk Access Details]

5. Click “Change Ports for Access”
** On this screen, it will show where the virtual disk is currently assigned (e.g. SP A or SP B).

That's it! Hopefully this saves somebody 15 hours of downtime, confused tech support people, and/or a mis-configured DR setup!

Save IMAP Attachments with AE.Net.Mail in C#

Jun 6

Posted by Zach in C# Development | 5 Comments

One of the people on my team ran into a problem yesterday, where a customer could only send data via email. The customer does not support formal EDI or have IT staffers who can support building or using something more automated. After talking about how the processing needed to be performed 7 days a week, we knew we had to build something automated. I went to Google and StackOverflow to search for a free IMAP solution (I so wish IMAP/POP/SFTP were supported in the core framework) and found AE.Net.Mail. The product doesn't have any documentation, but after looking at 2 posts about issues other people were having with the client, I was able to figure out everything I needed to process my emails and save the attachments out to disk. The project is even supported via NuGet, which is a big plus. All of the code below is based on using AE.Net.Mail v1.6.0.0 with .NET 4.0.

Here is the code I built to process email attachments from 2 different customers:

        // A minimal wrapper so the snippet compiles on its own (the method name is mine);
        // requires: using System; using System.IO; using AE.Net.Mail;
        static void ProcessCustomerEmails()
        {
            // Connect to Exchange 2007 IMAP server (locally), without SSL.
            using (ImapClient ic = new ImapClient("<server address>", "<user account>", "<user password>", ImapClient.AuthMethods.Login, 143, false))
            {
                // Open a mailbox, case-insensitive
                ic.SelectMailbox("INBOX");

                // Get messages based on the flag "Undeleted()".
                Lazy<MailMessage>[] messages = ic.SearchMessages(SearchCondition.Undeleted(), false);

                // Process each message
                foreach (Lazy<MailMessage> message in messages)
                {
                    MailMessage m = message.Value;

                    string sender = m.From.Address;
                    string fileName = string.Empty;

                    // Rules by Sender
                    switch (sender)
                    {
                        case "zachary@customerA.com":
                            fileName = ExtractString(m.Subject, "(", ")");

                            foreach (Attachment attachment in m.Attachments)
                            {
                                attachment.Save(@"C:\Demo\" + fileName + Path.GetExtension(attachment.Filename));
                            }
                            break;

                        case "hunter@customerB.com":

                            foreach (Attachment attachment in m.Attachments)
                            {
                                string filePrefix = attachment.Filename.Substring(0, attachment.Filename.IndexOf('_'));

                                switch (filePrefix)
                                {
                                    case "DAILY":
                                        fileName = "CustomerB_Inventory";
                                        break;
                                    case "PULL":
                                        fileName = "CustomerB_Pulls";
                                        break;
                                    case "RECEIPT":
                                        fileName = "CustomerB_Receipts";
                                        break;
                                }

                                attachment.Save(@"C:\Demo\" + fileName + Path.GetExtension(attachment.Filename));
                            }
                            break;
                    }

                    // Delete Message (so it drops out of criteria and won't be double processed)
                    ic.DeleteMessage(m);
                }
            }
        }

        /// <summary>
        /// Helper method to pull out a string between a start and stop string.
        /// Example:
        ///    string story = "The boy and the cat ran to the store.";
        ///    ExtractString(story, "cat", "to"); // returns " ran " (tags excluded, surrounding spaces kept)
        /// </summary>
        /// <param name="stringToParse">String to search</param>
        /// <param name="startTag">Starting String Pattern</param>
        /// <param name="endTag">Ending String Pattern</param>
        /// <returns>String found between the Starting and Ending Pattern.</returns>
        static string ExtractString(string stringToParse, string startTag, string endTag)
        {
            int startIndex = stringToParse.IndexOf(startTag) + startTag.Length;
            int endIndex = stringToParse.IndexOf(endTag, startIndex);
            return stringToParse.Substring(startIndex, endIndex - startIndex);
        }

Most of the processing is done in the switch statement, since I'm going to apply different parsing rules by customer. This is far from an elegant and maintainable long-term solution, but it gets the problem solved quickly. Here is the rule summary by customer.

case “zachary@customerA.com”:

Every day I'm going to get 3 emails from this customer; the subject for each email will be:

“Daily Reports for Customer A (CustomerA_Inventory)”
“Daily Reports for Customer A (CustomerA_Receipts)”
“Daily Reports for Customer A (CustomerA_Pulls)”

All three emails have a CSV attachment called CustomerA.csv. The file in each email contains different data, and I need all 3 written to disk for processing in my EDI solution. My solution was to parse the subject using the helper method and rename each attachment with the name in the (parenthesis). The result of processing the emails with the code above is 3 files written to disk with names that correctly identify their data content:

C:\Demo\CustomerA_Inventory.CSV
C:\Demo\CustomerA_Receipts.CSV
C:\Demo\CustomerA_Pulls.CSV
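To make the subject parsing concrete, here is the helper call in isolation, using the first subject above:

// Pull the text between "(" and ")" to use as the new file name.
string subject = "Daily Reports for Customer A (CustomerA_Inventory)";
string fileName = ExtractString(subject, "(", ")"); // "CustomerA_Inventory"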

case “hunter@customerB.com”:

This customer is similar to the first, but I'm only going to get 1 email with 3 attachments that each have a date appended to the file name. I need to save each file to disk without a date stamp, so my EDI processor can be set up to pull in the same "file name" every day. My solution was to parse the attachment file name and rename the file based on the prefix of the attachment name. The results of processing the email are below.

Source Email:

Attachment 1: DAILY_INVENTORY_06_05_2012.XLS
Attachment 2: DAILY_RECEIPT_06_05_2012.XLS
Attachment 3: DAILY_PULL_06_05_2012.XLS

Results of Processing

C:\Demo\CustomerB_Inventory.XLS
C:\Demo\CustomerB_Pulls.XLS
C:\Demo\CustomerB_Receipts.XLS

Note: A while back I looked to see if there was a turn-key app/solution that could pull in emails for processing (basically run as a service on a server, and provide a nice GUI to allow non-technical people to set up rules for email processing), but I couldn't find anything that worked with our setup. This would ideally be a better solution, since it would allow users to add new rules for new emails at a moment's notice, versus my hard-coded logic used above.

ASP.NET 4.0 WCF RESTful Services – Part 2

May 18

Posted by Zach in C# Development, Web Development | No Comments

In the first part, we learned how to set up an ASP.NET 4.0 WCF (file-extension-less) service and the basics of using jQuery's $.getJSON() method to call a service implementing the GET method. Now, we are going to review all the HTTP methods commonly found in a
