Exchange Reports Project Overview

During this summer, between semesters, I was fortunate enough to get a software development job at a local company just a 10-minute walk from my door. The project was to produce an ‘Exchange Reports’ system that would provide email messaging statistics exactly to the customer’s specification. The system would be automated so that, after reports were designed, they would be generated programmatically by a service and emailed to any recipients that had been set up to receive each report. The solution comprised 3 distinct programs that would need to be developed, along with configuration tools to set up the non-GUI processes in the solution (namely the services).

I have produced the following diagram to demonstrate the solution’s processes (click to enlarge):

The design was in place when I started and an existing code-base was also present, but the vast majority of the functionality still remained to be added. It was my first time working professionally as a software engineer, and therefore also my first time getting to grips with existing code written by developers no longer around, not to mention understanding the solution’s technical proposal well enough to execute exactly what the customer and my employer wanted. I think working in IT professionally for many years certainly helped me settle into a comfortable stride after the initial information overload of solely taking on what was a surprisingly large but beneficial technical project compared to what I had envisioned. Being thrown in at the deep end is probably the fastest way to improve, and I feel that, above all, I have taken a lot from this experience which will prove valuable in the future. I’m very pleased with the outcome and successfully got all the core functionality finished in the time frame that was assigned. I would wholeheartedly encourage students thinking of getting professional experience to go for it, ideally with an established company from which you can learn a great deal. Having experienced developers around to run things by is a great way to improve.

Now onto the technical details. The project was coded in C# and used WinForms, initially for testing processes and later for the configuration programs. I used a set of third-party .NET development tools from ‘DevExpress’ that proved to be fantastic: a massive boon to anyone wanting to create quick, great-looking UIs with reporting functionality. SQL Server provided the relational database functionality, an experience I found very positive; I very much enjoyed the power of SQL when it came to manipulating data via .NET data tables, data adapters, table joins or just simple direct commands.

Using the diagram as a reference, I’ll briefly go through each process in the solution for A) those interested in such things and B) future reference for myself while it’s still fresh in my mind, because I’ll likely forget much of how the system works after a few months of 3D graphics programming and Uni coursework :P.

Exchange Message Logs: 

In Exchange 2010, Message Tracking logs can be enabled quite simply and provide a wealth of information that can be used for analysis and reporting if so desired. They come in the form of comma-delimited log files that can be opened with a simple text editor. They have been around for many years, and in the past, during IT support work, I found myself looking at them from time to time to diagnose various issues. This time I’d be using them as the source of data for a whole reporting system. The customer was a large international company and, to give an example, just one of their Exchange systems was producing 40 MB of these messaging logs each day. With these being effectively just text files, that’s an awful lot of email data to deal with.
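To give a feel for what reading these logs involves, here is a minimal sketch of pulling named fields out of one tracking-log line. The field names in the sample header are a small illustrative subset of the real format, and the naive comma split assumes no embedded commas (real logs can have quoted commas in subject lines, so a proper CSV reader would be safer):

```csharp
using System;
using System.Collections.Generic;
using System.Linq;

class TrackingLogParser
{
    // Map the "#Fields:" header columns onto one log line's values by name,
    // so we don't depend on hard-coded column positions.
    public static Dictionary<string, string> ParseLine(string headerLine, string logLine)
    {
        var fields = headerLine.Substring("#Fields: ".Length).Split(',');
        var values = logLine.Split(',');   // naive split: assumes no commas inside fields
        return fields.Zip(values, (f, v) => new { f, v })
                     .ToDictionary(p => p.f, p => p.v);
    }

    static void Main()
    {
        // Illustrative subset of the tracking-log columns, not the full format.
        string header = "#Fields: date-time,event-id,sender-address,recipient-address";
        string line = "2011-08-01T09:15:00.000Z,RECEIVE,alice@example.com,bob@example.com";
        var msg = ParseLine(header, line);
        Console.WriteLine($"{msg["event-id"]}: {msg["sender-address"]} -> {msg["recipient-address"]}");
    }
}
```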

Processing Service: 

The first of the 3 core components of the solution, the Processing Service, as the name suggests, is an installable Windows Service that resides on a server with access to the Exchange messaging log files. The service is coded to run daily at a specified time, and its purpose comprises 5 stages:

1. Connect to the Exchange server and retrieve a list of users from the Global Address List (GAL). This is done using a third-party Outlook library called ‘Redemption’ that enables this information to be extracted; the list is then checked for any changes to existing users and/or any new users. The users are placed in a table on the SQL database server and will later be used to provide full name and department information for each email message we store.

2. Next, each Exchange message log is individually parsed, and useful messaging information is extracted and stored in various tables on the database server. Parsed log file names are tracked in the database to prevent reading logs more than once.

3. Any message forwards or replies are identified and tallied up.

4. A separate Summary table on the database is populated with data processed from the previously mentioned message tables. This table is what the reports will look at to generate their data. Various calculations are made; one example is the time difference between an email being received and then forwarded or replied to, used to estimate response times. A whole plethora of fields is populated in this table, far more than could comfortably fit on a single report. Because of this large amount of potentially desirable data, we later allow the user to select which fields they want from the Summary table in the ‘Report Manager’ if they wish to create a custom report. Alternatively, and more typically, they use predefined database ‘Views’ that have been created for them based on the customer’s specification, which allow them to access only the data they need. Database Views are a really neat feature.

5. The database’s messaging tables are scoured for records older than a threshold period, which are then deleted. This maintenance is essential to prevent table sizes from growing too large. The associated Summary data that has been generated is still kept, however, and I added functionality to archive it by serializing the data off and deleting it from the database if required.
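To illustrate one of the Summary-table calculations from stage 4, here is a sketch of estimating response times by matching replies back to the messages they answered. The `Msg` record shape and its field names are invented for illustration; the real schema and matching logic differ:

```csharp
using System;
using System.Collections.Generic;
using System.Linq;

// Hypothetical row shape: one record per parsed message, where InReplyTo
// links a reply back to the message-id it answered.
record Msg(string MessageId, string? InReplyTo, DateTime Timestamp);

class ResponseTimes
{
    // Estimate response time as the gap between a message arriving and a
    // reply that references it, then average the gaps -- one example of
    // the calculations fed into the Summary table.
    public static TimeSpan? AverageResponse(IEnumerable<Msg> messages)
    {
        var originals = messages.Where(m => m.InReplyTo == null)
                                .ToDictionary(m => m.MessageId);
        var gaps = messages.Where(m => m.InReplyTo != null && originals.ContainsKey(m.InReplyTo!))
                           .Select(m => m.Timestamp - originals[m.InReplyTo!].Timestamp)
                           .ToList();
        if (gaps.Count == 0) return null;
        return TimeSpan.FromTicks((long)gaps.Average(g => g.Ticks));
    }

    static void Main()
    {
        var msgs = new List<Msg>
        {
            new("a1", null, new DateTime(2011, 8, 1, 9, 0, 0)),
            new("a2", "a1", new DateTime(2011, 8, 1, 9, 30, 0)),  // reply 30 minutes later
        };
        Console.WriteLine(AverageResponse(msgs));  // 00:30:00
    }
}
```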

Report Manager:

Initially we had thought to utilise DevExpress’s ‘Data Grid’ controls in a custom Form application, but we decided that the appearance of the reports generated this way was not satisfactory. This turned out to be a good design decision, since we later discovered DevExpress has remarkable reporting controls with very powerful design and presentation features that completely overshadowed those of the Data Grids. After migrating some code from the old ‘Report Manager’ program and spending a day or two researching and familiarising myself with the DevExpress API, I had a great-looking new application that the customer will be using to design and manage the reports.

Report Manager program

The Report Manager allows you to design every aspect of a report through an intuitive drag-and-drop interface. Images and various graphics can also be added to beautify the design, though that wasn’t something I did, nor had the time to attempt! The data objects can be arranged as desired, and the report’s ‘data source’ information is saved along with its design layout via a neat serialization function inherent to the ‘XtraReport’ object in the DevExpress library; the result is stored in a reports table on the database server for later loading or building. You can also generate the report on the fly and export it to various formats such as PDF, or simply print it. Another neat built-in feature is the ability to issue SQL query commands via a user-friendly filter in the report designer, aimed at non-developers, which is stored along with the layout. The user designing the report thus has absolute control over the data: e.g. a quick filter on Department being “Customer Services” would return only that related message data, without me needing to code some method to do this manually, as was the case when using the Data Grids.

In the top left you’ll see specific icons that provide the necessary plumbing for the database server. ‘Save’, ‘Save As’ and ‘Load’ respectively write the serialized report layout to the database, create a new record with said layout, or load an existing saved report from the database into the designer. Loading is achieved by retrieving the list of report records stored in the reports table and placing it into a Data Grid control on a form, where you can select a report to load or delete. The ‘Recipients’ button brings up the interface for adding users who want to receive the report by email; this retrieves the user data imported by the Processing Service and populates a control that allows you to search through and select a user, or manually type a name and email address to add a custom recipient. Additionally, upon adding a recipient to the report you must select whether they wish to receive it on a daily, weekly or monthly basis. This information is stored in the aptly named recipient table, which relates to the reports via a reportID field.
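The reports-to-recipients relation via reportID can be pictured as a join, grouped by delivery frequency. This in-memory LINQ sketch is only an analogue of what the real system does against SQL Server tables; the record shapes and names are illustrative:

```csharp
using System;
using System.Collections.Generic;
using System.Linq;

// Illustrative shapes for the reports and recipient tables, related on ReportId.
record Report(int ReportId, string Name);
record Recipient(int ReportId, string Email, string Frequency); // "Daily" | "Weekly" | "Monthly"

class Dispatch
{
    // Join reports to their recipients and bucket the pairs by how often
    // each recipient wants the report delivered.
    public static ILookup<string, (string report, string email)> GroupByFrequency(
        IEnumerable<Report> reports, IEnumerable<Recipient> recipients)
    {
        return reports.Join(recipients,
                            r => r.ReportId, rc => rc.ReportId,
                            (r, rc) => (rc.Frequency, r.Name, rc.Email))
                      .ToLookup(x => x.Frequency, x => (x.Name, x.Email));
    }

    static void Main()
    {
        var reports = new[] { new Report(1, "Message Volume") };
        var recips = new[] { new Recipient(1, "ops@example.com", "Daily") };
        foreach (var (report, email) in GroupByFrequency(reports, recips)["Daily"])
            Console.WriteLine($"{report} -> {email}");
    }
}
```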

Report Service:

Nearly there (if you’ve made it this far, well done): the last piece in the solution is another Windows Service called the ‘Report Service’. This program sits and waits to run as per a schedule that can be set by a configuration app that I’ll mention shortly. Like the Processing Service, as part of its logic it needs to check whether it’s the right time of day to execute, so the service polls itself every few minutes to see if this is the case. Upon running, it looks to see if it’s the right day for daily reports, day of the week for weekly reports, or day of the month for the (you guessed it) monthly reports. If it is, it grabs the ‘joined’ data from the reports and recipient tables, builds each report, and fires them out as PDF email attachments to the associated recipients. It makes a final note of the last time it ran to prevent it repeatedly running on the same valid day.
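The schedule-eligibility check described above can be sketched roughly as follows. The frequency values, parameter names and last-run guard are assumptions for illustration, not the production logic:

```csharp
using System;

// Assumed delivery frequencies, matching the three recipient options.
enum Frequency { Daily, Weekly, Monthly }

class Scheduler
{
    // The service polls every few minutes; this decides whether a schedule
    // is due today, and the last-run date stops a report going out twice
    // on the same valid day.
    public static bool IsDue(Frequency freq, DateTime now, DateTime? lastRun,
                             DayOfWeek weeklyDay = DayOfWeek.Monday, int monthlyDay = 1)
    {
        if (lastRun.HasValue && lastRun.Value.Date == now.Date)
            return false;  // already sent today
        return freq switch
        {
            Frequency.Daily   => true,
            Frequency.Weekly  => now.DayOfWeek == weeklyDay,
            Frequency.Monthly => now.Day == monthlyDay,
            _ => false,
        };
    }

    static void Main()
    {
        var monday = new DateTime(2011, 8, 1);  // a Monday, and the 1st of the month
        Console.WriteLine(IsDue(Frequency.Weekly, monday, null));           // True
        Console.WriteLine(IsDue(Frequency.Daily, monday, lastRun: monday)); // False: already ran today
    }
}
```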

Configuration Tools:

Two configuration apps were made: one for the Processing Service and one for the Report Service. The two services have no interfaces since they run silently in the background, so I provided an XML settings file, managed through the two apps, to store a variety of important data such as SQL connection strings and server authentication details (encrypted). The apps also expose certain manual debugging options that may need to be executed, as well as providing an interface to set both services’ run times and the report delivery schedule.
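A settings file like that might be read and written along these lines. The element names and file shape here are invented for illustration, and unlike the real file this sketch stores the connection string in plain text rather than encrypting credentials:

```csharp
using System;
using System.IO;
using System.Xml.Linq;

class Settings
{
    // Write the service settings out as a small XML document.
    public static void Save(string path, string connectionString, TimeSpan runTime)
    {
        new XDocument(
            new XElement("ReportServiceSettings",
                new XElement("ConnectionString", connectionString),
                new XElement("RunTime", runTime.ToString()))
        ).Save(path);
    }

    // Read the same settings back for the service to consume at startup.
    public static (string conn, TimeSpan runTime) Load(string path)
    {
        var root = XDocument.Load(path).Root!;
        return (root.Element("ConnectionString")!.Value,
                TimeSpan.Parse(root.Element("RunTime")!.Value));
    }

    static void Main()
    {
        string path = Path.Combine(Path.GetTempPath(), "reportservice-settings.xml");
        Save(path, "Server=sql01;Database=ExchangeReports;Integrated Security=true", new TimeSpan(2, 0, 0));
        var (conn, runTime) = Load(path);
        Console.WriteLine($"Run at {runTime} using: {conn}");
    }
}
```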

Screens below (click to enlarge):

So that’s the solution, start to finish. Depending on time, I’m told it could possibly be turned into a product at some point, which would be great since other customers could potentially benefit from it too.

The great thing about a creative industry like programming, whether business or games, is that you’re ultimately creating a product for someone to use. It’s nice to know people somewhere will be getting use and function out of something you have made, which is just one reason why I’ve thoroughly enjoyed working on the project. I’ve learned a lot from my colleagues while working on it and hope to work with them again. You also get a taste of real-life professional development and how it differs in various ways from academic teachings, which, although very logical and sensible, are also idealistic (and rightly so). In the real world, when time is money and you need to turn around projects to sustain the ebb and flow of business, you have to do things in a realistic fashion, which might mean cutting some corners when it comes to programming or software design disciplines. I always try my best to write code as clean as possible, and this was no exception, but ultimately you need to get the project done first and foremost. It’s interesting how that can alter the way software development pans out, with niceties like extensive documentation, ‘Use Case’ diagrams and robust unit testing potentially falling by the wayside in favor of a speedier short-term turnaround. Larger businesses, I imagine, can afford to manage these extra processes to great effect, but for small teams of developers it’s not always realistic, which I can now understand.