Headstart Microservices Architecture using AWS Lambda in your favorite programming language (Node.js, Python or Java)

The way we build software has changed dramatically, from a Monolithic Architecture (MLA) built in a single programming language to a Microservices Architecture (MSA) where pieces of code are developed in whichever programming languages best fit the problem at hand. Microservices have numerous benefits: they encourage the Single Responsibility Principle (SRP) and allow quick deployments and an easy-to-change lifecycle without impacting the rest of the software.

AWS offers an easy way to implement microservices: RESTful APIs wrapped around AWS Lambda functions in a serverless backend that can auto-scale, with built-in security via IAM. AWS offers a 12-month free tier which enables you to get hands-on experience with AWS cloud services.

To demonstrate the power of AWS API Gateway & Lambda, I quickly dropped my Flickr photo search Node.js code into a Lambda function, wrapped it in AWS API Gateway and exposed it as a REST API endpoint.


var Client = require('node-rest-client').Client;
var _ = require("underscore");
var json2xls = require('json2xls');
var fs = require('fs');
//API Explorer : https://www.flickr.com/services/api/explore/flickr.photos.search
var flickr_api_endpoint = "https://api.flickr.com/services/rest/?method=flickr.photos.search&format=json&nojsoncallback=1";
var flickerPhotos = [];

exports.handler = function (request, context) {
    //1) Resolve input and prepare the REST call
    var client;
    try {
        var flickr_api_key = request.api_key;
        var flickr_search_tags = request.search_tags;
        var flickr_search_text = request.search_text;
        var flickr_search_limit = request.search_limit;
        var flickr_search_currentpage = request.search_currentpage;
        flickr_api_endpoint = flickr_api_endpoint + "&api_key=" + flickr_api_key + "&tags=" + flickr_search_tags + "&text=" + flickr_search_text + "&per_page=" + flickr_search_limit + "&page=" + flickr_search_currentpage;
        client = new Client();
        client.registerMethod("flickrSearch", flickr_api_endpoint, "GET");
    } catch (e) {
        context.fail("Something wrong in the input! Please check. [" + e.message + "]");
        return;
    }
    //2) Process the REST call response and send it back to the client
    var args = { headers: { "Content-Type": "application/json", "Cache-Control": "no-cache" } };
    client.methods.flickrSearch(args, function (res, rawdata) {
        try {
            console.log("Length = [" + res.photos.photo.length + "]");
            _.each(res.photos.photo, function (item, index, list) {
                flickerPhotos[index] = "http://farm" + item.farm + ".staticflickr.com/" + item.server + "/" + item.id + "_" + item.secret + "_b.jpg";
            });
            context.succeed(flickerPhotos);
        } catch (e) {
            context.fail("Unable to process the Flickr response. [" + e.message + "]");
        }
    });
};

AWS Lambda does not install third-party npm packages for you, so make sure to bundle all of your third-party node_modules in the deployment package. The AWS Lambda & API Gateway Getting Started guide will give you a good head start.


POST https://www.getpostman.com/collections/9f4773dd73a512b92e7a






Inbound Email processor using Node.JS Mail-Notifier

I have written a simple but full-fledged inbound email processor application using the mail-notifier npm, which internally uses imap & mailparser. The complete source code is in my Git repository for download or clone.


[Note: Image created using Lucid chart online diagramming tool]

In this application, I have taken care of the following business functionalities

  1. Continuously look for all incoming email messages
  2. When an email arrives, process it and form a modified email JSON object with the following properties
    • From
    • To
    • Subject
    • Plain Text Body
    • Message ID
    • Date of the email
    • Attachments
  3. Keep the configurations in a separate config file i.e. Email credentials, Mongo connection string etc.
  4. Save the email object into Mongo DB store
  5. Save the email object as .txt/.json file in “uploads” folder
  6. Iterate through attachments if any and store it in “uploads” folder [TODO: MongoDB GridFS]
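The core of the flow above can be sketched with mail-notifier in a few lines. This is a minimal sketch, not the repository's actual code: the host and credentials are illustrative, and the property names follow mailparser's parsed-mail output.

```javascript
// Pure helper: shape the parsed mail into the email JSON object described above.
// (Property names follow mailparser's output; attachments default to empty.)
function toEmailJson(mail) {
    return {
        from: mail.from,
        to: mail.to,
        subject: mail.subject,
        body: mail.text,            // plain-text body
        messageId: mail.messageId,
        date: mail.date,
        attachments: mail.attachments || []
    };
}

// Wire up the notifier only when credentials are provided, so the helper
// above stays usable on its own. Host and credentials are illustrative.
if (process.env.IMAP_USER) {
    var notifier = require('mail-notifier');
    var imap = {
        user: process.env.IMAP_USER,
        password: process.env.IMAP_PASSWORD,
        host: 'imap.mail.yahoo.com',
        port: 993,
        tls: true
    };
    notifier(imap)
        .on('mail', function (mail) {
            console.log(JSON.stringify(toEmailJson(mail)));
        })
        .start();
}
```

From here, saving to MongoDB and writing the .txt/.json copy are just two more consumers of the same JSON object.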


How to run the application?

  1. Clone the repository or download the zip and extract into a folder
  2. Set the email (IMAP) configuration in “config.js” – use your email credentials
  3. Set MongoDB configuration – this is optional; if you don’t need it then comment out following line in “mailprocessor.js”
    1. db.save(JSON.stringify(msg));
  4. Set mongo collection as “emails” – This is applicable only if point # 3 is valid.
  5. Run “npm update” – this will download all the dependent node_modules listed in the package.json
  6. Run “node mailprocessor.js”


Why I created this?

  • Business Motive – We have a home-grown helpdesk / ticketing system that lets our employees raise support tickets online; however, employees are more comfortable sending helpdesk tickets via email to helpdesk@ourdomain.com. Our process requires the helpdesk / IS team to make a ticket entry for every request received through email, so this application was built to automate ticket creation from email.
  • Passion – I wanted to try this out in Node.JS, as it is very simple thanks to the loads of ready-made npms available – but choosing the correct npm is the key.
  • No full-fledged sample – During the course of development I could not easily find complete references and samples, i.e. mail parsing, attachments etc. I collected a lot of references from Stack Overflow Q&As.

Scope for improvement (or TBD)

  1. Email filters and rules to process only those emails that match a specific condition
  2. Store the attachments in the MongoDB GridFS


  • JavaScript IDE – WebMatrix; it is free from Microsoft and simple to use, with the ability to integrate with a Git repo
  • MongoDB query analyser – MongoVUE

Git set & unset proxy to avoid “git clone can’t resolve proxy” error

I keep getting the “git clone can’t resolve proxy” error because I use Git for my open source projects at home and in my office (behind a firewall) interchangeably. I need to manually set and unset the proxy based on where I work (Home -> unset proxy; Office -> set proxy).


Set Proxy [If you are behind Firewall]

C:\>git config --global http.proxy <your-win-user-name>@proxy.<yourdomain.com>:8080

Example :

C:\>git config --global http.proxy http://senthilnathan@proxy.strobe.com:8080


Unset Proxy [Usually if you work from your home internet, no firewall]

git config --global --unset http.proxy




When I work on Node applications behind the firewall, I also need to set the proxy for the node package manager (npm) to pull the required node dependencies:


E:\>cd project\strobe

E:\project\strobe>npm config set proxy http://proxy.strobe.com:8080/

E:\project\strobe>npm install

“STROBE” my new OSS Project in GitHub, an Enterprise Class Web Application Starter Project for Sails.JS (Node.JS)

I started a new rapid web application development project called “STROBE” in GitHub using Sails.JS. The project is active and we are making regular commits.

What is Sails.js?

Sails is a Rails-like web application development framework in Node.JS. Here is what the Sails.JS team says about Sails.JS:


Sails.js philosophy, performance and community




The wonderful guys from Balderdash who built Sails.JS made the framework super simple for quickly creating enterprise-class web applications in Node.JS, but I could not find any good starter application other than some “TODO” or discrete samples. Hence I wanted to build a production-ready (mainly the UI layer) enterprise-class web application with the following objectives:

    • Minimalistic User Interface Design using Twitter Bootstrap
    • Ability to change the Color theme of the Website using LESS
    • Loosely coupled and highly reusable rich client-side widgets / components developed as partials – every component exposes APIs and emits events which other components, partials & pages can subscribe to using the Pub/Sub pattern, with dynamic JavaScript function names to avoid name collisions when the same partial is used more than once in the same page
    • Client side data caching for components/widgets like dropdowns, multi-select dropdowns,  radio button groups using local storage with ability to invalidate the cache
    • Ready made server side pagination control for the bootstrap data tables (the component calls two REST apis to get the result collection & count of records as jquery promise ajax call)
    • Easy-to-use, no-coding client-side validation attachment for form controls
    • Standard and consistent user experience for form controls, data tables, validation, error notifications, progress indicator for long running operations etc.
    • Server calls from the UI layer through jQuery promises (Ajax) which consume the controller actions (blueprint & custom) as REST API calls
    • MongoDB backend (for schema less approach)
    • MongoDB GridFS for as document store
    • Automatic CRUD forms code generation for master data management. Some example master data are:
          • Department
          • Location
          • Role
          • Category
          • Bla bla bla…

All masters require CRUD forms, and their data needs to be shown in dropdowns, multi-selects, radio button groups, look-ups, list views, data grids with search in other forms etc. That requires a lot of repetitive code, which is automated through scaffolding.
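The Pub/Sub wiring between partials mentioned above can be sketched in a few lines of plain JavaScript; the widget and topic names here are illustrative, not STROBE's actual API:

```javascript
// Minimal Pub/Sub hub: components publish events; other partials subscribe.
var PubSub = (function () {
    var topics = {};
    return {
        subscribe: function (topic, handler) {
            (topics[topic] = topics[topic] || []).push(handler);
        },
        publish: function (topic, data) {
            (topics[topic] || []).forEach(function (handler) { handler(data); });
        }
    };
})();

// Example: a department dropdown partial announces a selection; a data table
// partial elsewhere on the page reacts without any direct coupling.
PubSub.subscribe('department:changed', function (dept) {
    console.log('refresh grid for department', dept.name);
});
PubSub.publish('department:changed', { id: 1, name: 'IT' });
```

Because neither widget holds a reference to the other, the same partial can be dropped into any page without rewiring.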

  • Ready to use common use cases for any projects
    • Base controller & Models (Inheritance through lodash)
    • Login form
    • Dynamic navigation & menus
    • Users Creation with multiple roles (one-to-many)
    • User details & list view with server side pagination
    • Master data forms i.e. Role, Department
    • Forgot password
    • Email notification through “mailgun” cloud email engine
    • Result with total records (combining Find() & Count() in promise) for server side pagination
    • Encrypted passwords
    • Utilities library to host common functions


  • User interface to search and filter log files
  • JSON driven user manual page
  • File uploader widget with cloud integration.
      • Adapter to store document in the MongoDB GridFS, Azure Blob storage etc.
      • File uploader has following loosely coupled partials
          • File uploader form with progress indicator
          • File downloader
          • File listing
  • No attributes in the model. All are dynamic, like C# dynamics
  • Custom configuration files for application settings & web settings
  • Queuing architecture for long-running processes – RabbitMQ server & client (Pub & Sub)
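The “result with total records” item above combines Find() and Count() so one round trip feeds the server-side pagination control. A sketch, assuming a Waterline-style model whose query methods return promises (the function name is illustrative):

```javascript
// Combine one page of rows with the total record count for server-side pagination.
function pagedResult(Model, page, pageSize) {
    return Promise.all([
        Model.find().skip((page - 1) * pageSize).limit(pageSize),
        Model.count()
    ]).then(function (results) {
        return { rows: results[0], total: results[1] };
    });
}
```

The pagination widget then reads `total` to render its page links and `rows` for the current page.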


All of the above is up and running in my ASP.NET MVC implementation (ALT.NET-style coding), which I am porting to Sails.JS with the help of my Vmoksha Technologies Pvt. Ltd. colleagues Poulomi, Asharani, Sivaprasad & Karthiga, who generously agreed to contribute.

What we will develop finally in STROBE?

It is a generic implementation, but I am planning to build an IT Asset Management System with it.

Want to see a sneak peek of the STROBE application? Watch the screencast.



Dive into STROBE!

I will be writing a series of blog posts on STROBE, so please stay tuned to follow along with its development. You can also download the code in its as-is state and see it working, but you may have a tough time understanding it on your own.

Email to tweet using Node.JS, MongoDB, Twitter Bootstrap

Use case

There was a need to read & process all unseen emails in a given inbox, look for a specific pattern in each email message and, if present, extract the words / sentences, post (tweet) them to Twitter, and then insert those words / sentences into MongoDB for an audit trail.

This article guides you through how the above use case is accomplished using Node.JS and walks you through the code, design approach, configuration & setup.


1) Install Node.js in your preferred environment. I have set it up on Windows 7. If Node.js is not installed, head over to http://nodejs.org/download/, then download and install the .msi installer for Windows.

2) Keep a valid email account handy. I have used a Yahoo email account.

3) Set up a Twitter account and a Twitter application. After successful registration with Twitter:

a. head over to https://dev.twitter.com

b. Click My application as shown below


c. Click on “Create new application” and fill in the form. It will ask for a callback URL, which can be any valid URL to be called after the tweet – something like a hook script you would like to invoke post-tweet. I have used a Dropbox URL (an empty public folder of mine), i.e. https://www.dropbox.com/sh/nq9jyown1mv5o4q/ezUBimRBMi


d. Once the application is created, note down the consumer key and consumer secret, then generate your access token and note down the access token & access token secret. Important: make sure the access level is set to “read, write & direct messages”.

4) Install MongoDB either locally or use one from the cloud, i.e. MongoHQ. I have used MongoHQ; there is a free sandbox version available which is good for starters and easy to register, set up and get running in minutes. The setup procedure involves the following:

a. Create a database as “TweetsDB” (or your choice)

b. Create a collection as “Tweets”

c. Click on “Admin” link and note down the mongo URI which will look like this



d. Click on “Users” and add new user & password for this database. Once created use that user and password in the respective placeholder in mongo URI mentioned above.
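Step (d) amounts to substituting the user and password placeholders in the URI from the Admin page. A tiny illustrative helper (the host, port and credentials below are made up):

```javascript
// Fill the placeholders MongoHQ shows in its connection string.
function buildMongoUri(template, user, password) {
    return template.replace('<user>', user).replace('<password>', password);
}

var uri = buildMongoUri(
    'mongodb://<user>:<password>@sandbox.mongohq.com:10059/TweetsDB',
    'tweetbot',
    's3cret'
);
console.log(uri); // mongodb://tweetbot:s3cret@sandbox.mongohq.com:10059/TweetsDB
```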


5) An IDE of your choice. Node.js can be coded in any text editor, but I have used a free IDE from Microsoft called WebMatrix v3.0. It can create Node.JS projects, has good JavaScript IntelliSense support and source-control integration right in the tool, and everything comes free of cost. This step is optional.

6) A web server of your choice. This is an optional step to deploy the web client to see the list of tweets inserted into MongoDB; I have used the Apache web server.

Getting into the real meat, coding

Download the source code from my GitHub repository. It has two projects:

1) Email2Tweet.Server – Contains Node.JS code to process the email, post it to twitter and insert into the mongo

2) Email2Tweet.Web – A simple HTML, CSS & JavaScript client to pull all the tweets inserted into MongoDB. This one was developed for fun. I have used Twitter Bootstrap for the presentation layer, jQuery for Ajax calls, Moment.js (a JavaScript date library for parsing, validating, manipulating and formatting dates), and the Twitter Bootstrap Table plugin to present the results in a nice-looking datagrid with search, sort & pagination.

Email2Tweet.Server has the following files:

1) Config.json – For convenience, I have kept all the configuration in this config.json file so that changing the application name, keys or email is simple without altering the code.


2) Email2Tweet.js – All the logic is embedded in this file. For brevity I have not shown the code, but I have listed the npms used and the reason why:

· Imap – For email processing

· Mongodb – For MongoDB access. There are other easy-to-use npms available, i.e. mongoose etc., but I have chosen the native mongodb driver.

· Cheerio – To process the email messages for pattern matching. The email body is read as an (HTML) string; string parsing using regex patterns is cumbersome, whereas HTML DOM querying using CSS selectors is pretty simple – something I do day in & day out using jQuery. I found out this npm is the jQuery equivalent on the server side, which made life so simple. In this case I have to look for strings starting with “BUY” & “SELL” in the HTML-formatted email body. I used Chrome's DOM inspector to learn the HTML formatting and JSFiddle to identify the pattern. Head over to my fiddle to see the HTML body and the jQuery DOM selector used to identify all the sentences starting with “SELL” which are under an HTML “p” tag with the CSS class “MsoNormal”: http://jsfiddle.net/senthilsweb/6Mx5H/4/


· Twitter – To post (tweet) to Twitter.

· Cron – To run the entire process at regular intervals.
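The Cheerio step above boils down to selecting the right paragraphs and then filtering for the BUY/SELL prefix. A sketch (not the repository's actual code) with the filter separated out as a plain function, assuming cheerio is installed for the DOM part:

```javascript
// Keep only the sentences that start with BUY or SELL.
function extractOrders(paragraphs) {
    return paragraphs.filter(function (text) {
        return /^(BUY|SELL)\b/.test(text.trim());
    });
}

// With cheerio (assumed installed), pull the candidate paragraphs out of the
// HTML email body using the same CSS selector as in the fiddle above.
function ordersFromHtml(htmlBody) {
    var cheerio = require('cheerio');
    var $ = cheerio.load(htmlBody);
    var paragraphs = $('p.MsoNormal').map(function () {
        return $(this).text();
    }).get();
    return extractOrders(paragraphs);
}
```

Keeping the regex filter separate from the DOM query makes the pattern matching easy to test without any email at hand.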

3) Server.js & Tweets.js – A simple REST API server implementation which queries MongoDB and sends the results as JSON. CORS is enabled to make the APIs accessible across domains. The following APIs are exposed:

a. /tweets/ – returns all tweets from the mongo DB

b. /tweets/{id} – returns tweets for a matching id from the mongo DB




Email2Tweet.Web – Contains Index.html & related JS and CSS files. In order to run this, keep all the files inside a virtual directory in the Apache / IIS web server, whichever is your choice. I have used Apache. Once successfully configured, you will see the result as shown below.



There are a lot of opportunities for improvement and code optimization, which I will work on as and when I get time, committing the source code to GitHub.

There are plenty of other uses we can fit into this implementation, in whole or in part. One use case which comes to mind is email-to-DB for helpdesks: we have a home-grown helpdesk system where tickets are logged by signing into the system and keying in the details in a form; instead, people could send an email with a specific “Subject”, and a program like this could sniff the emails, log the ticket and send a confirmation.