Sending SMS with Twilio using Node.js

This blog post is more of a tutorial for myself.  Before trying this out, buy a phone number that allows SMS communication from your online Twilio account.

The first step is to simply install the Twilio SDK for Node using the following commands in the directory you want to send messages from.

mkdir ~/twilio-code
cd ~/twilio-code
npm install twilio

Create a file named sendText.js under the twilio-code directory:


// Require the Twilio module and create a REST client
var accountSid = 'ACXXXXXXXXXXXXXXXX'; // placeholder - from your Twilio console
var authToken = 'your_auth_token';     // placeholder - from your Twilio console
var client = require('twilio')(accountSid, authToken);

client.messages.create({
    to: "+11234567890",
    from: "+19876543210",
    body: "How you doin?"
}, function(err, message) {
    if (err) { console.error(err); }
    else { console.log('Message sent, SID: ' + message.sid); }
});

Now run node sendText.js from the command line and your text message will be sent.  Twilio charges a small amount for each text message, so be careful with your testing, or use the test SID and Auth Token that Twilio provides.


Scheduled call forwarding using Twilio, AWS Lambda & API Gateway


I was recently tasked with implementing time-based call forwarding at my workplace.  My company has an onsite team that takes calls on a Zendesk-provided number during business hours, and an offshore team that handles any incoming calls outside those hours.  We wanted the ability to automatically route incoming calls to Zendesk between 5 AM and 5 PM, and to the third-party vendor outside of those hours.  Initially we were working with 8x8, but due to the need to route calls and the call quality we experienced, we switched to Twilio.


We are going to use Twilio as our provider for phone numbers. Any time there is a call to the Twilio-provided number, we can take action by connecting to a certain URL.  In other words, Twilio can issue a POST or GET against a URL we specify so that we can manage how calls are handled.  Since call volume is initially low, we will use AWS Lambda and API Gateway, which are easily scalable and relatively cheap.  In the future, if call volumes increase beyond a certain point, we can expose an endpoint from within our own app to serve the call-forwarding response.

TwiML Bins

For simple call forwarding with no scheduled switch, Twilio provides a nice XML-like language called TwiML (Twilio Markup Language).  When we port numbers to Twilio, Twilio gives us the ability to call an external URL using TwiML or webhooks.

If you navigate to Phone Numbers and purchase a number, you will be taken to the screen shown above.  You can set up simple call forwarding at all times by clicking on the + sign next to the TwiML dropdown, which takes you to the screen below and lets you configure call forwarding to a number, 1-123-456-7890 in my sample case.
For more information on the TwiML syntax for the Dial verb, you can navigate to the Dial Verb API.  For our requirement of forwarding calls only during specific hours, however, this approach will not work: TwiML doesn't support complex conditional call forwarding.  For that, we need to host our own webhook code online.
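For reference, an always-on forwarding TwiML bin would look something like the following (the number is the placeholder from above):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<Response>
    <!-- Forward every incoming call; give the callee 60 seconds to answer -->
    <Dial timeout="60">+11234567890</Dial>
</Response>
```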

AWS Lambda

AWS Lambda is a compute service that lets users upload code that will be run as a service on AWS infrastructure.  In our case, we want to trigger the execution when there is a call event on any of the Twilio numbers.  I am using Node.js for the project because it has a good support community for Lambda on AWS.  What we want to do is check the time of day when the call comes in and, based on that, respond with a different TwiML to forward the call.

API Gateway

We will use API Gateway to provide a REST endpoint for the Twilio webhook.  On any incoming call, Twilio will hit the API Gateway endpoint, which triggers the Lambda function and passes the XML response back to the Twilio webhook, thus forwarding the call.

Node.js callForwarding.js code

  • Install node.js
  • Create a folder called twilio-code
  • npm init

    This will set up the node project with a package.json file

  • npm install moment

    I have used the moment library to compare the hour of the day. AWS uses UTC time, so factor that into the code.

  • Create the twilioCallForwarding.js code and test it locally
  • Final step is to zip up the contents of the twilio-code folder into a single zip file
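Since AWS runs on UTC, the conversion from a UTC hour to my local PDT hour is plain modular arithmetic (PDT is UTC-7, so 12:00 UTC is 5 AM PDT, which is what the hour check in the code relies on):

```javascript
// Convert a UTC hour (0-23) to PDT (UTC-7)
function utcToPdtHour(utcHour) {
    return (utcHour + 24 - 7) % 24;
}

console.log(utcToPdtHour(12)); // 5  -> 12:00 UTC is 5 AM PDT
console.log(utcToPdtHour(0));  // 17 -> midnight UTC is 5 PM PDT
```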

twilioCallForwarding.js code

var moment = require('moment');

exports.handler = function (event, context) {
    // Event will contain the information passed from Twilio to Lambda.
    console.log("Event information: " + JSON.stringify(event));

    // Compute the time inside the handler so warm invocations don't reuse a stale value
    var now = moment();
    var qryObject = parseQuery(event.reqbody);
    var numCalled = qryObject['To'];
    var openXml = '<?xml version="1.0" encoding="UTF-8"?><Response><Dial timeout="60">';
    var closeXml = '</Dial></Response>';

    // Check for time and act accordingly
    var hour = now.hour();
    var responseXml = '';

    // Remember that AWS uses UTC time and I used PDT time
    if ((hour >= 12) && (numCalled == '+19876543210')) {
        responseXml = openXml + '+11234567890' + closeXml;
    } else if (numCalled == '+19876543210') {
        responseXml = openXml + '+12345987600' + closeXml;
    }

    // Log responseXml for verification
    console.log("Response XML: " + responseXml);
    context.succeed(responseXml);
};

// Twilio passes parameters as application/x-www-form-urlencoded data, so we
// use the following method to parse the body into an object.
function parseQuery(qstr) {
    var query = {};
    var a = qstr.split('&');
    for (var i = 0; i < a.length; i++) {
        var b = a[i].split('=');
        query[decodeURIComponent(b[0])] = decodeURIComponent(b[1] || '');
    }
    return query;
}
AWS provides the handler function with two arguments, event and context (you can also take a callback as a third argument); refer to the Lambda documentation for how to use them.  Event holds the parameters that were passed as part of the POST request, and context carries the state and is used to return the response value from the Lambda function.
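As a quick sanity check, the parseQuery helper can be exercised on its own with a Twilio-style urlencoded body (the numbers here are the placeholders from above):

```javascript
// Self-contained copy of the parseQuery helper for a quick local test
function parseQuery(qstr) {
    var query = {};
    var a = qstr.split('&');
    for (var i = 0; i < a.length; i++) {
        var b = a[i].split('=');
        query[decodeURIComponent(b[0])] = decodeURIComponent(b[1] || '');
    }
    return query;
}

var q = parseQuery('To=%2B19876543210&From=%2B11234567890&CallStatus=ringing');
console.log(q.To);         // +19876543210
console.log(q.CallStatus); // ringing
```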

Setting up AWS Lambda

Once you log in to the AWS Management Console, navigate to Lambda under the Compute section.  On clicking the Create a new function link, you will be taken to the blueprint page.  In this case, select the runtime ‘Node.js 4.3’ and filter for ‘twilio-simple-blueprint’.

This will take you to the configure triggers page.  We want our trigger to be an HTTP POST call from Twilio, and we will use API Gateway to handle the event trigger.  Configure the service as the following figure shows:


Lastly, we get to the configure function page.  Give your function the same name as the .js file that is the entry point to the Node.js code and contains the exports.handler function, and set the handler in the config UI to twilioCallForwarding.handler.  As we are not going to access any other AWS data (RDS, S3, etc.), we don’t need to define a VPC for the function.  The final screen after configuration is as follows:


Notice that I have used an existing role that I had created previously for another project.  This role has policies that allow access to AWS Lambda, CloudWatch, API Gateway and S3 bucket operations.  Upload a zip that contains the .js source files and all necessary node_modules, and head to the next review screen.  Once you review the function, you can test it and verify the results.

Setting up API gateway

Proceed to the Amazon API Gateway service, click on Create API and enter values as below.


This will create an API to which we need to add methods/endpoints; Amazon refers to them as resources.  Click on the Actions dropdown and create a New Child Resource named CallForwarding with a resource path of callForwarding.  Select the newly created resource and add a POST method to it.



Next, we will set up the POST method to use our Lambda function located in the us-west-2 region.  This takes us to a screen with the Method Execution flow outlined.  Click on Integration Request and open Body Mapping Templates.  Twilio sends its parameters as form-urlencoded data, which includes information such as the From number, the To number, the region of call origination, the destination of the call, the state of the call, etc.  Add a mapping template for type ‘application/x-www-form-urlencoded’ with the text shown in the diagram below, and click save to save the changes.
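The screenshot is not reproduced here, but a template along these lines passes the raw urlencoded body through to Lambda under the reqbody key that the Lambda code reads; the exact template in the diagram may differ:

```
{
    "reqbody": "$input.path('$')"
}
```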


Modify the Method Request to be open to everyone, allowing access to the API without an API key.  This lets Twilio call our endpoint from outside the Amazon network.

Next, navigate into the Method Response section and add a Response Model for an application/xml response, since Twilio expects an XML response from our service to redirect calls.  Don’t worry about the response headers; they will be set automatically later on.


Lastly, we need to configure the Integration Response section and add a Body Mapping Template for application/xml with the following template:
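The template in my setup simply echoed the XML string returned by the Lambda function; a VTL pass-through like the following should work (an assumption on my part, since the original screenshot is missing):

```
$input.path('$')
```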


Now navigate back to the method execution screen and click on our resource method. From the Actions dropdown, select Enable CORS and test the API endpoint.

As a final step, we need to set the URL provided by the API gateway in the Twilio webhook section for phone numbers as shown in the diagram below.  Now calling this Twilio number will forward calls depending on the time of the day.


Windows USB/External HDD on OSX

It is such a pain to open an external hard drive or USB stick that was written and formatted on a Windows machine on a MacBook.  I was trying to access photos on my external hard disk, which had originally been formatted under Windows 7 (thus the NTFS file system), and I could only read the files on my MacBook, not edit them.  I read online that there are some paid tools that let you access the files on such drives, but I try not to pay for tools I’ve never heard of.  The other option is to re-format the drive on the Mac, but that wipes out all data on the drive.

I came across the following steps that let you enable Mac OS X to access files on such an external hard drive without formatting it or losing any data.  Note: You need sudo rights (or an administrator account with root privilege) to perform the operation.

Run the following steps in the Mac OS Terminal (which can be started by pressing Command + Space and typing terminal):

$> sudo vi /etc/fstab

In the vi editor, add the following line:
LABEL=WESTERNDIGITAL none ntfs rw,auto,nobrowse

and then type :wq! to exit the vi editor window. On the terminal run:

$> diskutil unmount WESTERNDIGITAL
$> diskutil mount WESTERNDIGITAL

Note:  In all the lines above, replace WESTERNDIGITAL with the name of your drive as it shows up in the Finder window.  If your drive name contains spaces, use quotes around the name, e.g. “WD DRIVE”.  You can also find the name by running:

$> ls /Volumes/

There will be a Macintosh HD and the other will be your USB/external HDD. The Finder window may no longer show your drive once you have made the change above (a side effect of the nobrowse option).  To open the drive in read-write mode, run the following in your terminal window:

$> open /Volumes/WESTERNDIGITAL

Lastpass Password Manager

I was recently introduced to a Chrome browser extension called LastPass.  This is a really handy extension that stores passwords for multiple sites.  The advantage is that I have one single 20-character mixed password that I use to log into LastPass, and with LastPass I can then generate passwords that are 50-100 characters long and that I don’t have to remember.  It provides the following benefits:

  1. Extra security: I no longer have to use the same password for multiple sites just so I can remember it.
  2. Password rotation reminders: One of the nice features is that LastPass reminds you to change a password after a set duration.  This is useful for banking or applications that contain sensitive data.
  3. Sharing passwords: I can share passwords with other users on my team without giving out my actual password, and I can change the password afterwards.

I just love this service and you should give it a try too.  They have a free version that is really good, and once you start using it, you can upgrade to the premium service for $12 a year.

MongoDB 3.0 Authentication Issue

We upgraded to MongoDB 3.0 and that caused some of the tools we used for visualizing Mongo data to stop working.  We use Robomongo at my workplace and it started failing with an authentication error.  The error I saw was as follows:

Failed to authenticate admin@admin with mechanism MONGODB-CR: AuthenticationFailed MONGODB-CR credentials missing in the user document

There is an open ticket on the Mongo JIRA issue list, which can be found here.  As mentioned in the issue, MONGODB-CR was the older authentication mechanism used prior to the 3.x versions.  Third-party tools stop working when we move to Mongo 3.x because they still try to authenticate using the older mechanism instead of the newer SCRAM-SHA-1.  I had to perform the following steps to fix the issue, and it seems to have worked:

  1. Start MongoDB in non-authenticated mode. From the command line:

     mongod

  2. Switch to the admin database, which stores the usernames and passwords for all users that might need authentication.  If you are using a custom database to store the users, switch to that database instead:

     use admin

  3. Change the authentication schema version to the one used by MONGODB-CR. The value for the newer schema is 5 and for the older one is 3.  Note the save at the end; without it the change is not persisted:

     var schema = db.system.version.findOne({"_id" : "authSchema"})
     schema.currentVersion = 3
     db.system.version.save(schema)

Even after the above change was applied, I was still not able to use Mongo with authentication. I had to drop and recreate the existing users for Robomongo to successfully connect to MongoDB in authenticated mode. Once the third-party tools are upgraded to work with the newer authentication, refer to the link here to upgrade back to SCRAM-SHA-1.
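For reference, dropping and recreating a user in the mongo shell looks something like this (the user name, password and role here are illustrative, not the ones I used):

```
use admin
db.dropUser("admin")
db.createUser({ user: "admin", pwd: "secret", roles: [ { role: "root", db: "admin" } ] })
```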

Oracle: Export empty tables

When taking a database dump from Oracle 11g recently, I found that the export did not include tables that had 0 rows.  Oracle 11g implements a space-saving measure: if your table has no data, no segment is created for it and it will not be exported.  Space is allocated only when you add data to the table.

A workaround to the issue is to pre-allocate some space to such empty tables.  Execute the following oracle queries to allocate space for all tables before taking the export:

select 'alter table '||table_name||' allocate extent;'
from dba_tables
where segment_created = 'NO'
and owner = 'DB_USER';

Running the above query will generate alter table statements for all tables with no data.  Executing these SQL statements allocates space to the empty tables, which allows you to export and import them.
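For a hypothetical empty table EMPLOYEES owned by DB_USER, the generated statement would look like:

```sql
alter table EMPLOYEES allocate extent;
```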

Mongodb JAVA Driver: QueryBuilder class

I was recently assigned a task where I had to convert a SQL-like where clause into a Mongo query on the fly.  There are some good drivers available that do this, but they are not free; the best example is UnityJDBC, which was a bit expensive for our use case.  The queries needed to be super simple and we made the following assumptions:

  1. We need to support simple SQL statements without brackets.  Hence, type = exam and score >= 60 is acceptable but type = exam and (score >= 60 or score < 30) is not.
  2. The query will only work against double values for >, <, >=, <=, and will support String and Double datatypes for the = and != operators.

We decided to build something in-house that would be sufficient for our needs, and we can extend the framework over time, e.g. by adding support for brackets, logical operators like XOR, etc.  Let’s use the following sample data to proceed with the example:

{ "_id" : { "$oid" : "50906d7fa3c412bb040eb577" }, "student_id" : 0, "type" : "exam", "score" : 54.6535436362647 }
{ "_id" : { "$oid" : "50906d7fa3c412bb040eb578" }, "student_id" : 0, "type" : "quiz", "score" : 31.95004496742112 }
{ "_id" : { "$oid" : "50906d7fa3c412bb040eb579" }, "student_id" : 0, "type" : "homework", "score" : 14.8504576811645 }
{ "_id" : { "$oid" : "50906d7fa3c412bb040eb57a" }, "student_id" : 0, "type" : "homework", "score" : 63.98402553675503 }
{ "_id" : { "$oid" : "50906d7fa3c412bb040eb57b" }, "student_id" : 1, "type" : "exam", "score" : 74.20010837299897 }
{ "_id" : { "$oid" : "50906d7fa3c412bb040eb57c" }, "student_id" : 1, "type" : "quiz", "score" : 96.76851542258362 }
{ "_id" : { "$oid" : "50906d7fa3c412bb040eb57d" }, "student_id" : 1, "type" : "homework", "score" : 21.33260810416115 }
{ "_id" : { "$oid" : "50906d7fa3c412bb040eb57e" }, "student_id" : 1, "type" : "homework", "score" : 44.31667452616328 }
{ "_id" : { "$oid" : "50906d7fa3c412bb040eb57f" }, "student_id" : 2, "type" : "exam", "score" : 19.88180838833524 }
{ "_id" : { "$oid" : "50906d7fa3c412bb040eb580" }, "student_id" : 2, "type" : "quiz", "score" : 1.528220212203968 }
{ "_id" : { "$oid" : "50906d7fa3c412bb040eb581" }, "student_id" : 2, "type" : "homework", "score" : 60.9750047106029 }
{ "_id" : { "$oid" : "50906d7fa3c412bb040eb582" }, "student_id" : 2, "type" : "homework", "score" : 97.75889721343528 }

Let me start by showing a simple program that gets all records for the student with student_id of 2.  There are two classes available that let us do this.

1. Using BasicDBObject class:

import java.net.UnknownHostException;

import com.mongodb.BasicDBObject;
import com.mongodb.DB;
import com.mongodb.DBCollection;
import com.mongodb.DBCursor;
import com.mongodb.Mongo;

public class BasicDBObjExample {

    public static void main(String[] args) throws UnknownHostException {
        // Connect to the Mongo database
        Mongo mongoConn = new Mongo("localhost", 27017);
        DB mongoDb = mongoConn.getDB("DP_TEST");
        DBCollection collection = mongoDb.getCollection("DATA");

        // Building the query parameters
        BasicDBObject studentFinder = new BasicDBObject();
        studentFinder.put("student_id", 2);

        // collection.find() returns a cursor with the records that match the query DBObject
        DBCursor dbCursor = collection.find(studentFinder);
        while (dbCursor.hasNext()) {
            System.out.println(dbCursor.next());
        }
    }
}

2. Using QueryBuilder class:

import java.net.UnknownHostException;

import com.mongodb.QueryBuilder;
import com.mongodb.DB;
import com.mongodb.DBCollection;
import com.mongodb.DBCursor;
import com.mongodb.Mongo;

public class QueryBuilderExample {

    public static void main(String[] args) throws UnknownHostException {
        // Connect to the Mongo database
        Mongo mongoConn = new Mongo("localhost", 27017);
        DB mongoDb = mongoConn.getDB("DP_TEST");
        DBCollection collection = mongoDb.getCollection("DATA");

        // Building the query parameters
        QueryBuilder studentFinder = new QueryBuilder();
        studentFinder.put("student_id").is(2);

        // get() converts the QueryBuilder into the DBObject of query parameters
        DBCursor dbCursor = collection.find(studentFinder.get());
        while (dbCursor.hasNext()) {
            System.out.println(dbCursor.next());
        }
    }
}



Notice that both examples return the same set of records.  The only difference is in how we construct the query to be run against the Mongo database.  The QueryBuilder class comes in handy when you have queries like A > 10 and A < 20.  For example, let us find records where student_id > 0 but < 2.  The query construct is as follows for BasicDBObject:

BasicDBObject studentFinder = new BasicDBObject();
studentFinder.put("student_id", new BasicDBObject("$gt", 0).append("$lt", 2));

The syntax is simpler in the case of QueryBuilder:

QueryBuilder studentFinder = new QueryBuilder();
studentFinder.put("student_id").greaterThan(0).lessThan(2);

Finally, based on the above examples, we can create queries at run time.  We are using the Groovy language and Java 1.7 for our framework, so a sample method to convert a filter string at run time looks as follows:

// Process queries of type A > 10 and B < 5 and B > 1
private static QueryBuilder getQueryBuilderByString(String filter) {
    if (filter == null || filter == "") return null;
    String[] filters = filter.split(" ");
    QueryBuilder q2 = new QueryBuilder();
    for (int i = 0; i < filters.length; i++) {
        if (i == 0) {
            // The first token is the field name the conditions apply to
            q2.put(filters[i]);
            continue;
        }
        switch (filters[i]) {
            case ">":
                if (i + 1 < filters.length) q2.greaterThan(filters[i + 1].toDouble());
                else return null;
                i += 1;
                break;
            case ">=":
                if (i + 1 < filters.length) q2.greaterThanEquals(filters[i + 1].toDouble());
                else return null;
                i += 1;
                break;
            case "<":
                if (i + 1 < filters.length) q2.lessThan(filters[i + 1].toDouble());
                else return null;
                i += 1;
                break;
            case "<=":
                if (i + 1 < filters.length) q2.lessThanEquals(filters[i + 1].toDouble());
                else return null;
                i += 1;
                break;
            case "=":
            case "==":
                if (i + 1 < filters.length) {
                    if (filters[i + 1].isDouble()) q2.is(filters[i + 1].toDouble());
                    else q2.is(filters[i + 1].toString());
                } else return null;
                i += 1;
                break;
            case "<>":
            case "!=":
                if (i + 1 < filters.length) {
                    if (filters[i + 1].isDouble()) q2.notEquals(filters[i + 1].toDouble());
                    else q2.notEquals(filters[i + 1].toString());
                } else return null;
                i += 1;
                break;
            case "and":
            case "AND":
                // The token after and/AND is the next field name
                if (i + 1 < filters.length) q2.and(filters[i + 1]);
                else return null;
                i += 1;
                break;
        }
    }
    return q2;
}
Refer to the QueryBuilder API documentation for more info.

ORA-28001: The password has expired

I am a heavy user of Oracle databases and have Oracle 11g installed on my desktop at work.  We use it as our primary database and I have created several different database users on my local edition of the RDBMS.  Recently, after quite some time, I logged into an old DB user that I had created on my desktop.  It held an older version of our software, so when the Services team reported a bug I wanted to spin up the instance and debug the reason for the error.

When I tried logging into the DB, I kept getting the ORA-28001: The password has expired error.  Now, I am familiar with changing passwords, so I fired up a command prompt and logged in as the sys user.  I ran the query below to change the password:

        SQL> ALTER USER system_36 IDENTIFIED BY system_36; -- system_36 is the existing username and password

When I tried logging in as system_36, it again gave me the same ORA-28001 error.  On doing some research, I found that by default the password expiry is set to 180 days, and the account doesn’t fix itself simply by changing its password.  I am outlining the solution that worked for me below:

1. Connect to the database with sysdba privileges (sys user).
2. Execute the following query to identify the profile whose password life you want to change from the default 180 days to unlimited:
        SQL> SELECT * FROM dba_profiles;
3. If you want the behavior to apply to all future users, run the query below as-is; otherwise replace the DEFAULT profile with the profile you want to change:
        SQL> ALTER PROFILE DEFAULT LIMIT PASSWORD_LIFE_TIME UNLIMITED;
4. Now unlock the user account with the following query:
        SQL> ALTER USER system_36 ACCOUNT UNLOCK;
5. Now you can change the password (or re-apply the existing one) using the following query:
        SQL> ALTER USER system_36 IDENTIFIED BY system_36;

User information is stored in the dba_users view if you want to check and unlock other users on the same server.


UberConference

UberConference is a new offering that one of my colleagues told me about.  It is the latest entrant in the crowded space of conference calling, alongside several other offerings like WebEx, GoToMeeting, etc.  What I liked about UberConference is that it is very lightweight and easy to use, and it provides better features if you are only looking for telephone conferencing.  Some of its cool features are as follows:

  • Automatic authentication based on the number dialed from
  • Automatic call-out to invitees when the conference starts
  • Monitoring of users on the call, with controls to mute or hang up specific users
  • Call recording in mp3 format and a call summary at the end of each call
  • Relatively cheap

With a professional account, UberConference is much better than WebEx teleconferencing.  I would highly recommend it to all professionals.  Their app is available on Android and iOS.