
Security Strategy

Amid highly publicized security breaches, such as the LinkedIn password leak, hacktivists defacing high-profile websites, and online thieves stealing credit card information, one of the most under-reported kinds of breach is nation states or unknown groups stealing intellectual property from companies: building designs, secret manufacturing formulas, business processes, financial information, and more. In terms of its effect on the economy, this could be the most damaging kind of breach.

Companies often do not even know they are being hacked, or are reluctant to report breaches when they do. And the sad truth is that many companies do not bother beefing up their security until they become victims.

In this day and age, all companies should have a comprehensive security program to protect their assets. It starts with an excellent security strategy, a user awareness program (a lot of security breaches are accomplished via social engineering), and a sound technical solution. A multi-layered defense is always the best defense – a firewall that monitors traffic, blocks IP addresses that launch attacks, and limits the network points of entry; an IDS/IPS that identifies attacks and raises alerts; a good Security Information and Event Management (SIEM) system; and a good patch management system to patch servers and applications promptly once vulnerabilities are identified, to name a few.
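To illustrate the kind of event correlation a SIEM performs at a much larger scale, here is a toy Python sketch that flags source IPs with repeated failed logins. The log lines, their format, and the threshold are all invented for the example:

```python
from collections import Counter

# Invented sample log lines; a real SIEM ingests events from many sources.
LOG_LINES = [
    "Jun 07 10:01:02 sshd: Failed password for root from 203.0.113.5",
    "Jun 07 10:01:04 sshd: Failed password for admin from 203.0.113.5",
    "Jun 07 10:01:06 sshd: Failed password for root from 203.0.113.5",
    "Jun 07 10:02:00 sshd: Accepted password for bob from 198.51.100.7",
]

def suspicious_ips(lines, threshold=3):
    """Return source IPs with at least `threshold` failed logins."""
    failures = Counter(
        line.rsplit(" ", 1)[-1]          # last token is the source IP
        for line in lines
        if "Failed password" in line
    )
    return [ip for ip, count in failures.items() if count >= threshold]

print(suspicious_ips(LOG_LINES))  # ['203.0.113.5']
```

An IP flagged this way would then be fed to the firewall's block list, closing the loop between detection and prevention.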

Cost is always the deciding factor in implementing technologies, so due diligence is needed in creating a cost analysis and a threat model. As with any security implementation, you do not buy a security solution that costs more than the system you are protecting.
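One standard way to frame that cost analysis is annualized loss expectancy (ALE = single loss expectancy × annualized rate of occurrence). A quick sketch, with hypothetical numbers:

```python
def annualized_loss_expectancy(asset_value, exposure_factor, occurrences_per_year):
    """ALE = SLE x ARO, where SLE = asset value x exposure factor."""
    single_loss_expectancy = asset_value * exposure_factor
    return single_loss_expectancy * occurrences_per_year

# Hypothetical numbers: a $500K system, 40% damage per incident,
# one incident expected every two years.
ale = annualized_loss_expectancy(500_000, 0.40, 0.5)
print(ale)  # 100000.0
```

In this hypothetical, spending much more than $100K per year on controls for that one system would be hard to justify.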

Harvard Club of Worcester

On June 7, 2012, the Harvard-Radcliffe Club of Worcester held its 106th annual meeting and dinner at the Beechwood Hotel. The event was well attended and was very successful. The keynote speaker, Frederick Eppinger, CEO of Hanover Insurance, gave a very interesting speech on the current improvements and the future plans for the City of Worcester.

During the meeting, I was asked to serve as the secretary of the club and I gladly accepted the role. In fact, I am very excited to serve along with an excellent company of officers. I look forward to very fun-filled and successful events.

I have been an active member of the club for the past couple of years – joining sporting events to cheer on the Harvard Crimson football, basketball, and hockey teams; serving dinner to kids at the Worcester Boys and Girls Club during the Harvard Community Day of Service; and many others. News and pictures of past events can be found here.

If you are a Harvard alum living in the Worcester or Central Massachusetts area and want to get involved or network with fellow alums, please join our club events or contact any one of the officers.

Disaster Recovery using NetApp Protection Manager

In our effort to reduce tape media for backup, we have relied on disks for our backup and disaster recovery solution. Disks are getting cheaper and de-duplication technology keeps on improving. We still use tapes for archiving purposes.

One very useful tool for managing our backup and disaster recovery infrastructure is NetApp Protection Manager. It has replaced the management of local snapshots, snapmirror to Disaster Recovery (DR) site, and snapvault. In fact, it doesn’t use these terms anymore. Instead of “snapshot,” it uses “backup.” Instead of “snapmirror,” it uses the phrase “backup to disaster recovery secondary.” Instead of “snapvault,” it uses “DR backup or secondary backup.”

NetApp Protection Manager is policy-based (e.g. backup primary data every day @ 6pm, and retain backups for 12 weeks; backup primary data to DR site every day @ 12am; backup the secondary data every day @ 8am and retain for 1 year). As an administrator, one does not have to deal with the nitty-gritty technical details of snapshots, snapmirror, and snapvault.
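To make the retention side of such a policy concrete, here is a minimal Python sketch of a 12-week retention window. The function and dates are invented for illustration; Protection Manager itself is configured through its console, not through code like this:

```python
from datetime import date, timedelta

def backups_to_keep(backup_dates, today, retention_weeks):
    """Return backups still inside the retention window; older ones expire."""
    cutoff = today - timedelta(weeks=retention_weeks)
    return [d for d in backup_dates if d >= cutoff]

# Daily backups with a 12-week retention, as in the example policy above.
today = date(2012, 6, 1)
backups = [today - timedelta(days=n) for n in range(120)]  # 120 days of backups
kept = backups_to_keep(backups, today, retention_weeks=12)
print(len(kept))  # 85: today plus the previous 84 days (12 weeks)
```

The point of the policy-based approach is that the administrator states only the schedule and retention; the tool decides which snapshots to create, transfer, and expire.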

There is a learning curve in understanding and using Protection Manager. I have been managing NetApp storage for several years and am more familiar with snapshots, snapmirror, and snapvault. But as soon as I understood the philosophy behind the tool, it became easier to use. NetApp is positioning it for the cloud. The tool also has dashboards intended for managers and executives.

VMware Datastore via NFS

One of the objectives of our recently concluded massive storage upgrade project was to migrate our VMware datastores from iSCSI to NFS. I had been hearing about the advantages of using NFS versus block-level storage (i.e., iSCSI or Fibre Channel), and true enough, NFS did not disappoint.

We had been using iSCSI on NetApp as a VMware datastore for a long time, and it ran pretty well. But when we performed maintenance on the NetApp storage, the virtual machines were oftentimes affected. In addition, restore procedures could be a pain.

While Fibre Channel (FC) is still the standard storage for most VMware implementations because it is a proven technology, in my experience these are the advantages of using NFS over iSCSI or FC:

1. Robust, as long as you follow best-practices guidelines. For instance, separate the NFS network from the general-use network. VMware and NetApp have released white papers discussing NFS datastore best practices. In our environment, we performed several failovers on the NetApp storage to upgrade the Data ONTAP version, and the virtual machines were never affected.

2. Easier to configure, on both the VMware side and the NetApp side.

3. Easier to back up, via NDMP on the NetApp side.

4. Easier to restore vmdk files using snapshots on the NetApp side, since there is no need to mount LUNs.

5. VMware and NetApp have built great tools for seamless maintenance and operations.
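Advantage 4 follows from the fact that on NFS a NetApp snapshot appears as a read-only directory tree under a hidden .snapshot directory at the volume root, so restoring a vmdk is an ordinary file copy. A rough Python sketch (the paths and helper function are hypothetical, and a throwaway temp directory stands in for the NFS volume):

```python
import shutil
import tempfile
from pathlib import Path

def restore_vmdk(volume_root, snapshot_name, vm_name, vmdk_name):
    """Copy a vmdk from a read-only snapshot back into the live datastore."""
    src = Path(volume_root) / ".snapshot" / snapshot_name / vm_name / vmdk_name
    dst = Path(volume_root) / vm_name / vmdk_name
    shutil.copy2(src, dst)
    return dst

# Simulate the NFS volume layout with a throwaway directory.
vol = Path(tempfile.mkdtemp())
snap = vol / ".snapshot" / "hourly.0" / "vm1"
snap.mkdir(parents=True)
(vol / "vm1").mkdir()
(snap / "vm1.vmdk").write_text("disk image contents")

restored = restore_vmdk(vol, "hourly.0", "vm1", "vm1.vmdk")
print(restored.read_text())  # disk image contents
```

With block storage, the same restore would require cloning or mounting a LUN before any file inside it becomes visible, which is exactly the pain point NFS avoids.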

Thoughts on Information Security

I cannot stress enough the importance of information security. Almost every day we hear stories about security breaches – hacker groups defacing websites for political purposes, countries stealing proprietary information from other countries and companies, organized crime stealing credit card information and selling it on the black market.

Cloud computing and mobile devices have exacerbated the problem.

The thing with security is that it is at odds with convenience. We want to get things done quickly, but security slows us down. For instance, we are required to enter hard-to-guess passwords to access our bank accounts online or our company’s applications. Why not just let us in right away? Remembering passwords (and lots of them) and being required to change them every three months takes time and effort.

But if we want ourselves and the companies we work for to be secure, we should give up a little convenience. There is no other way.

A lot of technical solutions and innovations have been devised to improve information security. But no amount of technical innovation can solve the weakest link in security – social engineering. Remember the “I Love You” virus several years ago? It spread when you opened an email with the subject line “I Love You.” Who wouldn’t want to open an email with that subject line?

User awareness is the key. Companies and individuals should at least invest in training on security and privacy.

The sad thing is that many companies and individuals do not take security very seriously until they become victims. True, we should not spend a disproportionate amount of time and money on security: the resources we spend should be proportional to the assets we are protecting. You should not buy a one-million-dollar vault to protect your $100K painting.

When I obtained my CISSP certification several years ago, I didn’t plan on specializing in information security. I have, however, incorporated good security practices into system and network design and implementation, virtualization, storage, and almost all aspects of IT. But with the tremendous need for IT security professionals these days, I might consider specializing in information security.

Book Review: The Big Switch – Rewiring the World from Edison to Google

The Big Switch: Rewiring the World from Edison to Google. Nicholas Carr. New York: W. W. Norton and Company, 2008. 278 pp.

The future of computing, the book argues, is utility computing. Information Technology (IT) will reside “in the cloud” in a centralized fashion, and will be controlled by a few service providers who have built massive data centers. Just like electricity, IT will be delivered as a service to home users and to small and big companies. The IT departments of these companies may become irrelevant. There will be no need for them because “individuals and business units will be able to control the processing of information directly.”

High bandwidth availability makes utility computing possible. Soon, companies will outsource all of their IT functions – from storage to applications to programming – to service providers. As a service provider, Google has started this trend with Google Apps. Similarly, Amazon has offered software and hardware as a service. For instance, if a company needs an application, all it has to do is tell one of these service providers, and the application will be available in no time. It doesn’t have to go through the hassle of procuring equipment, hiring programmers, and developing the application.

This next big thing has many names – cloud computing, utility computing, grid computing, and software/hardware as a service (SaaS) – but the book calls it the World Wide Computer.

The premise of the switch from internal IT to the World Wide Computer is that too many resources are wasted on IT – labor, hardware, software, redundant systems, and overbuilt IT assets. The book contends that IT costs too much for what it delivers: there is simply an excess of servers and computing capacity. Ultimately, it is not the technology but the economics that will prevail. The cloud will make efficient use of IT resources.

Because everything is wired, physical location will no longer matter. The same is true of software licensing. The model will be much like electricity’s – the client pays for usage, not for the costly software licenses that have made companies like Microsoft very rich. The new model, the book argues, is very much like the Google Apps model. Users will be empowered when tapping the World Wide Computer – the possibilities are endless with its infinite information and computing power.

For people who have been following the computing revolution, Carr’s concept of utility computing is old news. IBM and other IT visionaries have been talking about utility computing for years. However, his book successfully articulates the concept by drawing a parallel between the evolution of electrification and the evolution of computing.

The history of electrification is well researched, from the first waterwheels to windmills to today’s centralized power generators. The history of computing is similarly well researched, from Hollerith’s machine to the IBM mainframe to personal computing, client-server computing, and web computing. Along the way, Carr infuses the business and economic forces that shaped their current form. He likewise discusses the social impact of these changes – how they transformed societies and consequently changed people’s lives for the better. He discusses at great length the economic and social impact of the World Wide Computer – how the world will become increasingly multi-polar instead of united, the weaknesses of free-flowing information, and the loss of human privacy.

Inasmuch as I agree with Carr’s position on utility computing, I do not believe that everything will go to the “cloud.” In my opinion, the future will be hybrid computing. There is so much computing power in every personal computer, laptop, and mobile device that not utilizing it is a waste.
The IT departments of large corporations will not disappear. The book misses the point that for some companies the IT system is strategic, and they cannot simply outsource all of their IT functions. Financial companies, for instance, rely heavily on their IT systems; take IT away from the stock market and trading will halt. The point is that IT has varying degrees of importance for each company. Electricity, by contrast, does not: everybody needs it, it is a commodity, and it can even be sourced internally (such as from backup generators). IT cannot simply be commoditized – companies need specialized applications.

Another issue is data security and privacy. In the cloud, we don’t know where the data is stored. Intellectual property and company knowledge are just too important to be hosted somewhere where security and privacy laws are not well defined. Unless there is a global law on data security and privacy, companies will hesitate to put their precious information in the cloud.

Finally, there is the law of unintended consequences. We simply cannot have a complete picture of the future. It is ironic, for instance, that because of the current concern for the environment, companies and homes alike may start generating their own power using solar, wind, or other means, thus decentralizing electricity generation once again. The use of electrification as a metaphor for the World Wide Computer may not be accurate after all.

Interviewing Harvard College Applicants

Two years ago I was asked if I would like to interview students applying for admission to Harvard College in the Worcester, MA area. As a Harvard alum, I was glad to volunteer to interview four to six applicants during the admission period (October to March).

The qualities of the applicants I interviewed were very impressive – excellent academic grades, lots of extracurricular activities, and great personalities. But I realized that not all of them would get accepted. In fact, with a roughly 6% acceptance rate (only about 2,000 out of over 33,000 applicants get accepted), getting into Harvard College is a tall order. Only the best of the best are accepted.

I knew it would take a long time before I got to interview someone who would make it to Harvard College, if ever. But last winter, I was lucky to interview a very promising applicant. True enough, she got an acceptance letter. I was very happy when I learned about it. Congratulations, Taylor Benninger of Spencer, MA!

Skiing

Winter has been particularly mild this year in New England, but that did not stop me and my daughter, Justine, from going up the mountains to ski. Most ski resorts have snowmaking capabilities anyway, and the mountains in northern New England usually have natural snow.

Aside from the usual skiing trips to Wachusett Mountain, which is only 35 minutes from our house, this year we also went to Bretton Woods in New Hampshire and Okemo Mountain in Vermont with my wife’s friend, Dick.

I used to hate the snow. I don’t like cleaning our driveway after a snowstorm – it’s so much more comfortable to curl up in front of the fireplace. 🙂 And I don’t like driving in the snow because it can be dangerous. Several years ago, my car spun 360 degrees on the highway; I was lucky there was no other vehicle near mine. Besides, I grew up in the tropics, so warmer weather suits me much better.

But there is no way to avoid the snow in New England, and we need outdoor activities to ward off cabin fever. So, two years ago, my daughter and I signed up for skiing lessons, and we soon realized that we loved it.

I also realized that it’s a good way to bond with my daughter. I’m glad we tried skiing, and I’m looking forward to many skiing and bonding trips with her in the years to come.

Toastmasters International Speech Contest

I just won first place in the International Speech Contest at our Toastmasters club at Abbott BioResearch today, March 13, 2012. I was very honored to compete against three other seasoned Toastmasters.

My speech was about how I lost weight and why my daughter inspired me.  The title of my speech was “Persevere, Overcome, Succeed.”  The event was very well attended.

I’ve been with the Toastmasters club for more than five years, and I completed my Competent Communicator (CC) award last year. But this was the first time I joined the speech contest. There was a different feel to it compared with our regular bi-weekly Toastmasters meetings, knowing that I was competing. But it was a very rewarding experience. I had to write and practice my speech three weeks in advance. I guess it paid off.

On to the Area E, District 31 speech contest on March 27, 2012.

Backup Infrastructure

I have been designing, installing, and operating backup systems for the past several years. I have mostly implemented and managed Symantec NetBackup (formerly Veritas NetBackup) for larger infrastructures and Symantec Backup Exec for smaller ones.

This software has worked very well, although some features are not very robust. I’m very impressed, for instance, with the NDMP implementation in NetBackup: backing up terabytes of NetApp data via NDMP works very well. However, I do not like the NetBackup admin user interface, since it’s not very intuitive. The bare metal restore (BMR) implementation is also a pain; some of the bugs took years to fix, maybe because not many companies use BMR.

Backup Exec works very well for small to medium systems. It has a very intuitive interface, it is relatively easy to set up, and it has very good troubleshooting tools. Lately, though, Symantec has been playing catch-up in its support for newer technologies such as VMware; it is so much easier to use Veeam to manage backup and restore of virtual machines. In addition, Backup Exec has been breaking lately: recent Microsoft patches have caused backups of System_State to hang.

But I think the biggest threat to this backup software comes from online backup providers. CrashPlan, for instance, was initially developed for desktop backup, but it will not take long before companies use it to back up their servers. Once these providers properly address security concerns, companies will be even more compelled to back up their data online. It is just cheaper and easier.