Electronics Rocks – 2014

Been thinking of writing about Electronics Rocks 2014 (eRocks); finally able to pen down a few things. For people who don’t know what eRocks is, here is a brief: it is one of the most popular electronics conferences, organized by EFY media. Last year I attended as a participant and found some interesting things at the conference. This year, after joining Emertxe, we got the opportunity to be a community partner of the event and offer a presentation on Internet-of-Things (IoT) design challenges. The event happened during October 10th and 11th at the NIMHANS convention center, Bangalore, and attracted 3000+ participants. My session on IoT, which was the key focus of the conference, attracted 200+ participants, and post-presentation we received very positive feedback from the community.

While there is a lot to say about the conference, here are my top three learnings:

  • Not just open source software: The field of electronics has become much more interesting in recent years, mainly because of open source software and easily available, affordable hardware. While devices like the Raspberry Pi and Arduino have already become very popular, I found some newer devices like the UDOO which are becoming very powerful and around which many cool things can be built. Going forward I see many companies flocking into this space, which is yet to be tapped to its full potential.
  • Product design: While the previous point opens up opportunities to build around so many ideas, there seems to be a very large gap when it comes to product design knowledge. Student-level knowledge is enough to build a prototype (ex: agriculture automation), but making it a complete product requires a different set of skills. During my discussions with many enthusiasts I found a severe lack of knowledge about productization using real hardware.
  • IoT is not new: While there is a lot of opportunity in the IoT space, in my opinion it is not something very new. Connecting devices to a network (say a LAN) has existed for a long time; it has simply taken an upgraded form as IoT, thanks to multiple advancements in embedded and web application development. During my talk I mainly stressed this aspect: the fundamentals need to be taken care of to build products in the IoT space.

Here is the Slideshare link to my presentation; comments are welcome. I couldn’t spend much time across the various tracks due to time constraints; hopefully next year I will do better by listening in on multiple tracks (ex: Jugaad innovation).

My talk on “Open Source and Embedded Systems”

Lounge47 is one of the new-age entrepreneurial platforms connecting entrepreneurs, ideas and businesses. A couple of weeks back I got a chance to talk on the topic “Tracing the evolution – Open Source & Embedded Systems” to entrepreneurs, enthusiasts and seasoned professionals. The talk ran for about 45 minutes, followed by a Q&A which triggered many interesting questions.

Here are the presentation slides:

Electronics Rocks 2013


Last weekend, the Electronics for You (EFY) folks organized a very interesting electronics conference (called Electronics Rocks) at the NIMHANS convention center, Bangalore. I got a chance to attend a conference after a long time. My primary objective was the hands-on workshop organized by the Kits and Spares folks on their Mango Pi development board, as I wanted more insight into these embedded learning kits; the board was coming as a free item along with the workshop. Though the workshop was a major flop show (details below), there were many interesting takeaways.

As mentioned in my previous post on embedded learning kits, I have spent quite some time surveying development boards that can be used for educational/learning purposes. I started off by looking into the latest BeagleBone Black board (by Texas Instruments) with the ARM AM335x Cortex-A8 processor, which doesn’t come with a TI DSP for media processing. Upon conversation I understood there are applications available that can do decent graphics processing. However, for higher-end graphics work the Pandaboard still works better, as it comes with multiple media interfaces. There is also a four-fold price difference between the two (the Beagle comes at USD 45 vs. the Pandaboard at USD 174). On the business side, I observed that all these boards are sold by multiple re-sellers, who had separate stalls at the conference. Considering the price point and target audience, the re-seller option makes perfect sense.

On the electronics components side, I could see many vendors showcasing various parts. I had a conversation with the folks from RS Components, who sell components from almost all the major semiconductor manufacturers. They also have a hub of interesting embedded projects on a platform called DesignSpark, where design engineers can exchange ideas and create projects. Along with components, many vendors were also demonstrating debuggers, tools and embedded design services. The debugging space is definitely interesting, but I didn’t pay much attention as it makes sense only for devices with at least a JTAG interface, which was not my area of focus. On the governance side, the erstwhile India Semiconductor Association (ISA) has changed its name to the India Electronics and Semiconductor Association (IESA), with the responsibility of promoting the electronics ecosystem in India.

Coming to the Mango Pi workshop, it turned out to be a major flop. During the initial promotion they pitched it as “Build a wireless robot in 60 minutes” using the Mango Pi board and mentioned that participants could take a board away at the end of the workshop. I was excited and registered for the session by paying an extra 1500 INR specifically for this workshop. It was total chaos; they messed up everything, starting with the schedule. I was supposed to attend the 11:30 AM slot, but was asked to attend the 12:30 PM slot due to the increased number of participants. I waited for almost an hour in a big queue while people flocked into a small room. The workshop coordinators were relatively junior folks who couldn’t manage the chaos and eventually announced they would do the next session in a bigger room upstairs.

The upstairs location was an open one, where the sound system was not at all conducive to a workshop environment. On top of that, the speaker was of very low quality: he was neither good at communication nor did he have much idea about the technical aspects. He went on demonstrating robot building with a totally different kit (in which the Mango Pi board plays only a part), involving other components like an Arduino board, RF sender and receiver, etc. I got totally irritated with the poor organization and ended up leaving the workshop within 15 minutes. When you organize an event of this scale, proper attention should be paid, as it creates a strong impression of the product being demonstrated. Executing it in such an ad-hoc manner resulted in nothing short of a disaster.

Apart from the items mentioned above, there was a series of talks on multiple themes which I couldn’t attend due to personal time constraints. Probably I should plan ahead and attend those sessions next time. Overall it was a decent conference, which gave me deeper insight into many aspects of embedded systems, educational kits, open source, Linux and related technologies.

Embedded self learning kits

Embedded systems has been my area of interest ever since I attended the computer networks course during my engineering days. In those days networking devices were the primary examples of embedded systems, as custom-designed hardware and software made networking tasks (packet switching, routing, configuration & management, etc.) faster. As a student, Linux (or UNIX) on a PC was the only option, where the target embedded image had to be tested on the same machine; the real kick of ‘embedded’ software was absent. Developing embedded software on a host PC, using a cross compiler/linker to generate a target image, and deploying it on target hardware (typically a board meant to perform certain functionality) was something an individual could only experience in a professional work environment, to make the ‘embedded’ learning complete.

Over a period of time I have seen the landscape change significantly, with multiple low-cost self-learning kits/devices flooding the market. Starting with Texas Instruments’ Pandaboard, the learning-kits ecosystem moved to a different level altogether. The entry of the Raspberry Pi at a $25 price point about a year back brought in further changes. Once the initial hardware is out in the market, tons of open source enthusiasts backed by the community create the necessary software (ex: SDKs) and complementary projects. This has opened up a new gamut of self-learning opportunities, where individuals can learn the latest embedded systems concepts and programming and complete interesting projects right from their homes or hostel rooms. A booting Linux machine is all one needs to get started on these embedded learning kits.

Of late there are multiple domestic providers in this field as well. The Kits and Spares online shop provides a whole bunch of such devices with which an individual can create small and useful projects. There are also specific training providers like Thinklabs, who not only provide kits but also offer training on interesting projects like robotics that can be built around the device. It has been real fun to see the combination of low-cost, democratized hardware with open source software, which is making embedded systems learning very easy.

Shortly I am looking forward to laying my hands on one of these devices. Will share more details after that.

Data structure assignments and torture

Data structures & algorithms form the backbone of programming. Any computer science graduate is expected to have very good experience using various data structures like linked lists, queues, stacks, trees, hash tables, etc. During my REC Warangal days, passing the data structures course was real torture. The professor would ensure we slogged to the bone, with a very strict mechanism to evaluate every assignment. Let me explain this in detail.

To start with, every week a new assignment topic would be given. We would create a basic design and start coding while the theoretical part was still being taught in the classroom. By the end of the week (Sunday 5 PM) we were supposed to copy the corresponding C file into a particular directory in a particular format. Even at 5:01 it would not be allowed, as an automated script blocked write access to the directory. Following this, every C program was run through a shell script, which picked 20% of its lines at random and deleted them, placing a special marker (ex: /* $ */) as a placeholder. There was also a mandate that assignments should not have any comments, preventing students from escaping by filling up comment lines instead of actual C statements.

The story is not over yet! During the lab session (in the following week), one hour would be provided in which each of us had to fill in the deleted 20% of lines, followed by successful compilation and execution of the program. Here also the timing was very strict: after exactly one hour a script would automatically log each of us out of the computer. In the final phase of evaluation, each of us had to walk through the truncated program (from the previous phase, in whatever state it was) and answer some difficult data structure questions asked by the professor. If at any point the professor suspected a copied assignment, the whole assignment score would be nullified.

At the age of 18, it was too much pressure to handle. Completing the program on time, copying it to the specified directory before Sunday 5 PM, missing meals, skipping sleep, trying to fill in the missing lines within an hour and answering questions was too long a process. To make it effective, the professor distributed marks across all these phases, so one could not escape without working hard. At the end of each evaluation I would heave a huge sigh of relief and sleep like a baby for hours together. Each of us used to curse the professor for torturing us so much!

Today when I look back, I get a totally different perspective. If not for that strong evaluation mechanism, each of us could have become lazy and never learned the art of programming. We could have mugged up some programs and passed the exams. In another post about going technically hands-on, I mentioned debugging some of my old kernel programs within a week, even though I had been out of touch with programming for years. The DNA injected in the form of data structure programming is still in my blood, which is helping me pick up programming with ease.

I have also tried out such a shell script (deleting 20% of random lines from a C program); will upload it soon.
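In the meantime, here is a minimal sketch of what such a script could look like. This is my own reconstruction, not the professor’s original; it assumes GNU `shuf` and `awk` are available, and uses the /* $ */ placeholder described above.

```shell
#!/bin/bash
# blank20: replace ~20% of the lines of a C file with a placeholder,
# keeping the total line count intact so the gaps stay visible.
blank20() {                            # usage: blank20 input.c > truncated.c
    local in="$1"
    local total n
    total=$(wc -l < "$in")
    n=$(( total / 5 ))                 # 20% of the lines, rounded down
    if [ "$n" -eq 0 ]; then            # file too small: nothing to blank
        cat "$in"
        return
    fi
    # pick n distinct random line numbers, then blank exactly those lines
    shuf -i 1-"$total" -n "$n" |
    awk 'NR == FNR  { del[$1] = 1; next }        # first input: chosen line numbers
         FNR in del { print "/* $ */"; next }    # blank the chosen lines
                    { print }' - "$in"
}

# demo: blank out 4 of 20 lines
seq 1 20 > /tmp/sample.c
blank20 /tmp/sample.c > /tmp/truncated.c
```

In the actual setup the output presumably overwrote the submission; while experimenting I keep the original file around.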

Technically, hands-on

Often being ‘technically hands-on’ becomes a critical skill, no matter what role/responsibility an individual handles in an organization. Bitten by the same bug, I thought of making my hands ‘dirty’ by working on some of the older Linux programs I had created. Long back, Yahoo used to offer its GeoCities service, where individuals could create their personal websites. There were no automated wizards in those days, so I ended up creating HTML pages on my own and uploading them to a particular location (provided by Yahoo), thereby keeping all my older projects in the ‘cloud’. Eventually the GeoCities service was discontinued by Yahoo, but luckily a replica is maintained under the *.ws domain, from which I was able to retrieve all my old projects and the corresponding data. Here is the URL where I took my first baby steps in personal website development and writing: http://geocities.ws/b_jayakumar2002

Cut to Linux! I downloaded two of my older projects, which perform the following functionality:

Both programs were written against an older kernel version (2.4.2), whereas the current mainline is at 3.11. In the meantime glibc (the GNU C library) has also gone through significant changes, so some of the older routines and data types may not work as expected. Since I had been out of programming for years, I had some initial difficulty getting these programs working. However, I was taken by surprise by how much the open source help system has evolved over the years. Let me state my key observations as follows:

  • Thanks to virtualization, I was able to get a development machine up and running in a matter of 30 minutes using VMware. After trying out multiple distributions (SUSE 11, openSUSE 12.3 and Ubuntu 12.04) I decided to go with openSUSE 12.3, as it offered all the pre-built libraries. No doubt Ubuntu offers an excellent user interface, but it suits desktop users more than programmers.
  • Compiling the kernel and booting a new image has become much simpler. There are far fewer manual steps to follow, as some of them (ex: making an entry in GRUB) happen automatically. I still remember how challenging it was to get a kernel 2.4.2 image up and running with the LILO loader!
  • There are tons of Linux-related documentation, help sites and real-time experience sharing, which makes getting help much easier. In particular, Stack Overflow (http://stackoverflow.com/), TLDP (http://www.tldp.org/) and the Linux cross-reference (http://lxr.free-electrons.com/) seem to provide everything from syntax to data structure tracing inside the kernel.
  • For theoretical reference, Slideshare (http://slideshare.net) has excellent presentations, which I was able to quickly refer back to whenever theoretical questions came up.
  • The Linux kernel debugging ecosystem has also matured a lot. There is a bunch of diagnostic tools available (though I only used strace and printk), which makes kernel debugging much easier. I need to explore more of both the user-space and kernel-space Linux debugging tools.

It was fun to catch up with programming after a long time. Will share more as I explore further into the world of Linux, the kernel and open source!

What killed the Linux desktop?

Recently there was an article titled ‘What killed the Linux desktop?’ by Miguel de Icaza. Miguel is one of the most popular free software programmers and played a key role in creating popular desktop environments like GNOME. In his article, Miguel clearly states some of the key reasons why Linux has not become a successful desktop operating system.

Based on his article, I would like to add some of my own viewpoints:

Loose coupling of kernel and GNU: One of the key contributors to the poor user experience of Linux-based desktops is the loose coupling between the kernel and GNU software. The kernel, an engineering marvel by itself (thanks to a strict governance model), never had any commercial intent in mind, at least when it started. When various GNU software got bundled to create a desktop Linux distribution, loose coupling crept in. If a novice user faces any issue, he will not be able to figure out why and what exactly is going wrong. Version incompatibilities, dependent library/binary issues, unclear/ambiguous documentation and not-so-strong community support for a normal user became bottlenecks, which hampered faster adoption. While engineers and technologists enjoy this distributed, democratic model of development and get a ‘kick’ out of experimenting with anything and everything, it is far from providing a good user experience for a normal user.

Reverse approach by OS X: The article also talks about the totally opposite approach taken by OS X, which focused first on user experience by targeting normal users. Today everybody understands how advanced OS X based products are when it comes to user experience. Even though it was initially perceived as a ‘closed’ system by developers and hackers, that perception slowly changed when they started providing more programmer-centric features as part of OS X. Even though I have not experimented with an OS X terminal, I am sure that as a programmer I would be able to achieve almost the same things I can with Linux.

Fundamental philosophy: The bottom line of the Linux desktop challenge leads to the fundamental philosophy on which Linux, or open source software, is built. Since the major objective was to provide freedom for users, it is still led by programmer-centric thinking rather than user-experience-centric thinking. When somebody starts thinking about user experience, it invariably leads to commercial intent, which may not go well with the free software philosophy.

Unless a tightly integrated system is built around Linux (ex: Android), it will remain far from being a popular desktop operating system.

Customizing open source software

The Open Source and Linux saga seem to be never ending for me!

All of a sudden my Windows 7 installation stopped booting, probably due to a virus attack. Bitten again by my interest in open source, I installed openSUSE 12.1 as my desktop operating system, as I had been pretty happy using it in a virtual machine earlier. The installation process was a breeze; all the basic functionality, including the wireless interface (where I had problems with Ubuntu 10.04), came up without any issue. Just when I thought everything was fine (as I have been thinking for the past 10 years), one major problem popped up.

I have a Toshiba Satellite L640, which started heating up a lot after the openSUSE installation. To add to my woes, the battery backup was hardly 10 minutes. In spite of searching many online forums (and reading some stuff about the ACPI interface) I couldn’t find a fix. While many threads in discussion forums acknowledged the problem, no solution was available; even where one existed, it was too geeky and involved hacks that were not at all obvious. With a pretty decent understanding of Linux internals, I was not able to figure out the solution, let alone a novice customer finding it. The bottom line is that many consumer-specific issues (ex: laptop issues) don’t have an organized approach to solving them.

This incident popped a few interesting questions into my mind. We can’t expect the openSUSE community to provide solutions for every possible piece of consumer hardware in the market. Since open source development is done by thousands of developers around the world, we can’t expect them to have vendor-specific implementation information (ex: hardware specs) available. On the other hand, I am not sure it is Toshiba’s responsibility to release compatible software. Does this incompatibility issue offer a business opportunity?

On the enterprise side, Red Hat has implemented a model where the basic open source software is provided at a very nominal cost, and they make money by selling customization services. Along similar lines, does providing open source software customization services for consumer markets offer a valuable proposition? Can some innovative options be thought of to implement such services and make them business-viable? I am currently not able to do a complete business analysis of this, but this area can definitely be explored with some innovative approaches.

Making Virtualization work

My post about the shortcomings of Linux touched upon some key aspects that need to be fixed to make Linux a successful consumer desktop operating system. However, I badly wanted to get Linux up and running on my Windows desktop (without harming the existing Windows 7 installation) for some key learning activities. One of my senior colleagues suggested I take the virtualization option (using Oracle’s VirtualBox), where I could have Linux as a guest OS inside the Windows 7 machine itself. After some initial glitches (details below), I got what I wanted: a safe Linux installation with wireless internet.

Installation of openSUSE 12.1 as a guest OS using VirtualBox was very smooth. All I needed to do was create a new virtual machine instance and insert the live CD. That’s it! I was able to see openSUSE 12.1 working along with all related applications. However, I was still not able to browse the internet, as the wireless device was not being detected. The “lspci” command (which lists the PCI devices available on the host) didn’t list the 802.11 wireless interface. Ouch! For a second I thought I was back at square one with the wireless driver problem I had encountered during my Ubuntu 10.04 installation. But this time I thought of getting internet connectivity over the wired (802.3 Ethernet) interface before experimenting with wireless.

When I executed the “ifconfig” command as super user, the eth0 interface displayed an IP address from a different subnet (10.x.x.x, ref image1), whereas the Windows installation was on the 192.168.1.x subnet (ref image2).


Image1 - Guest OS Linux Network interface



Image2 – Native Windows Network interface

Assuming this was the issue, I immediately changed the eth0 IP to the 192.168.1.x network. After this change, I was not even able to ping the wireless DSL gateway, let alone get the internet connection working. I figured something basic was going wrong! After discussing with another colleague, I understood a few important (rather basic) things about virtualization:


  • VirtualBox creates a virtual network driver (ref image3) in Windows, which acts as a DHCP server for the guest OS. This applies to both wired and wireless connections.
Image3 - Virtual network interface


  • The guest OS virtual interface uses NAT (Network Address Translation) for transmitting and receiving packets through the native Windows operating system. This explained how the Linux installation got a 10.x.x.x address assigned to its wired interface by default.
  • Even though “ifconfig” lists only one wired interface (with the NAT address), VirtualBox automatically takes care of routing packets over both the host’s wired and wireless interfaces. All that was required of me was one thing: do nothing!

Finally, I was able to get a safe Linux installation working with wireless internet, with ZERO configuration changes. I have learnt quite a lot by running small experiments; rather, that is the only way you learn about Linux.

The Business of Open Source

In my previous post about Linux, I touched upon some key aspects Linux lacks in order to become a successful consumer desktop operating system. However, as a programmer/engineer, Linux is probably the best tool I can think of when it comes to community-based development and collective wisdom. For the last few weeks I have been spending a lot of time studying the open source ecosystem and how various businesses leverage it for their benefit. Let me lay down three major observations based on my study so far.

Open Source - Big deal of Business

Platform Linux – Unlike a few years back (say 2007), open source Linux has become much more mature, with the operating system treated as a platform. By adopting the OS as a platform, products (mainly embedded) can do a lot of customization depending on their needs. This advantage is aided by multiple middleware vendors (ex: Enea) who offer integrated options such as a build framework, open source package integration, patch configuration and management, and tool-chain support for various architectures (mainly ARM/PPC/x86), making it very comfortable for any product to adopt Linux, which was a huge challenge a few years ago. After getting the basics working with the middleware platform, products can develop custom applications depending on the functionality. Based on my observation, customizing and delivering Linux as a platform is itself a huge business, which most of the popular vendors (Wind River, SUSE, MontaVista) are doing. This is a remarkable change compared to a few years ago.

Quality at every step – The open source community of developers has attained a high level of maturity, built over two decades. Since the main source code for the kernel (kernel.org) and individual projects (sourceforge.net) is maintained by volunteers driven purely by passion, they ensure proper code review is done and the approval process is followed before committing any change into the main branch. In case of any issues, the ‘self-detection-and-self-healing’ approach adopted by open source ensures they are rectified at the earliest. There is also a bunch of benchmarking tools (ex: the Linux Test Project) available to qualify the changes made. Since everything is driven by volunteers who stick to a set of common goals, open source is no longer a toy in the hands of geeks.

Corporate support – Now that big organizations have realized the power of Linux, they support the community very strongly. Every other Linux conference or event receives a huge financial and resource boost from these large organizations, which is very heartening to see.

Again, these are my initial observations as I try to learn more about the world of Linux and open source. Some of them might look very basic to an expert. Going forward, I will add more in-depth information based on my learning.