A Blog that seems to work sometimes... Listening on port 65536

Sunday, December 18, 2005

The Uncertainty Principle for Technology

Lately, I have had a lot of free time to myself. Not that there's been no work - no - but basically, I've been goofing off for some time - primarily because of this bloody disease called "common cold". Anyway, with a light head (literally empty) and some time at my disposal, I expected myself to come up with something ridiculous for my blog - so here it is.

How does one measure what people want from technology? In other words, how does a technologist determine which problem he/she should solve? Well, (s)he looks at the current state of things and says, hey, this could be done better - or maybe sees something which is not being done and starts doing it. And then, (s)he comes up with a new technology/method/algorithm/notion/... which attempts to solve the problem. So what's wrong with this? Evolution, that's what!!

Let's take the example of computers. In their early days, people[1] knew nothing about computers. Then IBM, Apple etc. popularized computers to the extent that a great many individuals got involved and started using them. As a result, the very things that were required of computers changed (or, more correctly, evolved). TTBOMK, computers are being used almost everywhere, and you can find people in every nook and corner who know tons about computers. So what is really needed from computers now?

My point here is that people are evolving to learn a lot of technology. We can draw a parallel with Heisenberg's uncertainty principle here. Crudely, the principle says that if you use light to measure both the position and the momentum of a particle, you can't do both accurately, because the light's energy would move the particle or change its momentum.
Similarly, if you try to evaluate the success of a technology, you will find that people have already evolved along with it, and the technology hardly measures up to what they now expect.
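
For the record, the standard textbook statement of the principle (and not my hand-waving version above) is the inequality

    \Delta x \cdot \Delta p \ge \hbar / 2

i.e., the product of the uncertainties in position and momentum can never be pushed below half the reduced Planck constant - sharpen one measurement, and the other necessarily gets blurrier.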

I can take a small example - something which I have noticed. There's this notion of data mining, which is supposed to help non-IT companies with business intelligence. With increasing awareness of data mining techniques, thanks to extensive customer-training programs, these non-IT companies have started expecting more from the software. It is as if you were constructing a complex solution to somebody's IT problems, and somebody "educated" that person about everything involved in constructing such a solution - your efforts would mostly be wasted. So how does one estimate what a customer expects? I mean, go figure.


[1] Normal people - not nerds/geeks/engineers/idiots

Friday, December 09, 2005

Adobe - the first close encounter

I recently got selected for a job at Adobe. Here's how it all happened, and my views.

[The process]
The selection process consisted of a written test followed by three rounds of interviews. The written test had some four sections - two of them basic (school) mathematics and short puzzle-type questions. The other sections tested the basics of algorithms and programming. The interviews, on the other hand, tested all areas of computer science, including database systems, programming, object-oriented concepts, problem-solving, geometric algorithms, operating systems and programming-language principles. This was contrary to what had been expected of an organization that works predominantly in multimedia software. Many attendees of the interviews expected them to test their knowledge in the more specialized areas of computer graphics and vision. However, it turned out that the interviewers were least bothered if we had any background at all in computer vision/graphics or its allied areas of study. This came as a surprise to many interviewees. The interviewers were keen to test systems concepts, and most of the time core computer science concepts dominated the interviews. This gave those students who had explored a bit of all the worlds a slight edge over the specialists.

There was one interview which tested the students on raw problem-solving. Apparently, the interviewees were not expected to solve the problems completely - but were required to demonstrate the ability to explore solutions in different ways.

Considering the manner in which they conducted the interviews, there seemed to be a wide gap between their expectations and what the interviewees were prepared for. They were expecting thoroughness in operating-systems concepts while the preparation was for vision/graphics etc.

[My Experience]
I started my exam about one and a half hours later than the others. The people from Adobe very kindly allowed me to write the test, but basically I had to rush through it. I believe that I wrote the test pretty well. The good thing about the written test was that the aptitude part was meant only for an initial screening. It was not _the_ thing for them, unlike TCS. The next day the interviews started at 11:00 for me. My first interview was not very encouraging. The person asked me about some basic geometric algorithms and programming, and I was not very confident in answering. The worst part of it was that those were the questions I was best prepared for. I have been coding for over 8 years now, and I felt that I could not have been any worse than I was at that moment. There was this guy who questioned me - French beard, nasty stare, and all. I was constantly distracted by him staring at me and putting something down on his laptop. He gave me exceedingly simple problems to solve, and it took me an eternity (I thought) to solve them. However, they allowed me to go through to the second round of interviews.

The second round was not as perplexing as the first one. This person asked me basic questions on:
C++: Inheritance, overloading, virtual functions, late binding, etc. (see the small sketch after this list)
Systems concepts: Program image/executable structure, segments, implementation of callbacks, RPC, synchronization, etc.
Database concepts: Transactions, locking, basic SQL, buffer-management.
Computer networking: IP, TCP, sliding-window protocols, etc.
Operating systems: Memory management, Kernel calls, Scheduling etc.
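
Just to give a flavour of the C++ part (this is my own toy sketch written after the fact, not an actual question from the interview), "late binding" simply means that a call made through a base-class pointer is resolved at run time via virtual functions:

#include <iostream>

struct Shape {
    virtual ~Shape() {}
    // virtual: the call is bound at run time, based on the object's actual type
    virtual double area() const { return 0.0; }
};

struct Circle : Shape {
    explicit Circle(double r) : r(r) {}
    // overrides Shape::area for circles
    double area() const { return 3.14159265 * r * r; }
    double r;
};

int main() {
    Shape* s = new Circle(2.0);           // static type: Shape*, dynamic type: Circle
    std::cout << s->area() << std::endl;  // late binding: Circle::area() is called
    delete s;                             // virtual destructor makes this safe
    return 0;
}

Drop the virtual keyword from Shape::area and the same call would print 0 instead of the circle's area - roughly the kind of distinction the interviewer wanted spelt out.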

I told him that I was not familiar with Windows concepts, since I had never really used Windows for development. So most of the time, the questions revolved around Linux (which is one of my favourite topics).

Then he asked me about my favourite project. I told him that all of my projects were quite dear to me - but he insisted on talking about one. So I decided on my final-year B.Tech. project, a web-browsing system (web browser plus caching proxy) which we (my teammates and I) had built in my undergraduate program. This was basically an enormous C++-based text web browser built from scratch, very similar to Lynx. He asked me how we designed it and what kind of problems we faced while designing it - and I told him everything, because that project was _really_ done well - complete with all documentation in UML and thoroughly designed on paper. So that was quite okay.

The last interview was the most interesting. The person in the interview literally guided me to the solutions of the puzzles he asked me. He gave me some puzzles and asked me to explore possible solutions. The interview required me to explain the solutions on the blackboard (thanks to my TAship experience, there was no hesitation in that). All in all, the game was to think (imagine, if you please) as much as one could. This last interview was the happy part of the process - with no tension brewing on either side.

[My Views]
Objectively speaking, here's what I think of the interviews and our preparation for them:

1) The interviewers were looking for general computer science knowledge.
2) They expected thoroughness in systems concepts.
3) They seemed to be very critical of our institute - however, that is something they did _not_ indicate in my interview. Others (mostly undergraduate students) told me that they were criticizing the institute for not promoting basics as much as research.
4) People who had explored more than one area of computer science to a reasonable depth had a slight edge in the interviews.
5) Programming on paper (courses like graduate problem solving) helped me a lot.
6) Knowing the general things well is the most important thing for an interview.
7) The model for the selection process, in my opinion, is _not_ good. This holds true not just for Adobe, but for any organization that judges individuals in one or two days. How can they do that?

There ought to be better channels for recruitment - maybe through the faculty of the institute. They are the people who observe their students' performance. There could obviously be a lot of arguments against this, but in an institute like IIIT, I think, this could be the best model. How nice it would be if every student in this institute worked well towards some end, and ultimately got placed in an organization that works in his/her area of interest.

Getting a great GPA seems to be a _Requirement_ for recruitment. I think that doing well in academics is one thing, and getting grades, another.