Ravi Mohan's Blog
Monday, January 30, 2006
To Teach is To Learn Twice ... And More
I am sometimes asked by clients to implement Artificial Intelligence based solutions to problems involving massive datasets, real-time requirements, etc. A non-trivial part of this work is transferring the knowledge of the AI algorithms and the underlying maths to the client's programmers. After doing this a few times, I now have a clearer idea of some of the difficulties involved and some partial solutions for them.
- Problem 1: Transferring Mathematical Intuition
Using AI effectively often demands a deep understanding of the mathematics underlying whatever AI 'paradigm'/algorithm you use. For example, to even think about whether neural networks are a good solution for a problem, one needs to understand linear algebra at a very intuitive level. To many people, maths == a set of equations or symbols to be manipulated. This manipulation and rearrangement of concepts to achieve precise effects is something programmers are naturally good at, so the tendency is even more deeply rooted in good programmers.
For example, a lot of us have learned the equation F = m * a, where 'F' is force, 'm' is mass and 'a' is acceleration. To get through our exams we treat this as a kind of pluggable system in which two quantities are known (or can be derived from the given data) and the third has to be calculated. This looks fairly simple. To check whether you understand the reality underlying the equation, ask yourself this: "Does force cause acceleration? Or vice versa? Or both? Why? When?". The use of equations to *calculate* a quantity is different from being able to think effectively in terms of the concepts underlying the equation.
To use mathematics effectively one has to constantly translate between the world of mathematics and the domain of the problem. Very few people (including mathematicians) feel the need to acquire this skill. People who use mathematics to get work done (e.g. physicists, astronomers, race car designers) acquire it out of sheer necessity. Forcing oneself to translate (and asking students to translate) back and forth between English (without using mathematical terms) and mathematics is a valuable exercise. (Try doing this for a simple differential equation and you will see what I mean.)
Thus, it isn't possible for someone to grab a few equations out of the latest AI/Pattern Recognition/Robotics book and apply them straight away to solve real world problems. To even know what is possible takes an understanding of the mathematical underpinnings of the various "families" of AI algorithms, and this intuition takes a long time (at least for folks like me) to acquire. If programmers are just shown some algorithms or mathematical proofs, it is very unlikely they will be able to program an effective system, or even maintain one as its environment changes. This is complicated by the fact that intuition may take a few years to acquire but has to be transferred in a week or less.
While there is no perfect solution to this, I find that one can transfer large "chunks" of intuition through carefully selected (sequences of) motivating examples. In my first "iteration" of teaching, I said (in effect), "OK, here is the (basic) theory. This is the algorithm in pseudocode. Use this and you will get the results you want." Everyone seemed happy, but the whole effort "thrashed" for quite a while before delivering the (astoundingly effective) results. So these days I try to instill an understanding of the theory distinct from the programming effort. More of a "teach how to fish" approach, while giving the student enough "fish" that he doesn't starve till he learns to fend for himself. Thus, instead of just throwing out the algorithms for (say) prediction and monitoring of data streams using Markov models, I start with real world examples of prediction vs monitoring and slowly feed in the maths (equations first, trivial programs next, then proofs, then real world programs). (A toy example of the kind of "trivial program" I mean appears at the end of this post.)
Given sharp "students" (as most of the programmers I interact with on a daily basis are, thank God!) this works surprisingly well. I sometimes feel I am spinning a rope bridge across yawning chasms, but the notion of a "slice" through a system works just as well in mathematics as in agile "story card" based implementation. Once a few cables are thrown across, programmers feel brave enough to explore the abyss a few feet at a time without too much assistance.
- Problem 2: Getting beyond "right" and "wrong"
Sometimes, in spite of my best attempts at simplifying and communicating, my "students" (I think of them as peer programmers rather than as students, hence the quote marks) misunderstand and "screw up" the concepts and write strange programs or advance baffling arguments. In the beginning my response was "That is NOT what I said! THIS is what you need to do. PLEASE look at the examples/notes/code. Aaaargh!!!! Not again!". A more effective way is to try to figure out what the student's mental model is, faulty though it may be. So now, instead of reacting to a totally off-track argument by (mentally) banging my head on the nearest wall, I say to myself, "What he says makes sense to him. So what mental model/thought processes has he adopted that cause him to see the world in a fashion that makes this argument logical?". The moment I can identify this precisely, I am able to come up with an illuminating example that demonstrates the error. And sometimes it turns out they are on to something important and it is my perceptions that need fixing. "Learn twice" indeed.
- Problem 3: Ability to Do != Ability to Teach
Sometimes the "teaching" gets so fatigue-inducing that I think to myself, "I could have coded up all this in one tenth the time it takes me to communicate it to someone else". My client, being wiser than I, insisted on the teaching approach. What I have realized is that many people (including yours truly) know how to do many things but are not often able to explain how they do them, or to teach others to do likewise. When you teach others you have to know each concept in a crystal clear fashion and also grok all the interconnections between the concepts. Teaching often involves rearranging concepts in increasing order of complexity and interdependence, seeking real life and programming examples that illustrate each facet of a concept with great clarity, and slowly transitioning into complex real world problems and solutions. After teaching others, I understand many things at a much deeper level than I used to. That being said, teaching consumes massive amounts of time. I have to work ten hours or more to prepare for a three hour "lecture". Thus "teaching" conflicts with "doing". For the time being, I'll focus on being a programmer more than a "teacher" type, but I wonder how others manage. There is a lot more I could write, but this post is long enough. In another entry I will look into the difference between "real world" and "toy" AI.
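Postscript: the toy "trivial program" promised above. This is a sketch of my own in Python (the names and the sample stream are made up for illustration; it is not the client's code): a first-order Markov chain learns transition counts from a symbol stream, prediction picks the likeliest next symbol, and monitoring flags transitions the model considers improbable.

    from collections import Counter, defaultdict

    def train(stream):
        # First-order Markov model: count transitions between consecutive symbols.
        counts = defaultdict(Counter)
        for prev, nxt in zip(stream, stream[1:]):
            counts[prev][nxt] += 1
        return counts

    def predict(counts, state):
        # Prediction: the most likely next symbol given the current one.
        return counts[state].most_common(1)[0][0] if counts[state] else None

    def surprise(counts, prev, nxt):
        # Monitoring: how probable was the transition we just observed?
        # Values near zero flag anomalies in the stream.
        total = sum(counts[prev].values())
        return counts[prev][nxt] / total if total else 0.0

    stream = list("ababababcabababab")
    model = train(stream)
    print(predict(model, "a"))          # -> 'b'
    print(surprise(model, "b", "c"))    # rare transition -> low probability (1 in 7)

The real systems are much more elaborate, of course, but this is the level at which I start before feeding in the equations and proofs.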
Wednesday, January 25, 2006
Some people really get it..
from Xooglers
Sergey once asked a large assemblage of Googlers what our greatest corporate expense was. “Health insurance!” was one answer shouted back. “Salaries!” “Servers!” “Taxes!” “Electricity!” “Charlie’s grocery bills!,” came back others. “No,” said Sergey. “Opportunity cost.” He explained that the products we weren’t launching and the deals we weren’t doing threatened our economic stability more than any single line item in the budget.
How insightful is that? No wonder Google has no problems hiring good people.
Laptop Wars 3 - And the Winner is
an IBM-Lenovo Thinkpad z60m. The model I chose is marginally less powerful (and less expensive) than the one reviewed in the article I linked to, but I have added some goodies like a hot-swappable battery. I paid less than $2k for a jazzed-up-to-the-gills Thinkpad.
The MacBook finally lost out because (a) I don't want to pay good money to beta test Apple's new hardware architecture (the G4 PowerBook is too underpowered and (soon to be) abandoned by Apple anyway, not to mention the terrible display issues on some of them which Apple pretends not to notice), and (b) reliability issues with a MacBook are likely to be life-sucking given the atrocious customer service here in India. If I lived in the USA or Europe I might have been more tempted by the MacBook.
IBM has excellent customer service in Bangalore, and I have taken out a 3-year warranty ($130 vs $340 for the MacBook). The Thinkpad also outmatches the PowerBook in sheer ruggedness. Hopefully in a couple of years Apple will have worked the kinks out of their new hardware choices; I will wait till then to drink the Apple Kool-Aid.
My new laptop is on its way to my friend (who lives in the USA), so I should get my hands on it in early March. I will post my experiences installing Linux on it then.
Sunday, January 22, 2006
Laptop wars 2 - Macbook Pro Vs IBM Thinkpad
A brief update.
I am now thinking of buying either a MacBook Pro (if it comes out before Feb 28, the day my friend, who will bring it to Bangalore, leaves the USA) or an IBM/Lenovo Thinkpad (extremely well suited to Linux: multiple sources confirm that suspend-to-disk and power management work well, and the Thinkpad has a reputation for rugged construction). The MacBook will cost me about $2,600 (including taxes) and the Thinkpad should be about $1,200. The trouble with the MacBook, of course, is that it is based on new hardware, and Apple has historically had problems with new hardware, the 12-inch iBook G3 having a particularly horrendous history.
In either case, I'll be buying the machine from the USA. Prices in India are insane, with a surcharge of almost 25% on any model, the MacBooks being particularly pricey. For that kind of price differential I could buy a 20-inch Cinema Display in addition to the MacBook. Anyone returning from the USA can bring in a laptop with no customs duty, thus making it cheaper by about 25-30%. Duh! Am I the only one who thinks that is a particularly stupid way to run an economy? And India is going to be the 21st-century superpower? Yeah, right!
Saturday, January 14, 2006
The Agile Religion and the need for Merciless Pragmatism
Does anyone else get the feeling that "Agile" is now a religion, with its holy books, its many patriarchs, its churches, and its commandments ("Thou shalt work in pairs")? It is just a set of useful practices: master them, adapt them, use what works, discard the rest, and get on with life.
The moment 'agile' becomes some kind of religion, with itinerant preachers of the Holy Word attempting to convert the unwashed heathen to the worship of the True God, it's time to step back and take a hard look at what is being attempted. Kent Beck made a list of practices which he found useful. Later a whole bunch of marketing was thrown at it, and we have the present situation where people speak of "true agile", "distributed agile", "XP 2nd edition vs XP 1st edition", and so on.
Another factor often ignored is that most of the "agile" methodologies (to use the word loosely) have their origins in the world of enterprise programming, where teams of programmers wrestle with coding up systems for banks, insurance companies, etc. Thus there are a lot of assumptions built into Extreme Programming and the other agile methods which simply don't hold in other contexts.
As a simple example, think about "Customer" and "onsite customer" in the context of a massive open source effort like the Linux Kernel. These terms just don't make sense. Any effort to twist the word to mean "the community", etc., just robs it of meaning.
There is an even deeper notion here: the separation of "what to program" (the customer) from "how to program" (the coder). A "luminary" who spoke on Agile recently said, "Programming is all about taking knowledge from others and converting it to code". (Really! I am NOT making this up!)
"Take knowledge from others". Bzzzt! Cluelessness Alert! This may be true in the practice of consulting/business app/enterprise programming today, but not necessarily elsewhere (Kernel hacking, scientific programming, compilers, embedded programming, etc.).
I think this view of the programmer as some kind of "coding body", with the "domain knowledge" residing elsewhere, is deeply embedded in many "schools" of "agile". More about this in another blog entry, but this separation of "what" and "how" carries many built-in assumptions about the context in which programming occurs.
There are whole worlds of programming outside the "enterprise" world where the 'agile practices' apply very tenuously, if at all.
And even within the enterprise programming domain, agile/XP/whatever-the-latest-buzzword should be constantly (re)evaluated, adapted and modified, not adopted and propagated mindlessly (I have been guilty of such behaviour in the past and use this blog post to unreservedly apologize to my victims).
If I want some kind of religious experience I can go to my nearest Descent-Of-The-Holy-Spirit-Scream-And-Yell-And-Speak-In-Tongues Church. When I go to work I want logic, pragmatism and rationality.
I have adopted the "constantly growing test suite", "refactoring" and "continuous integration" ideas from XP in my work and jettisoned the rest. It is simply impossible to "pair program", do "test driven design", etc. in the context in which I work.
TDD is, in retrospect, an insane approach to design. But more about that in another blog entry.
To conclude, the "agile" schools of programming do have many useful ideas. But "agile" is neither a scientific theory (like Relativity) nor some kind of Divine Revelation. It is just a list of practices which work(ed) for some people in some contexts. Beware of "Agile Consultants", "Agile Enablers", etc. Lock that chequebook away and do some focussed thinking and experimenting. Pragmatism trumps religious fervor any day.
Tuesday, January 10, 2006
PowerBook Vs Linux Laptop
I have always wanted a powerbook. And with some "out of band" money coming in from a consulting assignment, I was planning to buy one.
I actually wanted to buy one pronto and if I were not waiting for Steve Jobs's address tomorrow, I would probably have bought one.
After I got home I was reading some reviews of the PowerBook and came across this.
WTF?
I have to pay Apple a premium price so they can dump bad hardware on me? My Compaq Presario 2100 has been treated very roughly over the last 3 years and still works without a hitch.
I am now inclined towards getting a good Linux laptop and am waiting for my friend (and Kernel Hacker) Mridul Jain to recommend one. I particularly need hibernation (to disk, not to RAM) and power management to work flawlessly.
Later in the day I'll investigate the display problems in detail. I am hearing more and more horror stories about Apple's machines which makes me very leery of buying them in a country with terrible consumer protection laws.
Update: the PowerBook is definitely out of the picture. Linux forever!
Sunday, January 08, 2006
Startup !
It is official. My friend Rajesh and I just shook (virtual) hands over an interstate phone connection on an agreement to create a startup company.
Right now there is just the intent to create a startup. We were talking of various ideas and suddenly it was, "Hey, why don't we start a company and do this instead of talking about it?" -- "Yeah! Makes sense! Let's do it. Pick a day to launch." -- "How about April Fools' Day?" -- "I like it!".
So don't ask "What are you building?" (I wouldn't tell you anyway :-)). We have, as always, many ideas being kicked around, and both of us are quite ...errr... unstable in different ways, so what we are doing will probably get nailed down hours before the launch of the company (slated for April 1, 2006, the Paperwork Gods consenting). I would expect there will be a stealth mode for quite some time after the official launch. I have quite a bit of stuff to wind down before April.
I am looking forward to this. The dynamics of a Jedi/Sith alliance promises to be interesting. Our strengths and personalities are very complementary.
Oh yeah. I am finally getting that 17" PowerBook as well :-).
This is going to be fun.
Friday, January 06, 2006
What is your company's M/E ratio?
In simpler words, how many managers do you have per engineer? Look around you. Count the number of managers on your floor/division/office. Count the number of people who actually write or test code. Divide the former by the latter. That is your M/E ratio.
Now reflect on this snippet from this very interesting essay:
"...For instance, you can do what Google does, and give each manager 200 direct reports, rather than the classic 7 or 8, so your management overhead grows with the log base-200 of the number of engineers, rather than the log base-8. That's the kind of clever thinking that comes from hiring math-y Ph.D.s, you know..."
As a thought experiment, imagine that your organization hired brilliant engineers, that every manager had 200 of them reporting to him, and that your CEO, CTO, and the seniormost people in the organization were all PhDs or well-known hackers. What do you think might happen? I believe most "don't rock the boat and get my huge salary for pushing paper" type managers would flee in horror. With 200 reports, they wouldn't have the time to harass people and act like the Dilbertian PHB. They might, you know, really add some value! I think you would end up with really good managers (no, "good manager" is NOT an oxymoron; they are rare but they do exist).
And more importantly, with a log-base-200 growth rate, the empire-building, politics-playing manager will be relentlessly squeezed out.
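To make the arithmetic concrete, here is a back-of-the-envelope sketch (a toy model of my own, in Python; the numbers are illustrative and it ignores tech leads, matrix structures and every other real-world wrinkle) of what the span of control does to the number of management layers and to the M/E ratio:

    import math

    def org_shape(engineers, span):
        # Crude model: every `span` people need one manager, and managers of
        # managers are added layer by layer until one person sits at the top.
        layers, managers, level = 0, 0, engineers
        while level > 1:
            level = math.ceil(level / span)   # managers needed for this layer
            managers += level
            layers += 1
        return layers, managers

    for span in (8, 200):
        layers, managers = org_shape(10_000, span)
        print(f"span {span:3d}: {layers} layers, {managers} managers, "
              f"M/E ~ 1/{round(10_000 / managers)}")

With a span of 8, ten thousand engineers need roughly 1,400 managers spread over five layers (an M/E ratio of about 1/7); with a span of 200, about fifty managers in two layers (about 1/200).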
In the last company I worked at, this ratio was about 1/8. In retrospect, it would have been much better to have had an M/E ratio of about 1/20 (at least). Something to keep in mind if I ever get to create a multi-billion-dollar company.
I think the biggest favor Google has done the world is to prove that an engineer-centric, low-manager-count, ignore-Wall-Street-and-please-the-customers company can make a lot of money. So no matter what their eventual fate, you can be sure a new generation of entrepreneurs will be paying attention. And learning.
Thursday, January 05, 2006
Monads And Causality
Generally I don't post "just links" on my blog. But this time I'll make an exception, because these are very lucid answers to difficult questions.
Question 1: How do I explain Monads to a programmer without resorting to mathematics (Category Theory)? (This in turn came out of an email exchange with Peter Van Roy about explicit state vs monadic state as used in Haskell.)
The answer is here.
This is the best answer I've ever seen which skirts Category Theory and various Haskellisms completely.
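To give a flavour of the kind of non-mathematical explanation I have in mind (this is my own toy illustration, not the linked answer, and the names are made up): present a monad as a pattern for chaining computations while some bookkeeping -- here, "any step may fail" -- is threaded through automatically. A Maybe-style sketch in Python:

    class Maybe:
        # A value that may be absent; bind() chains steps that can each fail.
        def __init__(self, value, present=True):
            self.value, self.present = value, present

        @staticmethod
        def nothing():
            return Maybe(None, present=False)

        def bind(self, f):
            # If an earlier step already failed, skip the rest of the pipeline.
            return f(self.value) if self.present else self

    def parse_int(s):
        try:
            return Maybe(int(s))
        except ValueError:
            return Maybe.nothing()

    def reciprocal(n):
        return Maybe(1.0 / n) if n != 0 else Maybe.nothing()

    ok = parse_int("4").bind(reciprocal)       # holds 0.25
    bad = parse_int("zero").bind(reciprocal)   # "nothing"; reciprocal never runs

Swap "any step may fail" for "a piece of state is threaded through every step" and you have, roughly, the monadic state mentioned in the question.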
Question 2: How can you really know whether one factor causes another? (This came out of a conversation I had with a programmer working on a probabilistic classifier [I am working with them on integrating Machine Learning into their product suite].)
Here are two sets of slides explaining the approach pioneered by Dr Judea Pearl at UCLA. To oversimplify, a new operator (the "do" operator) is added to probability theory, with its own semantics. This goes a bit beyond standard probability, but it is quite enlightening once you "get it".
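To make the distinction concrete, here is a toy simulation of my own (not taken from the slides; the probabilities are made up): a confounder Z drives both X and Y, so X and Y are correlated in observational data even though X has no causal effect on Y at all. Conditioning, P(Y | X), and intervening, P(Y | do(X)), then give different answers:

    import random

    random.seed(0)

    def sample(intervene_x=None):
        # Toy structural model: a confounder Z drives both X and Y.
        # Passing intervene_x simulates do(X = x): the Z -> X arrow is cut.
        z = random.random() < 0.5
        if intervene_x is None:
            x = random.random() < (0.9 if z else 0.1)
        else:
            x = intervene_x
        y = random.random() < (0.8 if z else 0.2)   # Y depends only on Z, never on X
        return x, y

    N = 100_000
    obs = [sample() for _ in range(N)]
    p_y_given_x = sum(y for x, y in obs if x) / sum(x for x, _ in obs)

    do = [sample(intervene_x=True) for _ in range(N)]
    p_y_do_x = sum(y for _, y in do) / N

    print(f"P(Y=1 | X=1)     ~ {p_y_given_x:.2f}")   # association, inflated by Z
    print(f"P(Y=1 | do(X=1)) ~ {p_y_do_x:.2f}")      # intervention: X has no real effect

The observational number comes out near 0.74, because seeing X=1 is evidence that Z=1; the interventional number stays near 0.5, because forcing X to 1 tells you nothing about Z.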
On admitting what you got wrong
from Jeffrey Shallit's blog.
....Blondlot's tale is a cautionary one. By contrast, I offer a case where the proper behavior was displayed, from Richard Dawkins' 1996 Richard Dimbleby lecture:
A formative influence on my undergraduate self was the response of a respected elder statesman of the Oxford Zoology Department when an American visitor had just publicly disproved his favourite theory. The old man strode to the front of the lecture hall, shook the American warmly by the hand and declared in ringing, emotional tones: "My dear fellow, I wish to thank you. I have been wrong these fifteen years." And we clapped our hands red.
Admitting you are wrong is a basic part of the mathematical and scientific ethic. ...
I think it ought to be part of one's personal ethic too.
So here is a (tongue firmly attached to cheek) list of what I was wrong about. Be warned, this is a jumbled list of "lessons learned" and is intrinsically very subjective. Don't expect any coherence.
- Ruby has a Perl flavor.
- Ruby is way better than Java.
- A software developer in Bangalore can't do innovative, interesting work and make good money.
- Developing Enterprise Software, especially "outsourced" software is a worthwhile career.
- A knowledge of Mathematics/Programming Language Theory -insert other 'deep' subject here- is unnecessary to build good software.
- Mathematics is boring
- There is plenty of time to find out what to do in life and there is plenty of time to do it.
- Objects are the best way to design programs.
- Programmers are logical people and not prone to delusions and religious mania.
- Agile Methodologies will work (better than what they are already doing) for most teams/products.
- Most managers have a clue about their work/organizations/what their jobs are all about.
- It is impossible to make massive amounts of money doing nothing/ by making other people miserable/delivering negative value (see above).
- All managers are technically clueless.
- In practice, Management, especially in the offshored software world, really adds value and is not a big con game.
- It is impossible to combine being a manager (to get those massive salaries while doing nothing important) and a techie (to retain a sense of self-worth and give life meaning).
- Dilbert is fiction.
- One person can't change the world.
- You have to be extraordinarily talented to change the world.
- The software industry is immune to political correctness fads because it is populated by logical, educated people (witness the noise around "diversity" or "we need more women in software").
- Engineers can't master business and are bad at "reading the market", "understanding business value" etc and need support in the form of analysts/managers.
- Some behaviours are so morally dubious that people who indulge in them exist only in fiction. Even if such people exist in the real world, they are very rare and you won't encounter them.
- If you know a person well, you can predict what he or she will or will not do.
- Talented people have a greater probability of being morally upright.
- In real life the good guys don't always win.
- Believing in God is better than being an atheist/agnostic.
- Being an atheist/agnostic is better than being a believer.