Vivek Haldar
-
2012-04-07
Otherwise, I’ll go to Vegas. If you’re not drinking or gambling, Las Vegas is a surprisingly good city for writing: when you get stir crazy, you can walk somewhere new. There are lots of restaurants, and no one looks at you strangely for being alone.
— John August, on his writing routine. Who woulda thunk — Vegas has a contemplative side.
-
2012-04-06
…the liberal arts cliché about teaching you how to think is actually shorthand for a much deeper, more serious idea: learning how to think really means learning how to exercise some control over how and what you think. It means being conscious and aware enough to choose what you pay attention to and to choose how you construct meaning from experience. Because if you cannot exercise this kind of choice in adult life, you will be totally hosed.
— David Foster Wallace, giving a commencement speech.
-
Making a stand.
-
2012-04-01
Code review. Just do it.
There is one practice that has had more of an impact on my professional programming life than anything else: code review.
The role of code review at Google has been pretty well documented by now. It is absolutely central to the way the company builds and operates its engineering systems. There is a small river of posts by programmers about how great code review is. And I’m going to add to that, because I don’t think it is something that can be stressed enough.
No matter how great or experienced a programmer you are, code review will always improve your code.
The sad truth is that you are blind to your own code. Once it compiles and runs and does what you want it to do, you don’t want to go over it again. “It runs! Ship it!” No. A fresh pair of eyes will tell you exactly what you missed. And you will go, “Doh! Shoulda fixed that.”
-
2012-03-30
Her less-than-refined writerly day began with finding her notebook, which surely she’d left right there. Then, having found a notebook (not the one she’d used yesterday), and staring in stunned amazement at the illegible chicken scratchings therein, she would finally settle down to jab at elusive characters and oil creaky plots. Most astonishing, Curran discovers that for all her assured skewering of human character in a finished novel, sometimes when Christie started her books, even she didn’t know who the murderer was. Ah! It makes sense—a brilliant mystery writer must first experience the mystery! Or does it?
— The Mystery of the Messy Notebooks
-
2012-03-28
A recent review of Jonah Lehrer’s new book, Imagine, delves into how even a science writer as skilled and technically trained as Lehrer can fall into the trap of letting the narrative run over the science. (Disclaimer: I haven’t read the book.)
If dubious interpretations of scientific data appeared only once in Imagine, it might be a worrisome fluke; but they appear multiple times, which is cause for real concern. Lehrer steps over the line again when connecting amphetamine use to creativity. He states that “Because the dopamine neurons in the midbrain are excited…the world is suddenly saturated with intensely interesting ideas.” Such definitive statements imply that neuroscience has already charted a causal course from neurotransmitter chemistry to a complex cognitive process — which simply isn’t true. That it should have come from a writer who so clearly has the ability to write about science critically and intelligently still comes as a bit of a surprise.
We need good translators of science to the general public, and Lehrer has the public’s ear and the public’s trust. He is at his best when putting his considerable talents to the task of telling a story that is true according to the facts as we know them, rather than telling a story people want to hear.
Lehrer actually studied neuroscience in grad school and is fully capable of grasping the technical nuances of the research in the field. That even he slips is perhaps why practicing scientists shy away from writing “popular science”: they are scared of giving up the fine chisel of the lab for the broad strokes of prose.
-
2012-03-27
And yet writing stories is one of the most assertive things a person can do. Fiction is an act of willfulness, a deliberate effort to reconceive, to rearrange, to reconstitute nothing short of reality itself. Even among the most reluctant and doubtful of writers, this willfulness must emerge. Being a writer means taking the leap from listening to saying, ‘Listen to me.’
— Trading Stories, by Jhumpa Lahiri
-
2012-03-25
McWay Falls and Cove, California Coast.
-
2012-03-24
Computational thinking
In “Why coding is not the new literacy”, the author argues:
Coding is not the new literacy because it will never be a requirement that every man, woman and child must know how to code in order to communicate fully. It is a function of the development of communication, not its application. As such, it is more true to compare coding to skills such as book-binding, or newspaper printing. It is sufficient for a relatively small proportion of the population to understand it in order for its benefits to accrue.
I think this is a shortsighted view to take. What qualifies as a “literacy” has drastically changed throughout history. The ancient Greeks considered oratory skills such as rhetoric and debating to be the foundation of their education. When writing came along, they resisted, thinking that committing thoughts to tablets would weaken their minds.
At the same time, I do realize that “coding is the new literacy” has become a cliché. But if you take a broader view, and think not about coding per se but about the thought processes around it, the concept becomes much more powerful. Thinking computationally, even about problems and domains that don’t involve coding, is rapidly becoming an essential skill, and a mode of thinking one needs in order to understand the modern world.
Jeannette Wing captures this reasoning perfectly in her description of computational thinking:
Computational thinking is a fundamental skill for everyone, not just for computer scientists. To reading, writing, and arithmetic, we should add computational thinking to every child’s analytical ability. We have witnessed the influence of computational thinking on other disciplines. For example, machine learning has transformed statistics. Statistical learning is being used for problems on a scale, in terms of both data size and dimension, unimaginable only a few years ago… Computational biology is changing the way biologists think. Similarly, computational game theory is changing the way economists think; nanocomputing, the way chemists think; and quantum computing, the way physicists think. This kind of thinking will be part of the skill set of not only other scientists but of everyone else.
-
2012-03-20
Themes: Language
(Continuing a series where I look back on themes that have emerged in this blog.)
And another thing I’m fascinated by is language and etymology.
- English words lifted from Hindi
- Nouns as verbs, verbs as nouns
- The 18th Century Origins of Lolcat-speak
- Parsing “and/or”
- Englsh Wtht Vwls
- Upsight
-
2012-03-19
Showing it
Or is it possible he just loves programming? No doubt the filmmakers considered this option, but you can see their dilemma: how to convey the pleasure of programming—if such a pleasure exists—in a way that is both cinematic and comprehensible? Movies are notoriously bad at showing the pleasures and rigors of art-making, even when the medium is familiar.
(From Zadie Smith’s essay, Generation Why)
This is a problem that programmers often face. How exactly can they convey the sheer awesomeness of the stuff they create? Painters can point to their paintings. Mechanical engineers can point to shiny metallic objects. What can programmers show? Pages of incomprehensible code?
This problem is even more acute for programmers deep in the backend, those who hack infrastructure. At least programmers closer to the product and the frontend can point to shiny UIs and things happening on the screen and say “I made that!”
-
Themes: The Economics of Programming
I’ve been looking at themes that have emerged in my posts.
Another theme is the economics of programming:
- Why developers should learn the economics of code
- The 0.1x developer
- The CS assignment I wish I had
- Innovator’s dilemma in programming languages
-
2012-03-18
Themes: Modern Work
I’ve noticed a few themes emerging in my posts over the last couple of years. I didn’t plan to focus on those themes, but when I look back I notice clusters. I want to collect posts along each common theme.
One of the more prominent themes is the nature of modern work, and how it relates to Taylorism:
- Taylorism in the modern tech industry
- Engineering is all about failure
- Minimalism is not a viable intellectual strategy
- The programming assembly line
-
2012-03-16
Blue Java
(Follow-on to the last post)
There’s another interesting reason for using a specific language, but it becomes a factor very rarely: indirect competition. This is when a company chooses and promotes a language primarily to beat back a competitor.
This happened in the 90s when IBM decided to throw all its weight behind Java. Microsoft was ascendant, and so was its developer stack. IBM was terrified that it would end up in a position where it would have to use (and license!!) Microsoft’s languages and compilers, and be shackled by the APIs of Redmond.
Along came Java, and from a company whose CEO at the time was a sworn enemy of Redmond, no less. It was a godsend for IBM, which made the strategic decision to back it.
Java’s rise, particularly inside the enterprise, had a lot to do with Big Blue backing it over the next decade.
-
2012-03-15
Innovator’s dilemma in programming languages
(Follow-on to this post.)
What is the job that programming languages are hired to do?
Here is a partial list of criteria for picking a programming language:
- Technical: the language is superior along some technical dimension. It might have great support for parallel message-passing applications, for example. Unfortunately, most programming language debates stay stuck here.
- Economic: you run large applications on humongous numbers of machines, which cost a gargantuan amount of money. You would very much like to not waste any of that hardware, utilize it as much as possible, and reduce the marginal expenditure when the size of your application, or the number of its users, increases.
- Business: your code must react very quickly to changing markets, and new customer requirements. Maybe the requirements are not clear, and the only way to learn is to release something, see how it does, and then adjust your product. Agility, and reducing the time it takes to implement a change, are of paramount importance, or else you will simply be squeezed out by more nimble players.
A large company with acres of data centers and a small startup renting VMs over the net will “hire” languages to perform very different jobs.
Let me propose a new dividing line for programming languages:
- Sustaining languages: these are the ones commonly thought of as “large-scale industrial languages.” They stress efficient use of hardware resources (CPU, RAM) over personal productivity and succinctness. They are typically used for “stable” applications, where the market and requirements are well-known and slow-changing [1]. They have been around for a long time. However, a continuous stream of incremental improvements has sustained their position.
- Disruptive languages: the new kids and upstarts. They stress agility of expression, succinctness, and individual productivity, and are willing to use resources inefficiently to do so. They are usually looked down upon by the establishment (i.e. those who work with sustaining languages). They address a very different market. They are typically used at small scale, and for products whose requirements are not yet known, and will only be known by releasing something, getting user feedback, and iterating.
Currently, C++ and Java are examples of the first, and Python and Ruby are examples of the second.
It is no surprise that C++ and Java came out of large companies, and Python and Ruby came out of individuals scratching an itch.
I call them disruptive languages because, like Christensen’s disruptive technologies, they address a market not catered to by the establishment, and are considered to address the “lower end” of the market.
But disruptive technologies are disruptive because they move “upmarket” and dislodge the incumbents. How might that happen with languages?
What is the tipping point at which a small company should consider re-engineering its software? In other words, when should it stop spending hardware on people, and start spending people on hardware? That starts to happen when hardware costs begin to approach people costs. For a small startup, that might happen when hardware costs cross the annual salary of one developer (or a few). If you look at AWS prices, you’ll see that you can rent a lot of hardware for that much.
But the cost of hardware has a clear trajectory over time: downwards. This means that the aforementioned tipping point will get pushed further out over time. That in turn means that the scale (size of data, number of users, or whatever metric is relevant) up to which you can economically use disruptive languages is increasing over time.
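To make the arithmetic concrete, here is a minimal sketch of that break-even calculation. Every number in it (the developer cost, the VM price, the rate of price decline) is a hypothetical placeholder, not real salary or AWS data:

    # Minimal sketch of the "people vs. hardware" tipping point.
    # All numbers are hypothetical placeholders, not real prices.

    HOURS_PER_YEAR = 24 * 365

    def break_even_vms(developer_cost, vm_hourly_rate):
        """Number of always-on VMs whose yearly rent equals one developer-year."""
        return developer_cost / (vm_hourly_rate * HOURS_PER_YEAR)

    if __name__ == "__main__":
        developer_cost = 120_000.0  # hypothetical annual cost of one developer
        vm_rate = 0.25              # hypothetical hourly price of one rented VM

        # Break-even today: below this fleet size, trading hardware for
        # programmer productivity is the cheaper option.
        print(f"Today: ~{break_even_vms(developer_cost, vm_rate):.0f} VMs")

        # If hardware prices fall 20% a year while salaries stay flat,
        # the tipping point moves further out every year.
        for year in range(1, 6):
            rate = vm_rate * (0.8 ** year)
            print(f"Year {year}: ~{break_even_vms(developer_cost, rate):.0f} VMs")

Under these made-up numbers, one developer-year buys roughly 55 always-on VMs today, and roughly three times that in five years. That widening window is exactly the sense in which disruptive languages can move upmarket.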
-
[1] There is a single-digit number of companies in the world that operate at a scale where even experimental products with wildly changing requirements need to use sustaining languages. That is a whole different story.