Webster & Associates LLC

We can help.

So long, Steve, and Godspeed.

By bfwebster on Oct 5, 2011 in Books, Intellectual property, Main, Technology | 0 Comments

The second personal computer I ever owned[1] was an Apple II, with no floppy drive. I bought it, along with a small color TV, from my close friend Robert Trammel while we were both living in Houston sometime around 1980. We had already spent hours together programming on it, then carefully (though not always successfully) saving our programs out to cassette tape. After three months, I sold the computer and TV back to Robert — not because I didn’t like it, but because I was spending far too much time on it.

A few years later — in 1982 — my close friend Wayne Holder hired me into his nascent software company, Oasis Systems, in part to help with his existing and planned word processing utilities (The Word Plus, Punctuation + Style), but mostly to develop computer games. And we did, developing Sundog: Frozen Legacy on the Apple II, a game for which I still get e-mails (and which Wayne is even now working on resurrecting for modern platforms). In January 1984, a few months before Sundog shipped, we were invited by Guy Kawasaki to come up to Apple to see a preview of the Mac and to talk about what software we could port to the Mac. Through my connections with computer stores in San Diego, I was able to get a personal loan of a Mac for a few days at home prior to the official announcement in Cupertino later that month, which Wayne and I attended as well. That was my first time seeing Steve Jobs in person, and it remains a memorable highlight of my professional life.

When the Mac shipped a few days later, I went down to the one computer store in San Diego that I knew would be getting machines from Apple. I took $3000 in cash with me and managed to convince the store owner — a friend — to let me have one of the three Macs he had to sell. Through a connection with Phil Lemmons — editor-in-chief at BYTE — I ended up writing the official BYTE review of the 128K Macintosh (August 1984 issue). By the end of 1984, I was writing full-time for BYTE, including ongoing coverage of the Macintosh, particularly once my BYTE column started in mid-1985. After a few years of writing for BYTE, I switched to writing for Macworld magazine. Steve was now long gone from Apple, and Apple was having problems of its own.

But in late 1987, I was contacted by Addison-Wesley. They were interested in having me write a book about Steve Jobs’ new project at NeXT. Folks at NeXT had apparently suggested me to Addison-Wesley, probably due to my writing at BYTE and Macworld. I leapt at the opportunity, particularly since it coincided with our family moving from Utah to just outside Santa Cruz (where I would be doing technical writing for Borland on a consulting basis). Once there, I found myself invited to visit NeXT HQ on Deer Creek Road, sit in on meetings, and attend the 0.3 NeXTstep Dev Camp. And, yes, that meant getting actual face time with Steve Jobs as well — not a lot, but this was a man whose creations had been impacting my personal and professional life for over a decade at that point.

The writing of the book dragged out as I waited to get my hands on an actual NeXT cube, which finally happened (if I recall correctly) at the end of 1988 or early 1989. I wrote the first several drafts of the book on that NeXT cube itself. The book came out in the fall of 1989; it remains the single most successful book I’ve ever written, due to the intense interest in NeXT itself, more than any particular writing skills or technical insight on my part.

The following year, I found myself working with a world-class typographer (Mike Parker) and graphic designer (Vic Spindler) to create a design-oriented desktop publishing system. I was doing all the software prototyping on my NeXT cube, and we made the decision to make the NeXT our first target platform. For five years — 1990 to 1995 — I served as chief architect and CTO at Pages Software Inc, where we developed Pages by Pages and then WebPages, while spending nearly two years just trying to raise venture funding. We closed on funding at the start of 1992 and shipped our first version of Pages in early 1994. We quickly sold all that we were going to in the all-too-small NeXTstep market. My frustrations at seeing larger firms try to leverage off of NeXT’s incredible innovations led to an op-ed piece in the November 1994 issue of BYTE, “Whither NextStep?” The day that issue came out was the last time that Steve Jobs and I spoke — he called me from the back of a car somewhere to ask me what the hell I was doing writing that. I said I was telling the truth. Pages would close its doors the next year, unable to secure additional funding to move its technology to Windows.

When Steve engineered his brilliant reverse takeover of Apple — getting Apple to buy NeXT for $400 million, then slowly moving himself into the CEO seat — I was not optimistic. I still had unconditional praise for the NextStep technology, but I was dubious about Steve’s ability to sell technology to markets and to compete with Microsoft.

Boy, was I wrong. I was not only wrong about his abilities at Apple, I was wrong in my BYTE article about NextStep being on a downward slope. NextStep, of course, was the foundation of Mac OS X, and Steve transformed Apple into the most-admired, most-imitated, and most-valuable company in the world. And I was tickled that, when Apple brought out its own word processor, it was named “Pages”. Steve had always liked that name when we were developing (and shipping) our own product years before; glad he was able to use it.

To quote John Perry Barlow over on FB, “The world is suddenly a less interesting place.” ..bruce w..

[1] The first was an HP-67 card-reading programmable calculator.

[Cross-posted from And Still I Persist]

The need for e-discovery tools

By bfwebster on Mar 5, 2011 in Lawsuits, Main, Management, Technology | 0 Comments

John Markoff in the New York Times has a detailed article on the emergence of software tools to help in the analysis of electronic documents (and, one could easily presume, scanned and OCR’d physical documents) in litigation:

Now, thanks to advances in artificial intelligence, “e-discovery” software can analyze documents in a fraction of the time for a fraction of the cost. In January, for example, Blackstone Discovery of Palo Alto, Calif., helped analyze 1.5 million documents for less than $100,000.

Some programs go beyond just finding documents with relevant terms at computer speeds. They can extract relevant concepts — like documents relevant to social protest in the Middle East — even in the absence of specific terms, and deduce patterns of behavior that would have eluded lawyers examining millions of documents.

“From a legal staffing viewpoint, it means that a lot of people who used to be allocated to conduct document review are no longer able to be billed out,” said Bill Herr, who as a lawyer at a major chemical company used to muster auditoriums of lawyers to read documents for weeks on end. “People get bored, people get headaches. Computers don’t.”

Since I work in litigation support myself and have to search and read documents — I’ve had individual cases with over a million pages of documents — I welcome such tools as these. Still, the overall theme in the article seems to be that humans will lose jobs due to these tools:

Quantifying the employment impact of these new technologies is difficult. Mike Lynch, the founder of Autonomy, is convinced that “legal is a sector that will likely employ fewer, not more, people in the U.S. in the future.” He estimated that the shift from manual document discovery to e-discovery would lead to a manpower reduction in which one lawyer would suffice for work that once required 500 and that the newest generation of software, which can detect duplicates and find clusters of important documents on a particular topic, could cut the head count by another 50 percent.
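
To give a sense of what “detect duplicates” means at the simplest possible level, here is a deliberately trivial Python sketch of my own; real e-discovery platforms go much further (near-duplicate detection, clustering, concept extraction), and the folder name below is just a placeholder:

```python
# Trivially simple illustration of one piece of what such tools do:
# flagging exact-duplicate documents by hashing their contents.
# Real platforms also handle near-duplicates and clustering.
import hashlib
from collections import defaultdict
from pathlib import Path

def find_exact_duplicates(folder: str):
    groups = defaultdict(list)
    for path in Path(folder).rglob("*"):
        if path.is_file():
            digest = hashlib.sha256(path.read_bytes()).hexdigest()
            groups[digest].append(path)
    # Any digest seen more than once means byte-for-byte identical files.
    return [paths for paths in groups.values() if len(paths) > 1]

if __name__ == "__main__":
    for group in find_exact_duplicates("./document_production"):  # hypothetical folder
        print("Duplicates:", *group)
```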

What the article does not address is the enormous financial burden that e-discovery has imposed on both parties in modern civil litigation (see also here and here).

It used to be that each side would gather physical documents from files and storage boxes, review them for responsiveness, then produce them to the other side. Outside of large corporations, most parties would not have great volumes of such documents.

However, with the advent and pervasiveness of digital technology — computers, e-mail, servers, instant messaging, and so on — even a relatively small firm or organization can find itself with gigabytes, if not terabytes, of potentially relevant production. And, to avoid court sanctions, all of that has to be combed through and reviewed for responsiveness. Most organizations below a certain size just can’t afford to do that the old-fashioned way with “a platoon of lawyers and paralegals who [work] for months at high hourly rates”.

The article also does not address the existing document organization and search tools that are a bit less artificially intelligent — and a lot cheaper — than the tools Markoff covers, but that still allow complex search terms to be set up. I make heavy use of dtSearch in most of the cases I work on (and can recommend it highly). I’ve also used such standard legal document management tools as Summation and Concordance.
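
For readers curious what “complex search terms” look like under the hood, here is a minimal, purely illustrative Python sketch of a boolean keyword search over a folder of already-extracted text files. It is not how dtSearch works internally (commercial tools add indexing, stemming, and proximity operators), and the folder and terms below are hypothetical:

```python
# Minimal sketch: a crude boolean keyword search over a folder of
# already-extracted text files. Purely illustrative; real tools add
# indexing, stemming, proximity operators, and far better performance.
from pathlib import Path

def matches(text: str, all_terms, any_terms=(), none_terms=()) -> bool:
    """True if text contains every 'all' term, at least one 'any' term
    (if given), and none of the excluded terms. Case-insensitive."""
    t = text.lower()
    return (all(term in t for term in all_terms)
            and (not any_terms or any(term in t for term in any_terms))
            and not any(term in t for term in none_terms))

def search(folder: str, **query):
    for path in Path(folder).rglob("*.txt"):   # assumes OCR'd/exported .txt files
        try:
            text = path.read_text(errors="ignore")
        except OSError:
            continue
        if matches(text, **query):
            yield path

if __name__ == "__main__":
    # Hypothetical query: documents mentioning both parties, excluding privileged material.
    for hit in search("./production",
                      all_terms=["acme", "contract"],
                      none_terms=["attorney-client"]):
        print(hit)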

But, wait! (More on SSDs and e-discovery)

By bfwebster on Mar 1, 2011 in Lawsuits, Main, Risk management, Technology | 0 Comments

OK, just last week I wrote a post on a report out of UCSD regarding difficulties in erasing data from solid-state disks (SSDs). But now out of Australia comes a somewhat contradictory report that some SSDs actually erase ‘slack’ (unused) space themselves without any user or system intervention — and, furthermore, that they can do so even when the drive is disconnected from a computer and a ‘write blocker’ is installed:

After conducting a series of experiments comparing a sample Corsair 64GB SSD with a conventional Hitachi 80GB magnetic hard drive (HDD), the team found a layer cake of data recovery problems caused by the ‘garbage collection’ or purging algorithms used in SSDs to keep them at peak performance.

After examining an SSD for traces of data after it had been quick formatted, the team expected the purging routines to kick in around 30-60 minutes later, a process that must happen on SSDs before new data can be written to those blocks. To their surprise, this happened in only three minutes, after which only 1,064 out of 316,666 evidence files were recoverable from the drive.

Going a stage further, they removed the drive from the PC and connected a ‘write blocker’, a piece of hardware designed to isolate the drive and stop any purging of its contents. Incredibly, after leaving this attached for only 20 minutes, almost 19 percent of its files had been wiped for good, a process the researchers put down to the ability of SSDs to initiate certain routines independent of a computer.

For comparison, on the equivalent hard drive all data was recoverable, regardless of the time elapsed, as a forensic examiner would expect.

“Even in the absence of computer instructions, a modern solid-state storage device can permanently destroy evidence to a quite remarkable degree, during a short space of time, in a manner that a magnetic hard drive would not,” the team concludes.

The results are concerning on a number of levels: forensic, legal, and technical.

Yeah, I’ll say, though I suspect most of the concerns are legal and forensic. Hat tip to Slashdot.  ..bruce..
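
For anyone who wants an intuition for why this happens, here is a toy model of my own, not the researchers’ procedure; the block counts and purge rates are invented. It contrasts a drive whose firmware garbage-collects unused blocks on its own with one that leaves old data in place until something overwrites it:

```python
# Toy model (my own simplification, not the researchers' methodology):
# after a quick format, an SSD's firmware may garbage-collect blocks the
# file system has marked unused, while an HDD leaves old data in place
# until something overwrites it.
import random

class Drive:
    def __init__(self, n_blocks, purges_unused):
        self.data = [f"file-{i}" for i in range(n_blocks)]
        self.unused = set()               # blocks the file system no longer references
        self.purges_unused = purges_unused

    def quick_format(self):
        self.unused = set(range(len(self.data)))  # metadata gone, contents still there

    def idle_tick(self):
        # SSD firmware runs garbage collection on its own; the HDD does nothing.
        if self.purges_unused:
            for i in random.sample(sorted(self.unused), k=len(self.unused) // 3):
                self.data[i] = None       # physically erased, unrecoverable

    def recoverable(self):
        return sum(d is not None for d in self.data)

ssd, hdd = Drive(300_000, True), Drive(300_000, False)
for drive in (ssd, hdd):
    drive.quick_format()
    for _ in range(5):                    # a few minutes of idle time, even behind a write blocker
        drive.idle_tick()
print("SSD recoverable blocks:", ssd.recoverable())
print("HDD recoverable blocks:", hdd.recoverable())
```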

 

“The Information: A History, a Theory, a Flood” — a review by Freeman Dyson

By bfwebster on Feb 27, 2011 in Main, Surviving Complexity, Technology | 0 Comments

Freeman Dyson and James Gleick are both authors always worth reading. In this case, Dyson has reviewed Gleick’s newest book in an essay titled “How We Know”. An excerpt:

According to Gleick, the impact of information on human affairs came in three installments: first the history, the thousands of years during which people created and exchanged information without the concept of measuring it; second the theory, first formulated by Shannon; third the flood, in which we now live. The flood began quietly. The event that made the flood plainly visible occurred in 1965, when Gordon Moore stated Moore’s Law. Moore was an electrical engineer, founder of the Intel Corporation, a company that manufactured components for computers and other electronic gadgets. His law said that the price of electronic components would decrease and their numbers would increase by a factor of two every eighteen months. This implied that the price would decrease and the numbers would increase by a factor of a hundred every decade. Moore’s prediction of continued growth has turned out to be astonishingly accurate during the forty-five years since he announced it. In these four and a half decades, the price has decreased and the numbers have increased by a factor of a billion, nine powers of ten. Nine powers of ten are enough to turn a trickle into a flood.

I’ve lived through most of the information age (if we peg it at 1940 or so; I was born in 1953), and I’ve worked in information technology for almost 40 years (since 1974). I remember seeing a second megabyte of core memory that was being added to the IBM 360/65 on campus in the mid-1970s: it was the size of a giant shoebox, and it cost $250,000. Now the cost per megabyte for 1GB RAM SIMMs sold over Amazon is as low as $0.02, and there’s 1,024 megabytes on that little stick. I have something approaching 15 terabytes of hard disk storage on the various computers in our house, and — as per Dyson’s review and Gleick’s book — my biggest challenge is keeping it organized and accessible. There’s a high level of redundancy, with numerous minor (or major) variations; at the same time, I fear that a disk failure will forever deprive me of files that I might someday want or need again, whether that’s tomorrow or ten years from now.
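
For what it’s worth, Dyson’s arithmetic checks out. Here is a quick back-of-the-envelope calculation in Python, assuming (as the quote does) a doubling every eighteen months:

```python
# Back-of-the-envelope check of the arithmetic above (doubling assumed
# every 18 months, per the quoted form of Moore's Law).
doubling_months = 18

per_decade = 2 ** (120 / doubling_months)         # ~102x per decade ("a factor of a hundred")
per_45_years = 2 ** (45 * 12 / doubling_months)   # 2**30, about 1.07e9 ("nine powers of ten")
print(f"Growth per decade:  {per_decade:.0f}x")
print(f"Growth over 45 yrs: {per_45_years:.2e}x")

# And the memory-price comparison from the paragraph above:
core_1970s_per_mb = 250_000   # ~$250,000 for one megabyte of core memory
simm_today_per_mb = 0.02      # ~$0.02 per megabyte on a 1 GB SIMM
print(f"Price drop per megabyte: roughly {core_1970s_per_mb / simm_today_per_mb:,.0f}x")
```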

I’ll read the book when it comes out (I have a copy preordered from Amazon); in the meantime, Dyson’s review is well worth reading.

 

E-discovery and solid-state drives (SSDs)

By bfwebster on Feb 24, 2011 in Lawsuits, Main, Risk management, Technology | 1 Comment

E-discovery — the recovery, analysis, and production of evidence stored in digital form on various media — has become a major issue in litigation because of how much data simple devices can hold and the resulting duplication and multiplication of documents, files, and other digital types of evidence. Because of the risks and costs of e-discovery in litigation, many organizations have established written policies, often backed by automated tools, to erase and otherwise dispose of electronic files past a certain date or when no longer needed (absent pending relevant litigation, that is).
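
The “automated tools” in question need not be elaborate. Here is a minimal sketch of the kind of retention sweep I mean; the paths and the seven-year window are hypothetical, and a real policy engine would also need audit logging and legal review of litigation holds:

```python
# Minimal sketch of an automated retention sweep. Paths, the 7-year
# window, and the hold list are all hypothetical; a real tool must also
# handle audit logging and formally managed litigation holds.
import time
from pathlib import Path

RETENTION_DAYS = 7 * 365             # assumed policy: delete files older than ~7 years
LITIGATION_HOLD = {"projects/acme"}  # assumed: subtrees frozen by pending litigation

def expired(path: Path, now: float) -> bool:
    age_days = (now - path.stat().st_mtime) / 86400
    return age_days > RETENTION_DAYS

def sweep(root: str, dry_run: bool = True):
    now = time.time()
    for path in Path(root).rglob("*"):
        if not path.is_file():
            continue
        rel = path.relative_to(root).as_posix()
        if any(rel.startswith(hold) for hold in LITIGATION_HOLD):
            continue                  # never touch material under a hold
        if expired(path, now):
            print(("WOULD DELETE " if dry_run else "DELETING ") + rel)
            if not dry_run:
                path.unlink()

if __name__ == "__main__":
    sweep("/srv/fileshare", dry_run=True)   # hypothetical file share
```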

At the same time, classic rotating storage devices — such as platter-based hard disk drives — are slowly being supplemented and in some cases supplanted by solid-state media, that is, storage devices with no moving parts at all. In particular, solid-state disks (SSDs) look and behave just like classic hard disk drives, but usually with faster response times and a much lower chance of hardware failure. Meanwhile, USB drives have become ubiquitous as a mechanism for moving files between unconnected computers, and laptops are starting to offer SSDs as the principal internal storage for both speed and weight improvements.

So far, so good. Except that out of the University of California at San Diego comes research that shows that in many cases, standard file- and disk-erasing techniques leave behind recoverable data when used on SSDs:

At the Non-volatile Systems Laboratory we have designed a procedure to bypass the flash translation layer (FTL) on SSDs and directly access the raw NAND flash chips to audit the success of any given sanitization technique. Our results show that naïvely applying techniques designed for sanitizing hard drives on SSDs, such as overwriting and using built-in secure erase commands is unreliable and sometimes results in all the data remaining intact. Furthermore, our results also show that sanitizing single files on an SSD is much more difficult than on a traditional hard drive. We are working on designing new FTLs that correct these issues and also exploit properties of flash memory to maintain performance while sanitizing the flash drive.

I imagine that word of this will quickly spread to companies that perform computer forensics (including recovery of data from storage devices). In the meantime, many organizations may continue to make use of USB drives and internal SSDs in laptops (and, eventually, desktops), and by so doing may leave themselves open to discovery of data that they thought had been purged in the course of normal, documented operations.
