Evolving Systems
July 12-13, 2005
Miami, Florida


Index to Papers

The URLs are offered for reference to the original web-based documents, or to documents available on the TTI/Vanguard web site. The page length may vary slightly, based on your browser settings.

 

The Free Lunch Is Over: A Fundamental Turn Toward Concurrency in Software by Herb Sutter

Speaker Herb Sutter explains why we have reached a fundamental turning point in software development, describes the changing face of hardware, and shows why concurrency in software is the only way to take advantage of hyperthreading and multicore architectures.
http://www.gotw.ca/publications/concurrency-ddj.htm (10 pages)
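
The article itself is prose, not code. Purely as a hedged illustration of the shift it describes, here is a minimal modern C++ sketch (our example, not Sutter's; it assumes the later std::thread facility, which postdates the article) that splits a data-parallel summation across however many hardware threads the machine reports:

    #include <algorithm>
    #include <iostream>
    #include <numeric>
    #include <thread>
    #include <vector>

    int main() {
        std::vector<double> data(1000000, 1.0);   // stand-in workload
        unsigned n = std::max(1u, std::thread::hardware_concurrency());
        std::vector<double> partial(n, 0.0);
        std::vector<std::thread> workers;

        // Each worker sums one contiguous slice; partial results are
        // combined on the main thread after all workers join.
        for (unsigned i = 0; i < n; ++i) {
            workers.emplace_back([&, i] {
                size_t chunk = data.size() / n;
                size_t begin = i * chunk;
                size_t end = (i + 1 == n) ? data.size() : begin + chunk;
                partial[i] = std::accumulate(data.begin() + begin,
                                             data.begin() + end, 0.0);
            });
        }
        for (auto& w : workers) w.join();
        std::cout << std::accumulate(partial.begin(), partial.end(), 0.0) << "\n";
    }

On a single core the partitioning only adds overhead; the speedup Sutter anticipates appears only when the slices actually run on separate cores, which is the sense in which the free lunch is over.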

The Future of Evolutionary Biology by Richard Lenski

In this paper honoring the legendary evolutionary biologist Ernst Mayr, speaker Richard Lenski muses about the future of evolutionary biology. He notes that two revolutions are on the horizon: molecular genetics and the emergence of artificial life.
http://www.ttivanguard.com/miamireconn/EvolutionaryBiology.pdf (34 pages)

 

Testing Darwin

Who are the people who run the Digital Evolution Laboratory at Michigan State, and what are they up to? They’re a crew of computer scientists and biologists (including speaker Richard Lenski) who use the computer program Avida to watch new digital life forms evolve and grow in complexity.
http://www.carlzimmer.com/articles/2005/articles_2005_Avida.html (9 pages)
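
For readers new to digital evolution, the following toy loop (ours, not Avida's; Avida evolves self-replicating programs, whereas this sketch merely mutates and selects fixed-length bit-string genomes) shows the mutation-plus-selection dynamic the article describes, in C++:

    #include <algorithm>
    #include <bitset>
    #include <iostream>
    #include <random>
    #include <vector>

    int main() {
        std::mt19937 rng(42);
        std::vector<std::bitset<32>> pop(100);
        for (auto& g : pop) g = rng();               // random initial genomes

        // Toy fitness: the number of 1-bits in the genome.
        auto fitness = [](const std::bitset<32>& g) { return g.count(); };

        std::uniform_int_distribution<size_t> pick(0, pop.size() - 1);
        std::uniform_int_distribution<int> locus(0, 31);
        for (int gen = 0; gen < 200; ++gen) {
            std::vector<std::bitset<32>> next;
            for (size_t i = 0; i < pop.size(); ++i) {
                // Tournament selection: copy the fitter of two random genomes,
                // then flip one bit (a point mutation) in the offspring.
                auto a = pop[pick(rng)], b = pop[pick(rng)];
                auto child = fitness(a) > fitness(b) ? a : b;
                child.flip(locus(rng));
                next.push_back(child);
            }
            pop = std::move(next);
        }
        auto best = *std::max_element(pop.begin(), pop.end(),
            [&](const auto& x, const auto& y) { return fitness(x) < fitness(y); });
        std::cout << "best fitness after 200 generations: " << fitness(best) << "\n";
    }

Selection steadily enriches 1-bits while mutation keeps injecting variation; Avida's organisms face the much richer task of executing and copying their own instruction code.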

Scientists Catch Quick Darwinism by Robert Boyd

A look at the world of experimental evolution, which, beyond helping to explain the process of evolution, has practical goals in medicine, agriculture, manufacturing, and the environment.
http://www.signonsandiego.com/uniontrib/20041006/news_1c6evolve.html (3 pages)

 

Transitions from Nonliving to Living Matter by Steen Rasmussen, Liaohai Chen, David Deamer, David Krakauer, Norman Packard, Peter Stadler, and Mark Bedau

The authors summarize research presented at two workshops concerning how simple life forms (artificial cells or protocells) could be synthesized in the laboratory.
http://www.ees.lanl.gov/EES5/staff/steen/papers/963.pdf (3 pages)

 

Life Built to Order by Michael Stroh

A detailed profile of speaker Steen Rasmussen and his work to create a brand-new life form. His protocell will be created from inanimate molecules and will be thousands of times smaller than a typical bacterium.
http://www.popsci.com/popsci/medicine/article/0,20967,1014147,00.html (4 pages)

 

Anatomy of Give and Take by Robert Lee Hotz

The burgeoning field of neuroeconomics is explored as two women, each inside a brain scanner at Baylor College of Medicine, play a game involving trusting each other for mutual financial gain.
http://www.hnl.bcm.tmc.edu/cache/la-sci-brain18mar18,1,429153,print.story.htm (7 pages)

 

Five Questions with Dr. P. Read Montague

Are humans hard-wired to trust one another? Yes, says Dr. P. Read Montague, director of the Human Neuroimaging Laboratory at Baylor College of Medicine. He describes experiments involving trust as subjects are simultaneously put into special MRI scanners at Baylor and Caltech.
http://www.hnl.bcm.tmc.edu/cache/HouChronTrust.htm (3 pages)

 

E-mail Authentication. Then What? by Dave Anderson

As organizations increasingly depend on the reliable delivery of their outbound e-mail and the protection of their domain name, speaker Dave Anderson describes the progress being made to authenticate e-mail.
http://news.com.com/E-mail+authentication.+Then+what/2010-1071_3-5629318.html?tag=nefd.ac (2 pages)

 

Supreme Court Mulls File-swap “Pushers” by John Borland

Recently, the US Supreme Court heard arguments in the case of MGM vs. Grokster. At stake: the legal status of peer-to-peer networks and whether file-swapping technology that can circumvent copyright should be outlawed.
http://news.com.com/Top+court+mulls+P2P+pushers/2100-1027_3-5656010.html (3 pages)

 

Court Nixes “Broadcast Flag” by Michael Grebb

A recent US Court of Appeals decision invalidates a Federal Communications Commission order that would have required makers of consumer electronic devices capable of receiving broadcast digital TV signals to recognize a broadcast flag.
http://www.wired.com/news/politics/0,1283,67447,00.html?tw=newsletter_topstories_html (3 pages)

 

The Beeb Shall Inherit the Earth by Cory Doctorow

This article looks at how the BBC, through its creation of BBC Backstage and the Creative Archive, is embracing the digital future rather than fighting it.
http://www.wired.com/news/print/0,1294,67552,00.html (3 pages)

 

FAQ: How Real ID Will Affect You by Declan McCullagh

On May 10, the US Congress passed the Real ID Act, which calls for the creation of electronically readable, federally approved ID cards for Americans. Does this amount to a national identity card?
http://news.com.com/FAQ+How+Real+ID+will+affect+you/2100-1028-5697111.html?part=dht&tag=npro&tag=nl.e433 (3 pages)

 

Public Eyes, Private Eyes by Cynthia L. Cooper

Function creep and the right to be left alone are some of the privacy concerns of the Electronic Privacy Information Center, profiled in this article.
http://www.fordfound.org/publications/ff_report/view_ff_report_detail.cfm?report_index=378 (5 pages)

 

Comments on Richard Epstein's Liberty vs. Property by Solveig Singleton

Speaker Solveig Singleton suggests that because technology has made IP increasingly hard to enforce and easy to break, people’s expectations of what is fair and right have gotten out of whack. Getting them back might require starting over from scratch. (For Richard Epstein’s paper, “Liberty Versus Property? Cracks in the Foundations of Copyright Law”, please see http://ipcentral.info/review/v1n1epstein.pdf.)
http://ipcentral.info/review/v1n1singleton.pdf (6 pages)

 

Identity Thieves’ New Ploy: “Pharming” by Dan Lee

A look at the latest Internet scam, pharming, in which users are redirected from a legitimate web site to a fraudulent copy of the site without any warning signs.
http://www.siliconvalley.com/mld/siliconvalley/news/local/11324938.htm (2 pages)

 

Research: Spyware Industry Worth Billions by Matt Hines

The number of computers infected with spyware applications remains high despite the growing realization that spyware can be a major threat to personal and business security.
http://news.com.com/Research+Spyware+industry+worth+billions/2100-1029-5693730.html?part=dht&tag=ntop&tag=nl.e433 (2 pages)

 

Ethan Zuckerman: The Worldchanging Interview

In this extensive interview, speaker Ethan Zuckerman argues that digital democracy and new media tools will have to undergo profound changes to make a difference in the developing world.
http://www.worldchanging.com/archives/001098.html (17 pages)

 

An Open-source Call To Arms by Bruce Perens

Speaker Bruce Perens argues that organizations need to worry about legal protection issues when using open-source software. The cost of defending copyright suits usually means that victory goes to whoever has the deepest pockets.
http://news.com.com/An+open-source+call+to+arms/2010-7344_3-5221365.html (3 pages)

 

The Firefox Explosion by Josh McHugh

Firefox has surpassed 50 million downloads. This article looks at the history, rationale, and beginnings of Firefox, along with the contributions of the Mozilla Foundation, Ben Goodger, and speaker Blake Ross.
http://www.wired.com/wired/archive/13.02/firefox.html (6 pages)

 

Firefox Architect Talks IE, Future Plans by Nate Mook

Firefox creator Blake Ross talks about Microsoft’s Internet Explorer, Firefox as a platform for development, and what’s next for the Firefox development team.
http://www.betanews.com/article/Firefox_Architect_Talks_IE_Future_Plans/1101740041 (3 pages)

 

A Trajectory for the Evolution of SIMS Architecture by Paco X. Nathan, Mike W. Erwin, Jamie L. Pugh, and William W. Hurley

Methods of integrating security point solutions into a consistent framework are slowly emerging in the form of security infrastructure management systems (SIMS). The authors, including speaker William Hurley, look at the evolution of SIMS and present a description of a SIMS appliance as the next step.
www.ttivanguard.com/miamireconn/SIMSEvolution.pdf (17 pages)

 

Non-Equilibrium Risk Models in Enterprise Network Security by Paco Nathan and William Hurley

This paper explores the notion of a security infrastructure as an autopoietic system and presents a form of quantitative analysis for risk aggregation called “non-equilibrium risk models” (NERM).
www.ttivanguard.com/miamireconn/NERM.pdf (31 pages)

 

Hey Google, Map This! by Daniel Terdiman

Though not authorized by Google, several companies are devising Google hacks, giving users new ways to use information from Google’s map service.
http://www.wired.com/news/print/0,1294,67514,00.html (4 pages)

 

Interview: Google CEO Eric Schmidt Talks Business Technology by Thomas Claburn

Google’s CEO Eric Schmidt talks about the business-technology market and how Google might change it, including sales models and the “Wal-martization” of the information economy.
http://informationweek.com/story/showArticle.jhtml?articleID=163701405 (5 pages)

 

Engineering Enterprises Using Complex-System Engineering by M.L. Kuras and B.E. White

This paper summarizes a complex-system engineering regimen for the deliberate and accelerated management of the natural processes that shape the development of complex systems and proposes an approach for applying this regimen to enterprises.
www.ttivanguard.com/miamireconn/Complex-System.pdf (15 pages)


ADDITIONAL ARTICLES, WEB SITES, AND RESOURCES

Publications by speaker Richard Lenski
Richard Lenski has written papers in the following areas: long-term evolution experiment with E. coli; evolution experiments with digital organisms; social bacteria; variable environments; interactions among mutations; mutation rates; host-parasite interactions; and antibiotic resistance.
http://www.msu.edu/user/lenski/

Publications by speaker Steen Rasmussen
Steen Rasmussen focuses on representing, generating, analyzing, and controlling self-organizing and related systemic processes as they are manifested in natural and human-made systems. Current and recent projects include assembly of protocells, web-based decision support systems, and the development of simple urban dynamics simulations.
http://www.ees.lanl.gov/EES5/staff/steen/papers/

SPEAKER BLOGS

My Heart’s in Accra - Ethan Zuckerman
Speaker Ethan Zuckerman concentrates on his work and research on Africa, technology, and the media.
http://blogs.law.harvard.edu/ethan/

Sutter’s (Online) Mill - Herb Sutter
Speaker Herb Sutter is a part of Pluralsight, which emphasizes the building of a strong technical community that will benefit developers everywhere. The Pluralsight technical staff share a unique passion for software technology, with over 50 years of collective experience in software development, research, writing, speaking, and professional training focused primarily on the various Microsoft development platforms. Pluralsight focuses on three main areas: building community, providing professional training, and developing technical content.
http://pluralsight.com/blogs/hsutter/

Blake Ross on Firefox and Beyond – Blake Ross
http://www.blakeross.com/

Progress and Freedom Foundation
PFF is a market-oriented think tank that studies the digital revolution and its implications for public policy. Its mission is to educate policymakers, opinion leaders, and the public about issues associated with technological change, based on a philosophy of limited government, free markets, and individual sovereignty. PFF's underlying philosophy combines an appreciation for the positive impacts of technology with a classically conservative view of the proper role of government.
http://www.pff.org

Publications from the Progress and Freedom Foundation
The PFF’s primary areas of study include communications, intellectual property, e-commerce, energy, and competition.
http://www.pff.org/issues%2Dpubs/

IPcentral.info
IPcentral.info is the Internet arm of the Center for the Study of Digital Property, a wholly owned part of The Progress and Freedom Foundation. IPcentral’s belief is that commitment to free markets and property rights, combined with the rule of law rather than the rule of micro-regulation or the whim of a ruling class, is the best way to organize human economic activity. Further, this wisdom applies to the rising world of intellectual creations and digitization as much as to the conventional worlds of physical goods.

http://ipcentral.info/about/index.html

Electronic Privacy Information Center
EPIC is a public interest research center in Washington, D.C., established in 1994 to focus public attention on emerging civil liberties issues and to protect privacy, the First Amendment, and constitutional values. EPIC publishes an award-winning e-mail and online newsletter on civil liberties in the information age – the EPIC Alert. They also publish reports and books about privacy, open government, free speech, and other important topics related to civil liberties.
http://www.epic.org/

Privacy International: Privacy and Human Rights 2004: An International Survey of Privacy Laws and Development
This annual report by the Electronic Privacy Information Center and Privacy International reviews the state of privacy in over 60 countries around the world. It outlines legal protections for privacy, and summarizes important issues and events relating to privacy and surveillance. Each country report covers the constitutional, legal and regulatory framework protecting privacy and the surveillance of communications by law enforcement, new landmark court cases, most noteworthy advocacy work of non-governmental organizations and human rights groups, various new developments, and major news stories related to privacy.
http://www.privacyinternational.org/survey/phr2004/

Legal Briefs filed with the US Supreme Court in the case of MGM, et al. v. Grokster

James V. DeLong and Solveig Singleton. “Brief of Amicus Curiae - The Progress & Freedom Foundation in Support of the Petitioners, in the Supreme Court of the United States, Metro-Goldwyn-Mayer Studios Inc., Et Al. (Petitioners) vs. Grokster, Ltd., Et Al. (Respondent).” The Progress & Freedom Foundation, January 24, 2005.
http://www.pff.org/issues-pubs/filings/050124groksteramicus.pdf

Brief for US Supreme Court case 04-480: MGM, et al. v. Grokster and StreamCast filed by attorneys for Grokster and StreamCast.
www.abanet.org/publiced/preview/briefs/pdfs_04-05/04-480Resp.pdf

Neuroeconomics at the Baylor School of Medicine Human Neuroimaging Laboratory (HNL)
HNL is a new, state-of-the-art facility dedicated to basic research investigations into the physiology and functional anatomy of the human brain using fMRI. Psychology and economics have successfully demonstrated that human behavior is not endlessly variable, but can be captured and quantified by testable laws. Modern neuroscience techniques can identify and link individual differences in decision-making behavior to differences in brain anatomy, brain responses, genetics, and so on. In particular, modern neuroimaging techniques provide the means by which human brain responses can be monitored while subjects are engaged in economic-based behavioral tasks. The lab has been employing this methodology to study well-quantified group behavioral scenarios in which monetary outcomes vary depending on how people cooperate, compete, or punish others. The goal of this work is centered on the idea of valuation, especially neural valuation. Valuation is a central concept in economics. In this domain, the value of goods must be put on a common scale in order to compare, contrast, and prioritize their value. In this context, the idea of a currency provides just such a common valuation scale.
http://www.hnl.bcm.tmc.edu/neuraleco.html

Neural Economics and the Biological Substrates of Valuation by P. Read Montague and Gregory S. Berns
A recent flurry of neuroimaging and decision-making experiments in humans, when combined with single-unit data from orbitofrontal cortex, suggests major additions to current models of reward processing. The authors review these data and models and use them to develop a specific computational relationship between the value of a predictor and the future rewards or punishments that it promises. The resulting computational model, the predictor-valuation model (PVM), is shown to anticipate a class of single-unit neural responses in orbitofrontal and striatal neurons. The model also suggests how neural responses in the orbitofrontal-striatal circuit may support the conversion of disparate types of future rewards into a kind of internal currency, that is, a common scale used to compare the valuation of future behavioral acts or stimuli.
http://www.hnl.bcm.tmc.edu/articles/neuronread.pdf
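
The PVM itself is specified in the paper; purely for orientation (this is the standard discounted-reward formulation such models build on, not the authors' exact equations), the value of a predictor and its moment-to-moment prediction error can be written as

    V(p_t) = E\left[ \sum_{k=0}^{\infty} \gamma^{k} r_{t+k} \right],
    \qquad
    \delta_t = r_t + \gamma\, V(p_{t+1}) - V(p_t),

where 0 \le \gamma < 1 discounts delayed outcomes. Because every future reward or punishment is folded into the single scalar V, disparate outcomes end up on one scale, which is the "internal currency" idea the abstract refers to.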


REFERENCES FROM PREVIOUS TTI/VANGUARD CONFERENCES

Previous TTI/Vanguard Conferences have contained discussions and presentations
on a number of topics related to those being presented at this conference.
These may be accessed from the Members’ section of our web site
(www.ttivanguard.com) as Reinforcements and as the actual presentations.

EVOLUTION, SELF-REPLICATION, AND ARTIFICIAL LIFE
    

Where Life Meets Technology – TTI/Vanguard Conference
July, 2003 – Madrid, Spain

Technology and Evolving Systems: (Simplifying) The Complexity and Integration – TTI/Vanguard Conference
February, 1999 – Miami, Florida

The Future of Digital Biology – Dr. Peter Bentley
Biology—awe-inspiring in its complexity and versatile in its problem-solving approaches—holds secrets of significance not only to those interested in the forces of life but also to computer scientists. Indeed, surprisingly simple rules can often mimic life processes when implemented in digital systems. As portrayed in his book Digital Biology, Peter Bentley of University College London renders computational analogs of the biological wonders of evolution, insects and swarm dynamics, the brain, plants, the immune system, and the processes of growth and development. The result is a better understanding of life’s approach to the essential organismic capabilities of self-design, adaptation, self-repair, and self-building, as well as of their application to a wide variety of purposes far removed from those that serve to fuel and sustain biological creatures and ecologies.
April, 2002 – Toronto, Canada

Genomics and the Human Mind – Dr. Tom Ray
“The genomic databases are brand new, they’re vast information resources, they’re filled with the footprints of evolution, and they’re pretty unworked,” marvels the University of Oklahoma’s Tom Ray, and they entice him into their depths to tease out the connections between genetics and the subtleties of the human mind. The behavior of an organism is an interplay between genetic predisposition and environment. Little viable opposition greets the notion that genes causally lead to primary physical characteristics; nevertheless, acceptance of the link between the innate molecular definition of an individual and that being’s personality has seen slower adoption. In an effort to accelerate understanding in this area, Ray, although new to the field, takes a multifaceted approach to uncovering the influence of the genome on the human mind.
April, 2002 – Toronto, Canada

Artificial Life: A New Form of Intelligence – Dr. Chris Winter
It is a broadly held expectation that computers, mediated by software, will perform in predictable ways, with any deviations from anticipated results greeted with frustration. However, simulations offer an alternative to deterministic programming. Whereas traditional software tends to be effective when a single result is desired, open-ended problems are best addressed with an open-ended approach. Chris Winter of CyberLife Technologies is not locked in by the typical computer science constructs. So-called second-order programming—coding with the intent of creating a set of capabilities, as opposed to a specific function—requires adherence to an unconventional philosophy. The essence of artificial life (ALife) is the emergence of collective intelligence from an assemblage of individual entities, which nature exhibits in colonies of ants, flocks of birds, and schools of fish, for instance. Simple algorithms can model such behavior using one of several techniques, such as neural networks, genetic algorithms, case-based reasoning, fuzzy logic, or expert rules systems. However, a combination of methods allows for a richer set of outcomes than is possible with a homogeneous approach.
October, 1999 – Phoenix, Arizona

Complexity and the Evolution of Computing: Metaphors for Managing Emergent Systems – Dr. Steve Burbeck
“We are recapitulating the transition from single-cell to multicellular organisms in the domain of computation,” asserts Steve Burbeck of IBM Software, as he looks toward biology for metaphors for today’s networked computational environment. Clearly, the natural world does not disappoint. The single-cell analogy is a lone computer, such as an unnetworked PC, but enable interaction among them and something distinct emerges. Biology accomplished this feat more than half a billion years ago when multicellular organisms first enjoyed the benefits of overcoming the challenges of conjoined behavior; computer scientists lack such an extended head start. As such, multicellular computing has not progressed up the evolutionary ladder to rival the development of, say, mammals in the biological regime. Instead, networks today more closely resemble relatively simple collectives like biofilms or slime mold—relatively simple, but still rife with complexity.
September, 2004 – Los Angeles, California

All Species Inventory – Mr. Stewart Brand
The code for life might be in each organism’s DNA, but the functional foundation of life itself is its diversity of form and function, as well as the interdependence among components of the biosphere. In this domain, the species is the fundamental unit of measure, but not only have scientists not discovered, described, and categorized all of the planet’s species, estimates of the number awaiting discovery span an order of magnitude (~10M–100M). Of these, only a fraction have been named: 1.6M–1.75M species, with perhaps as much as 20% redundancy. With accelerating rates of extinction—due variously to natural processes, the introduction of invasive species through globalization, and human-mediated habitat destruction—the clock is ticking on scientists’ opportunity to discover what exists before it is gone forever. This forms the basis for the All Species Inventory, which is a grand-scale project cofounded by Stewart Brand that intends to name all life on Earth in the next 25 years.
February, 2002 – Pasadena, California

Strategies of Living Systems: Computation in the Wild – Dr. David Ackley
Biological metaphors are endemic in computer circles. Computers become infected with viruses, laptops hibernate, and programs die. But to what extent are human-designed machines analogous to living systems? “The fact of the Internet—the fact that we now have millions of interactions across administrative domains,” emphasizes David Ackley of the University of New Mexico, “means that manufactured computer systems are now facing the same environment that biological systems faced from the beginning.” This, however, was not always the case, with early computers existing in mutual isolation. Those times are over, and the historical period during which all qualitative distinctions between machines and life were apparent is coming to an end. At its essence, life is an interactive undertaking; networked computing has similarly reached a point of no return from interconnectivity.
July, 2002 – Brussels, Belgium

Things That Live: Progress Towards Living Artifacts – Dr. Christopher Langton
Christopher Langton of Swarm Corporation seeks to understand the emergent dynamics exhibited by complex adaptive systems. Through computer simulation, collections of individuals, each operating under a simple set of locally applicable rules, are seen to display coordinated behavior and together create a living tapestry whose global character cannot be predicted from the component elements. The area of study of human-made systems that feature behaviors characteristic of natural living systems has been dubbed “artificial life.” In addition to a deeper understanding of biological and ecological systems, progress in this field may lead to an increased grasp of phenomena in such diverse arenas as sociology, economics, geopolitics, and molecular dynamics, as well as the emergent behavior that may become evident as large numbers of machines become interconnected.
November, 1998 – Los Angeles, California

Digital Hormones and Self-reconfigurable Systems – Dr. Wei-Min Shen
In a PC-dominated environment, software designed for alternative platforms is sometimes overlooked. However, Wei-Min Shen of USC’s Information Sciences Institute applies his talents to modular robots, in an effort to gain a full understanding of self-reconfigurability in the mechanical domain. Without constraints imposed by legacy code and entrenched convention, his group instead has devised a system governed by distributed, autonomous, dynamic control that resembles the hormonal influence extant within biological organisms.
April, 2001 – Los Angeles, California

   
SECURITY AND PRIVACY
Security and Privacy – TTI/Vanguard Conference
February, 2004 – Austin, Texas

Risk, Security, and Uncertainty – TTI/Vanguard Conference
February, 2001 – Austin, Texas

   
INTELLECTUAL PROPERTY
Intellectual Property and the Entertainment Industry – Mr. Jack Valenti
The movie industry provides more to the United States than merely entertainment. It provides money. The aggregate contribution of copyright-protected media measures fully five percent of the U.S. gross domestic product, with a consistently positive international balance of trade. These media types comprise movies, television, home video, books, music, and computer software. As technology undergoes both evolution and revolution, protective methods fight to keep pace. Those who defend the owners of intellectual property use every arrow in their quivers, including the law, multi-industry covenants, and technological innovations of their own—and often all of these strategies in concert. Jack Valenti of the Motion Picture Association of America (MPAA) energetically advocates for a safe haven for movie industry rights holders, principally the major movie studios, claiming that without “protective garments” within which to wrap their products, the substantial risk capital necessary to spur the creative process would cease to flow, thus leaving an artistic void in the theaters and on the home screens of America and abroad.
February, 2002 – Pasadena, California

Interplay of Intellectual Property and Information Technology – Dr. Robert Kahn
With the explosion of content being placed on the Internet, categorization of intellectual property is the current focus of Robert Kahn of the Corporation for National Research Initiatives (CNRI). By analogy to the International Standard Book Number (ISBN) system of categorizing books, standards are being developed to find, track, verify the accuracy, and control the use of intellectual property. In order to discuss the developing categorization infrastructure, clear definitions of the relevant terms are required. A copy is a physical object with a well-defined date of creation that can receive protection of copyright. That is, a copy is a material manifestation of a work. Documentation of the existence of the copy, even if only on a hard drive, is necessary for copyright protection. Since 1976, copyright comes into existence by the fact of creation of the work, not through the submission of an application. Distribution is the passing along of a physical object, i.e., a copy. Copyright law permits making copies of a performance, but not of the work that the performance is based on. It should be noted that the concept of fair use does not exist in copyright law, but rather in its application through the courts; fair use does not apply to copies, but to the arena of public display (i.e., performance).
September, 1998 – Washington, DC

The Architecture of Resiliency – Dr. Larry Lessig
“Creativity always builds on the past. The past always tries to control the creativity that builds upon it. Free societies enable the future by limiting the past’s ability to control it. Ours is less and less a free society.” This regressive argument forms the crux of Stanford University law professor Larry Lessig’s pessimism regarding the very future of ideas, hence his recent book of the same name. He examines the degeneration of the intellectual commons within the historical context of copyright law and its relationship with technology.
July, 2002 – Brussels, Belgium

The Right to Know – Mr. John Perry Barlow
Freedom on the Internet is far from uniquely defined. Electronic Frontier Foundation co-founder John Perry Barlow considers online freedom as comprising freedom of expression and the freedom to share ideas and information. In short, the Internet should be an intellectual commons. Conversely, the conglomerate of entertainment and communications enterprises—an increasingly small number of progressively more powerful commercial entities—summons the forces of law and technology to define freedom in terms that it believes will garner economic gain for its stakeholders. Despite the free-enterprise/protection-of-risk-capital spin with which this faction overlays its arguments, Barlow claims that content providers and distributors will ultimately render the domain of the Internet into a “private totalitarian” regime that might satisfy some of its own needs, but will do so at the expense of democracy and his version of freedom.
February, 2002 – Pasadena, California

   
OPEN SOURCE
The Free-software Movement and Its Future – Mr. Richard Stallman
Freedom—the principle upon which the United States of America was founded. Yet, for most software and other published works, people are explicitly denied the freedoms of modification, sharing, and redistribution. As an analogy, imagine a society where recipes were similarly constrained: by force of law, a person who enjoyed cooking would be forbidden to play around with a recipe, write down a favorite version for friends to try, or pass along directions for producing a delicious dish that the cook received from a fellow culinary craftsperson. These, however, are the types of rules that govern the use of proprietary software. Richard Stallman, the originator and driving force behind the GNU project for free software, counts himself as a Saint in the Church of Emacs and lives his life in pursuit of freedom for all software.
April, 2001 – Los Angeles, California
   
SEMANTIC WEB
Enterprise Semantic Web: The Changing Face of Corporate Data – Mr. Jeff Pollock
Will the current technology foundation of Java, XML, and relational databases (RDBs) satisfy the needs of the dynamic, multiorganizational, interdisciplinary network of enterprises in the future? No, believes Jeff Pollock of Network Inference; indeed it does not satisfy the internal needs of single enterprises today. Tim Berners-Lee has long held the vision of expanding his World Wide Web far beyond a web of documents and into an interoperable, cooperative network of machines and people. The World Wide Web Consortium (W3C) was established to facilitate this broad-based goal. Among its chief challenges is to develop a universal and extensible environment through which machines might autonomously share, interpret, and reuse data, regardless of the originating organization, application, or schema. Pollock acknowledges that there are holes in the W3C’s ambitious game plan; however, when applied more narrowly to the business environment, instead of the entire landscape of the Web, the outlook for the Semantic Web is promising indeed. His finite goal entails connecting ontologies to existing legacy data systems, be they RDBs, XML documents, unstructured sources, or object-oriented systems.
February, 2005 – San Francisco, California

Semantic Information Management – Dr. Zvi Schreiber
Within most of today’s large companies, a proliferation of databases plagues information technologists. With each database comes not only its own schema, but also, in general, a distinct semantic underpinning. The solution to the resultant morass tends to be one-to-one mappings between pairs of databases. In the extreme, N databases yield on the order of N² such relationships; each such pairing must be independently updated when a database or business process is modified. At Unicorn Solutions, Zvi Schreiber offers a better plan for managing structured information assets within the corporate environment in the form of a hub-and-spoke model.
September, 2003 – Washington, DC
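
To make the scaling argument concrete (illustrative arithmetic, not figures from the talk): with N databases, point-to-point integration needs a mapping for every pair, whereas the hub-and-spoke model needs one mapping per database:

    \text{pairwise mappings} = \binom{N}{2} = \frac{N(N-1)}{2}
    \qquad \text{vs.} \qquad
    \text{hub mappings} = N

For N = 20 databases that is 190 point-to-point mappings versus 20 hub mappings, and a schema change then touches only the affected database's single mapping to the hub.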

Bringing the Web to Programs – Dr. R.V. Guha
The World Wide Web makes publishing on the Internet sufficiently easy to permit anyone to participate—and everyone does. This vast human-readable forum has altered the way people plan their daily lives, gather information, and make purchases. Programs, however, are unable to access the wealth of data distributed across the Web, powerless as they are to read unstructured text. Even a site that has been augmented with structured extensible markup language (XML) metadata is substantially indecipherable by machines beyond the borders of its own domain, despite early promises of Web-wide compatibility. The source of the broken promise is a semantic disconnect that was built into the XML specification, although this seeming drawback is what enabled its very extensibility and thus should not be ridiculed unduly. The lack of a common set of descriptions divides data domains. Even when this problem is overcome, as is the goal of the Semantic (or Data) Web—as put forth by R.V. Guha of IBM Research, Almaden—how will machines know what data is trustworthy? This forms the second conundrum of the Data Web. These are the issues explored by the TAP project, collaboratively executed by IBM, Stanford University, and the World Wide Web Consortium (W3C). On the project’s behalf, Guha proposes a coordinating framework for disintermediated machine-to-machine computation and data sharing, but recognizes that it will require the emergence of a killer app to propel this concept forward so that it shares the prominence enjoyed by the web of documents.
September, 2002 – Miami, Florida

The Dublin Core and Internet Commons – Dr. Stuart Weibel
Stuart Weibel of the Online Computer Library Center and a founder of the Dublin Core Metadata Initiative (DCMI) was among those to see the need for metadata standardization of Web content as early as 1994. Back then, there were all of 500,000 individually addressable objects on the Web. Now, the Web is more than a collection of technologies and protocols; it has exploded into a platform for social, intellectual, and commercial discourse and interaction. Metadata, when constructively applied, can enhance the potential for both human and machine processing of the vast informational medley that awaits just beyond every browser window.
September, 2003 – Washington, DC

   
MISCELLANEOUS
Which People and Which Technologies? – Mr. Ethan Zuckerman
The developing world faces daily challenges, many of which are unimaginable to people who live the comfortable existence found in countries of wealth. In the most dire districts, gnawing hunger, raging disease, persistent violence, or unstable government—or a combination of these harms—contribute to life at the edge of death. The call for international relief is heard, and attempts to heed it are made. What, then, about regions that are merely afflicted with acute poverty, inadequate transportation infrastructure, and a severe shortage of telecommunications opportunities, when measured by the standards of the North? For countries such as Ghana in West Africa, international development, not relief, is in order. Through the efforts of Ethan Zuckerman of Geekcorps, help is now on the ground in the form of (primarily) U.S. technologists who volunteer three months of their lives laying the intellectual groundwork for Ghanaian entrepreneurs to bootstrap their society into the information age.
November, 2000 – Atlanta, Georgia


Copyright 2005 TTI/Vanguard. All Rights Reserved
Send comments to: webmaster@ttivanguard.com