How will Human Life be Irreversibly Transformed in the next 25+ Years?

April 25, 2006

Singularity Summit at Stanford – May 13, 2006

I will be watching for online notes, PowerPoint slides, and audio/video recordings of these talks. If you are in the San Francisco area, drop in, capture some audio or video, and post it.

The free event will be held in Stanford Memorial Auditorium, 551 Serra Mall, Stanford, CA 94305. Seating is limited. Please RSVP. For further information: sss.stanford.edu or 650-353-6063.

Speakers:
Nick Bostrom – Cory Doctorow – K. Eric Drexler – Tyler Emerson – Douglas R. Hofstadter – Steve Jurvetson – Ray Kurzweil – Bill McKibben – Max More – Christine L. Peterson – John Smart – Peter Thiel – Sebastian Thrun – Eliezer S. Yudkowsky

Among the issues to be addressed:
Bostrom: Will superintelligence help us reduce or eliminate existential risks, such as the risk that advanced nanotechnology will be used by humans in warfare or terrorism?

Doctorow: Will our technology serve us, or control us?

Drexler: Will productive nanosystems enable the development of more intricate and complex productive systems, creating a feedback loop that drives accelerating change?

Hofstadter: What is the likelihood of our being eclipsed by (or absorbed into) a vast computational network of superminds, in the course of the next few decades?

Kurzweil: Will the Singularity be a soft (gradual) or hard (rapid) takeoff, and how will humans stay in control?

More: Will our emotional, social, psychological, ethical intelligence and self-awareness keep up with our expanding cognitive abilities?

Peterson: How can we safely bring humanity and the biosphere through the Singularity?

Thrun: Where does AI stand in comparison to human-level skills, in light of the recent autonomous robot race, the DARPA Grand Challenge?

Yudkowsky: How can we shape the intelligence explosion for the benefit of humanity?


Berkeley’s course audio now on iTunes for free!

April 25, 2006

UC Berkeley on iTunes U
Having difficulty with a subject? Wish you had studied more economics or art?

Berkeley currently has audio course lectures in:

  • Computer Science
  • Interdisciplinary Studies
  • Biological Science
  • Integrative Biology
  • Bioengineering
  • Electrical Engineering
  • Earth and Planetary Science
  • Physics
  • Art Practice
  • Philosophy
  • Chemistry
  • Nutritional Science
  • Environmental Science, Policy & Management
  • Economics
  • History
  • Political Science
  • Geography

Stanford on iTunes
Stanford has some faculty audio lectures, but they are not currently organized to this level. Most that I have listened to have been popular lectures drawn from various areas.

MIT OpenCourseWare | OCW Home
MIT OpenCourseWare is predominantly text-based.



Link List: Reading 2.0 Summit

March 17, 2006

So much foundational work is happening now that is ensuring the solidity of the bedrock of the Library 2.0+ meme. The future of publishing is indeed wide open.

O’Reilly Radar > Link List: Reading 2.0 Summit

Link List: Reading 2.0
By tim on March 16, 2006

Here’s a reading list of links to summarize the discussion at the Reading 2.0 summit held today in San Francisco:

  • Organizer Peter Brantley of California Digital Library opened the meeting. His blog.
  • I talked about Web 2.0 as it applies to the future of publishing. In particular, I talked about how ideas such as harnessing collective intelligence, the perpetual beta (dynamic content), the long tail, remixing, and open formats are being used on the consumer internet, and how these ideas ultimately need to be applied to the world of books and scholarship as well. I also talked about the idea that “worse is better” systems seem to propagate better than carefully designed systems that cross all the ‘t’s and dot all the ‘i’s. I used Bill Joy’s phrase “the future doesn’t need us” to remind participants that existing models have no guaranteed persistence, and suggested that competition to the book will come from very different forms of content that do similar “jobs” as particular types of books.
  • John Kunze of California Digital Library talked about Archival Resource Keys used by the CDL, also known as ARK identifiers. In brief: “The ARK identifier is a specially constructed, globally unique, actionable URL….Entering the ARK in a web browser produces an object. Entering the ARK followed by a single question mark (“?”) produces the metadata only. Entering the ARK followed by a double question mark (“??”) produces a CDL commitment statement.” I like the ? and ?? idea as a hack to overload traditional URLs more than I like the whole idea of a registry for persistent URLs; a short sketch of the convention appears after this list. (Related, but not discussed at the conference: the way Connotea bookmarks are always followed by an (info) link that provides metadata.) A “commitment statement” is metadata about the permanence of the item — a lot of librarians are concerned about preservation, and this is a way to share the commitment to maintaining an item.
  • Herbert van de Sompel of Los Alamos National Laboratory presented his OpenURL framework. Briefly: “Citations to other works are familiar to any scholar- they ground a work of scholarship to a field of study, put new research into context, and often give credit where credit is due. The essence of citation is to identify the previous work with a set of metadata-author, title, and particulars of publication. The idea behind OpenURL is to provide a web-based mechanism to package and transport this type of citation metadata so that users in libraries can more easily access the cited works. Most typically, OpenURL is used by subscription-based abstracting and indexing databases to provide linking from abstracts to fulltext in libraries. An institutional subscription profile is used together with a dynamic customization system to target links at a user’s OpenURL linking service.” But more specifically, one of the types of services enabled by openURL metadata is navigation of the various permissions granted to various institutions by their subscriptions to scholarly content. The openURL services thus create a kind of transparent rights management layer. Here’s a list of the metadata formats most often used with OpenURL. (An example OpenURL appears after this list.)

    OCOinS, or Open Context Objects in Spans, are a way of embedding OpenURL citations directly in HTML without using the associated lookup service.

    Adam Smith, product manager for Google Book Search and Google Scholar, points out how Google Scholar supports OpenURL, and notes that Scholar really shows a possible future for GBS if publishers do start providing their own digital content — the current scanning initiative is to bootstrap the acquisition of content that’s not currently in digital form. But in the scholarly world, the content is already generally digital, and Google has taken on its traditional role as a search engine, albeit in a world that requires something like OpenURL to navigate the permissions that are required for much of this content.

  • Chad Dickerson, now at Yahoo!, lives on the consumer side. He agrees: worse is better. He talks about Yahoo’s use of microformats, simple conventions for embedding semantic markup in HTML documents. Discussed as examples: hCalendar (hcal) as used by upcoming.org, Creative Commons Search, rel=nofollow, and rel=tag.
  • Herbert was back up, talking about OAI-PMH, the Open Archives Initiative Protocol for Metadata Harvesting. This is a protocol for managing metadata about additions, deletions, and updates to XML collections, allowing for the easy creation of mirrors. The harvester first asks which metadata formats are supported (Dublin Core, MARC/XML, MPEG-21, or METS); once a format is chosen, it asks for all time-stamped updates to the metadata, then potentially makes selected data requests. (A sample harvesting exchange appears after this list.)

    Brewster Kahle notes that Yahoo! monitors the activity on the Internet Archive via OAI-PMH. But more importantly, it’s a method for accessing “the deep web.” Lorcan Dempsey also talks about the way that OAI is used to share info about repositories of theses and dissertations.

  • Lorcan Dempsey of OCLC talked about FRBR, Functional Requirements for Bibliographic Records, and FictionFinder as a demonstration of its utility. Which “Huck Finn” do you want? Very cool. You have a work, realized through an expression (e.g. an illustrated edition, a Spanish edition, an abridged edition, a spoken word edition), embodied in a manifestation (the 1954 Penguin edition), and an item (an actual copy of that manifestation). An adaptation, on the other hand, would be considered a different work. Only items actually exist. Everything else is a concept, metadata about that item that helps us to categorize it. FictionFinder not only lets you find different manifestations of a work, it also lets you do cool metadata search: find me bildungsromane taking place on the Mississippi, or detective novels taking place in Columbus, OH. Not just the usual metadata! Apparently an improved version of FictionFinder will be out in a month or so.

    In a takeoff on my language, Lorcan notes that “OCLC is a way of harnessing the collective intelligence of catalogers.” It’s a database of about 47 million works (28 million print books), in 60 million manifestations. On average, there are 1.3 manifestations of a work — i.e. most works get only one manifestation. 87% have only one; 12% between two and five, and only 1% have more than 5.

    In some ways, the number of libraries holding a book can be seen as a kind of PageRank for the popularity of books (at least among librarians, if not among their customers :-). Some services based on this idea: Top 1000 works by library holding (The top ten: The Bible, the US Census, Mother Goose, the Divine Comedy, the Odyssey, The Iliad, Huck Finn, Lord of the Rings, Hamlet, Alice in Wonderland.) Audience Level, a service that estimates the audience level of a book by measuring the type and number of libraries that hold it. (For example, a book held by a public library receives a different score than one held by a research library.)

    Given an ISBN, the xISBN service finds alternate editions of the same work. Nice. (See the xISBN sketch after this list.)

  • Bill McCoy, Adobe: We shape our product thinking by understanding that there are four levels of content: level 0 represents content as actual bits such as ink on paper or pixels on a screen, level 1 gives final-form representation (e.g. pdf) that is faithful to the bits but may be scalable etc., level 2 gives reflowable presentation (e.g. html + css), and level 3 gives a real separation of content and presentation. He presents some non-bloggable ideas from the Adobe labs about how Adobe is thinking about moving content through those levels.
  • John Mark Ockerbloom of the University of Pennsylvania discusses his work on communities around preservation of content, and particularly focuses on the importance of preserving evanescent content such as periodicals, which give a window on an era. Hopes to see everything in the public domain becoming available online. In order to do that, we need to know what’s out of copyright.

    He describes his work on the catalog of copyright entries, and points to a directory of first copyright renewals for periodicals: “Most periodicals published in the US prior to 1964 had to renew their issue copyrights after 28 years in order to retain copyright on the issue….Below is a list of periodicals and their first copyright renewals, if any. The list below should include all of the more than 1000 periodicals that renewed between 1950 and 1977, and selected periodicals that renewed between 1978 and 1992. (After 1992, copyright renewal was no longer required.)”

  • Juliet Sutherland of Distributed Proofreaders: “This site provides a web-based method of easing the proofreading work associated with the digitization of Public Domain books into Project Gutenberg e-books. By breaking the work into individual pages, many proofreaders can be working on the same book at the same time. This significantly speeds up the proofreading/e-book creation process….When a proofreader elects to proofread a page of a particular book, the text and image file are displayed on a single web page. This allows the page text to be easily reviewed and compared to the image file, thus assisting the proofreading of the page text.” Volunteers have produced over 8100 titles (periodicals or books); another 4000 are in progress. We need to get beyond scanned images of works to plain text, which can be remixed and used in other ways. This is one way to get there. Juliet says: “Come do your page a day!”

    A lot of similarities to Amazon’s Mechanical Turk in the way tasks are broken into atomic units (one random page at a time) rather than as larger units. 400–500 unique volunteers per day. Each physical page gets looked at four times. (This is good prior art if Amazon tries to patent the Turk. In addition, Yochai Benkler discusses the issues in splitting up work like this in Coase’s Penguin.)

  • Dale Flecker of Harvard talks about the lack of norms in citation: “When you pick up a pointer, there’s no standardized expectation of what you’re going to get.” He points to a bunch of sites that do interesting, but different things. Wishes for a system that gives multi-valued pointers — showing options for reaching different versions of an item.
  • John P. Wilkin, University of Michigan: There are a lot of rights, and no easy answers. Describes the methodology used by the University of Michigan to determine the rights status of works being digitized by the UMich library. Seven status categories – public domain (pd), in copyright (ic), out of print and brittle (opb), orphaned because no copyright owner can be found (orph), undetermined (und), open to U Mich by contract (UMall), open to the world by contract (world). Reason codes – bibliographically by copyright date (bib), no copyright notice (ncn), by contract (con), or by due diligence (ddd). Some interesting discussion about the problem of embedded rights — e.g. does the work include other works (e.g. a photo) that have different rights? (At O’Reilly, we’re struggling with this problem right now. We want to put out some of our content under open source licenses, without actually granting our competitors the right to copy the format. Think Hacks books or cookbooks.)
  • Ammy Vogtlander of Elsevier also talks about rights. Because they have a small community, infringement is easy to determine, and rights are determined by contracts between institutions. And because readers are also the authors, managing DRM violations is often as much a PR and community relations issue as a legal one. Would love to see some mechanism developed for search engines to recognize and skip content that has been discovered outside its permitted domain.
  • Jon Noring of OpenReader talks about “Open DRM.” He says that the acronym DRM is coming to be a lot like the acronym UFO, where an “unidentified” flying object is actually identified in the popular mind as an alien spacecraft, and thus the term has come to mean something very different from its literal meaning. A copyright notice is a type of DRM. DRM should not be confused with “TPM” or “technical protection measures.” And TPM doesn’t really work. If something is popular, it will be pirated. 48 hours after Harry Potter is published [12 hours for the latest book], people have retyped it in (correcting errors along the way) and put it into distribution. Meanwhile, the protection mechanisms irritate ordinary users. Publishers are also worried about protection systems controlled by a single vendor. There’s a desire for an open source type system that isn’t controlled by any vendor, and that allows flexible permissions. In discussion, Cliff Lynch points out that the Creative Commons is in fact just that: a mechanism of asserting rights, without inconveniencing readers.
  • Brewster Kahle of the Internet Archive and the Open Content Alliance offers to be a place to bring together best practices and a place for experiment. He requests more collaboration around issues and concrete projects that show what the digital future can look like. We can use the out of copyright materials as a testbed.
  • Cliff Lynch of CNI spoke about the future value of collections of content that we can not just read but compute on. Finding and clustering are just the beginning. What new services will we be able to build? We’ll be moving beyond individual people interacting with small groups of texts. He also talks about the difference between knowledge and information sources, which are somewhat fungible, and works of artistic expression, which are much less so. We may end up needing very different methods for dealing with these two types of work.

    He also talks about sharing the burden of cleaning up the mess we’ve made in losing track of who owns what. This is so expensive we really don’t want to do it more than once.
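
A minimal sketch of the ARK ?/?? convention mentioned above, using curl (ark.cdlib.org is CDL’s resolver; the identifier here is illustrative, not a specific CDL object):

# An ARK is an actionable URL; the suffix changes what comes back.
curl "http://ark.cdlib.org/ark:/13030/ft4w10060w"     # the object itself
curl "http://ark.cdlib.org/ark:/13030/ft4w10060w?"    # brief metadata only
curl "http://ark.cdlib.org/ark:/13030/ft4w10060w??"   # the commitment statement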
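
And the OpenURL example promised above: a sketch of an OpenURL 0.1-style link. The resolver hostname is hypothetical; the key=value fields (genre, title, volume, spage, date) are standard citation keys from the convention:

# A citation packaged as an OpenURL, aimed at an institution's link resolver
curl "http://resolver.example.edu/openurl?genre=article&title=Some+Journal&atitle=Some+Article&volume=12&spage=34&date=2006"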
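
The OAI-PMH harvesting exchange described above looks roughly like this on the wire; the repository base URL is hypothetical, but the verbs and parameters come from the protocol itself:

# 1. Ask which metadata formats the repository supports
curl "http://repository.example.org/oai?verb=ListMetadataFormats"

# 2. Harvest all records updated since a given date, in Dublin Core
curl "http://repository.example.org/oai?verb=ListRecords&metadataPrefix=oai_dc&from=2006-01-01"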
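
Finally, the xISBN sketch: the service is just a URL you append an ISBN to. The endpoint below is OCLC’s research service as I understand it, and the ISBN is illustrative:

# Returns an XML list of ISBNs for other editions of the same work
curl "http://labs.oclc.org/xisbn/0596007973"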

Overall, a fascinating meeting. I learned a lot, and urge readers to follow some of the links above and learn about some of the amazing work being done by the library community.



Wikis and Blogs to Ease Administration

March 5, 2006

Empowering the average user with tools that are freely available: short application-generation cycles in a culture with cheap access to information are making room for cheap coordination. Hopefully the ease of setup and customization of these tools will soon catch up with their ease of use.

Using Wikis and Blogs to Ease Administration | Linux Journal

System administration can be like sailing a ship. You must keep your engines running smoothly, keep your crew and the harbors notified and up to date, and also maintain your captain’s log. You must keep your eye on the horizon for what is coming next. Two technologies have emerged over the past few years that could help keep you on course: wikis and blogs.

Maintaining Good Documentation
I find that one of the most difficult aspects of system administration is keeping documentation accurate and up to date. Documenting how you fixed a pesky problem today will help you remember how to fix it months later when it occurs again. If you ever have worked with others, you realize how critical good documentation is. Even if you are the only system administrator, you still will reap the benefits of good documentation, even more so if another sysadmin is ever brought on board.

Some goals of a good documentation system should be:

  • Make it easy for you and your coworkers to find relevant information.
  • Make it easy for new employees to come up to speed quickly.
  • Make it easy to create, edit and retire documentation.
  • Keep revisions of changes and who made them.
  • Limit who sees or edits the documentation with an authentication system.

Unfortunately, keeping your documentation up to date can be a full-time job in itself. Documenting, though not a very glamorous task, certainly will pay off in the long run.

Why a Wiki?
This is where a wiki comes in. From Wikipedia: “a wiki is a type of Web site that allows users to add and edit content and is especially suited for constructive collaborative authoring.”

What this means is a wiki allows you to keep and edit your documentation in a central location. You can access and edit that documentation regardless of the platform you are using. All you need is a Web browser. Some wikis have the ability to keep track of each revision of a changed document, so you can revert to a previous version if some errant changes are made to a document. The only obstacle a new user must overcome is learning the particular markup language of your wiki, and sometimes even this is not completely necessary.

One of a wiki’s features is also one of its drawbacks. Wikis are pretty free flowing, and although this allows you to concentrate on getting the documentation written quickly, it can make organization of your wiki rapidly spiral out of control. Thought needs to be put into how the wiki is organized, so that topics do not get stranded or lost. I have found that making the front page a table of contents of all the topics is very handy. However you decide to organize your wiki, make sure it is well understood by everyone else. In fact, a good first document might be the policy describing the organization of the wiki!

TWiki
There are several open-source wikis available, such as MediaWiki [see Reuven M. Lerner’s article on page 62 for more information on MediaWiki] and MoinMoin, each with its own philosophy on markup and layout, but here we concentrate on TWiki.

Some of TWiki’s benefits are:

  • A notion of webs that allows the wiki administrator to segregate collaboration into separate areas, each with its own set of authorization rules and topics.
  • A modular plugin and skin system that allows you to customize easily.
  • A well-established base of users and developers.
  • Revision control based on RCS.
  • It is Perl-based and mod_perl or FastCGI can be used.
  • Authentication is handled outside the wiki by mechanisms such as Apache htpasswd.

Installing TWiki is relatively easy, but the process still needs work. I hope that, as the beta progresses, we will see improvements in ease of installation and upgrading, along with clearer documentation.

First, create the directory where you want to install TWiki, say /var/www/wiki. Next, untar the TWiki distribution in that directory. Then make sure that the user with rights to run CGI scripts (usually apache or www-data) owns all of the files and is able to write to them.

# install -d -o apache /var/www/wiki
# cd /var/www/wiki
# tar zxf /path/to/TWikiRelease2005x12x17x7873beta.tgz
# cp bin/LocalLib.cfg.txt bin/LocalLib.cfg
# vi bin/LocalLib.cfg lib/LocalSite.cfg
# chown -R apache *
# chmod -R u+w *

Now copy bin/LocalLib.cfg.txt to bin/LocalLib.cfg, and edit it. You need to edit the $twikiLibPath variable to point to the absolute path of your TWiki lib directory, /var/www/wiki/lib in our case. You also must create lib/LocalSite.cfg to reflect your specific site information.
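
For the layout used here, that edit amounts to a single variable; a minimal sketch of bin/LocalLib.cfg, assuming the /var/www/wiki install path from above:

# bin/LocalLib.cfg -- tell the CGI scripts where the TWiki libraries live
$twikiLibPath = "/var/www/wiki/lib";
1;    # a required Perl file must end by returning a true value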

Here is a sample of what might go into LocalSite.cfg:

# This is LocalSite.cfg. It contains all the setups for your local
# TWiki site.
$cfg{DefaultUrlHost} = "http://www.example.com";
$cfg{ScriptUrlPath} = "/wiki/bin";
$cfg{PubUrlPath} = "/wiki/pub";
$cfg{DataDir} = "/var/www/wiki/data";
$cfg{PubDir} = "/var/www/wiki/pub";
$cfg{TemplateDir} = "/var/www/wiki/templates";
$TWiki::cfg{LocalesDir} = '/var/www/wiki/locale';

Here is a sample section for your Apache configuration file that allows this wiki to run:

ScriptAlias /wiki/bin/ "/var/www/wiki/bin/"
Alias /wiki "/var/www/wiki"

<Directory "/var/www/wiki/bin">
    Options +ExecCGI -Indexes
    SetHandler cgi-script
    AllowOverride All
    Allow from all
</Directory>

<Directory "/var/www/wiki/pub">
    Options FollowSymLinks +Includes
    AllowOverride None
    Allow from all
</Directory>

<Directory "/var/www/wiki/data">
    deny from all
</Directory>

<Directory "/var/www/wiki/templates">
    deny from all
</Directory>

<Directory "/var/www/wiki/lib">
    deny from all
</Directory>

TWiki comes with a configure script that you run to set up TWiki. This script is used not only on initial install but also when you want to enable plugins later. At this point, you are ready to configure TWiki, so point your browser to your TWiki configure script, http://www.example.com/wiki/bin/configure. You might be particularly interested in the Security section, but we will visit this shortly. Until you have registered your first user, you should leave all settings as they are. If the configure script gives any warnings or errors, you should fix those first and re-run the script. Once you click Next, you are prompted to enter a password. This password is used whenever the configure script is run in the future to help ensure no improper access.

Once you have completed the configuration successfully, it is time to enter the wiki. Point your browser to http://www.example.com/wiki/bin/view, and you are presented with the Main web. In the middle of the page is a link for registration. Register yourself as a user. Be sure to provide a valid e-mail address as the software uses it to validate your account. Once you have verified your user account, you need to add yourself to the TWikiAdminGroup. Return to the Main web and click on the Groups link at the left, and then choose the TWikiAdminGroup. Edit this page, and change the GROUP variable to include your new user name:

   Set GROUP = %MAINWEB%.TiLeggett
   Set ALLOWTOPICCHANGE = %MAINWEB%.TWikiAdminGroup

(The three blank spaces at the beginning of each of these two lines are critical.)

These two lines add your user to the TWikiAdminGroup and allow only members of the TWikiAdminGroup to modify the group. We are now ready to enable authentication for our wiki, so go back to http://www.example.com/wiki/bin/configure. Several options provided under the Security section are useful. You should make sure the options {UseClientSessions} and {Sessions}{UseIPMatching} are enabled. Also set the {LoginManager} option to TWiki::Client::TemplateLogin and {PasswordManager} to TWiki::Users::HtPasswdUser. If your server supports it, you should set {HtPasswd}{Encoding} to sha1. Save your changes and return to the wiki. If you are not logged in automatically, there is a link at the top left of the page that allows you to do so.
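
Behind the scenes, configure saves those choices into lib/LocalSite.cfg. A sketch of roughly what the stored entries look like, using the option names listed above (exact formatting may vary by release):

$TWiki::cfg{UseClientSessions} = 1;
$TWiki::cfg{Sessions}{UseIPMatching} = 1;
$TWiki::cfg{LoginManager} = 'TWiki::Client::TemplateLogin';
$TWiki::cfg{PasswordManager} = 'TWiki::Users::HtPasswdUser';
$TWiki::cfg{HtPasswd}{Encoding} = 'sha1';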

Now that you have authentication working, you may want to tighten down your wiki so that unauthorized people do not turn your documentation repository into an illicit data repository. TWiki has a pretty sophisticated authorization system that is tiered from the site-wide preferences all the way down to a specific topic. Before locking down the Main web, a few more tasks need to be done. Once only certain users can change the Main web, registering new users will fail. That is because part of the user registration process involves creating a topic for that user under the Main web. Dakar has a user, TWikiRegistrationAgent, that is used to do this. From the Main web, use the Jump box at the top left to jump to the WebPreferences topic.

Edit the topic to include the following four lines and save your changes:

   Set ALLOWTOPICRENAME = %MAINWEB%.TWikiAdminGroup
   Set ALLOWTOPICCHANGE = %MAINWEB%.TWikiAdminGroup
   Set ALLOWWEBRENAME = %MAINWEB%.TWikiAdminGroup
   Set ALLOWWEBCHANGE = %MAINWEB%.TWikiAdminGroup, %MAINWEB%.TWikiRegistrationAgent

This allows only members of the TWikiAdminGroup to make changes or rename the Main web or update the Main web’s preferences. It also allows the TWikiRegistrationAgent user to create new users’ home topics when new users register.

Once you have verified that the Main web is locked down, you should do the same for the TWiki and Sandbox webs.

When you are done configuring TWiki, you should secure the files’ permissions:

# find /var/www/wiki/ -type d -exec chmod 0755 {} ';'
# find /var/www/wiki/ -type f -exec chmod 0400 {} ';'
# find /var/www/wiki/pub/ -type f -exec chmod 0600 {} ';'
# find /var/www/wiki/data/ -type f -exec chmod 0600 {} ';'
# find /var/www/wiki/lib/LocalSite.cfg -exec chmod 0600 {} ';'
# find /var/www/wiki/bin/ -type f -exec chmod 0700 {} ';'
# chown -R apache /var/www/wiki/*

As I mentioned before, TWiki has a plugin system that you can use. Many plugins are available from the TWiki Web site. Be sure the plugins you choose have been updated for Dakar before you use them.

Keeping Your Users in the Know
One important aspect of system administration that is sometimes overlooked is keeping users informed. Most users like to know when new functionality is available or when resources are down or unavailable. Keeping users informed not only makes them happier, it can also make your life easier. The last thing you want to do when the central file server is down is reply to users’ questions about why they cannot get to their files. If you have trained your users to look at a central location for the status of the infrastructure, all you have to do after being notified of a problem is post to that central place that there is a problem. Mailing lists also are good for this, but what if the mail server is down? Some people, for instance your boss or the VP of the company, might like to know the status of things as they happen. These updates might not be suitable to send to everyone daily via e-mail. You could create yet another mailing list for these notifications, but you also might consider a blog.

If you are not familiar with a blog, let us refer back to Wikipedia: “a blog is a Web site in which journal entries are posted on a regular basis and displayed in reverse chronological order.”

The notion of a blog has been around for centuries in the form of diaries, but blogs have recently exploded in popularity on the Internet. Many times a blog is started as someone’s personal journal or as a way to report news, but blogs can be extremely useful for the sysadmin as well.

Blogs can help a sysadmin give users an up-to-the-minute status of what they are doing and what the state of the infrastructure is. If you faithfully update your blog, you easily can look back on what you have accomplished so you can make your case for that raise you have been hoping for. It also will help you keep track of what your coworkers are doing. And, with many blog software packages providing RSS feeds, users can subscribe to the blog and be notified when there are new posts.

WordPress
There are a lot of blog software packages out there today, but here we cover WordPress. WordPress is fast and has a nice plugin and skin interface to allow you to customize it to your heart’s content. The only requirements for running WordPress are Apache, MySQL and PHP. I don’t go into how to install WordPress, because the on-line documentation is very clear and easy to follow. Instead, I start where the installation leaves off and introduce some useful plugins. I suggest starting with WordPress v1.5.2 even though v2.0 is currently out. There have been some problems with the initial 2.0 release that warrant waiting for v2.0.1. Also, many of the plugins have not had a chance to update to the new system.

The first thing you should do after installing WordPress is log in as the admin user. Once logged in, you are presented with the Dashboard. At the top of the page is a menu of options named Write, Manage, Links and so on. You should first create an account for yourself by clicking on the Users option. Once that has loaded, two tabs labeled Your Profile and Authors & Users are available under the main menu. Click on Authors & Users, and scroll down to the Add New User section and fill in the text fields. Once your user has been added, it appears in the Registered Users section above. There are several columns of data, and one is Promote, which you should click on. Promoting a user makes that user an author and also allows that user to have more privileges based on its level. Once your user has been promoted, it will have a level of one. There are plus and minus signs on either side of the level to use to increase your user’s level. Increase it to nine, which is the highest level a non-admin user can be. Should you ever need to delete users that have been promoted to authors, all you need to do is decrease their level below one and then delete them. I have included a link to a more in-depth description of the privileges of each user level in the on-line Resources.

There are a few other options you might consider changing. In General Options, there are check boxes to allow anyone to register to become a blog user and to require users to be logged in to add comments. You may or may not want these options enabled, depending on your security concerns and the openness of your blog. At our site, users cannot register themselves, though anyone can post comments without being logged in. You should explore all the menus and all their options to tweak them for your site.

WordPress Plugins
WordPress has a very modular plugin system, and a lot of people have written many plugins. WordPress also has a notion of categories. Categories can have many uses, but one might be to create mini-blogs for different communities of users or to group posts about a specific aspect of the infrastructure. But, you might not want all users to be able to see every category. The Userextra plugin, in conjunction with the Usermeta plugin, allows you to control exactly this sort of thing. Once you have followed these plugins’ installation instructions, two more menus are available under Options and one more under Manage that allow you to refine access.

Another plugin you may find useful is the HTTP Authentication plugin. This plugin lets you use an external authentication mechanism, such as Apache’s BasicAuth, as a means to authenticate to WordPress. This is great if you already have an LDAP directory or Kerberos realm that you use for authentication and you have mod_auth_ldap or mod_auth_kerb up and running.
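
A sketch of the Apache side of that setup with plain htpasswd BasicAuth (the path, realm and user file are hypothetical; the plugin simply trusts whatever user name Apache has already authenticated):

# Protect the WordPress login so Apache performs the authentication;
# the HTTP Authentication plugin then picks up the authenticated user.
<Location "/blog/wp-login.php">
    AuthType Basic
    AuthName "Staff Blog"
    AuthUserFile /etc/httpd/htpasswd
    Require valid-user
</Location>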

Many more plugins are available for WordPress from the WordPress Codex and the WordPress Plugin DB. If you feel some functionality is missing, there are plenty of examples and documentation available from the WordPress Web site, and these plugin repositories can help you write your own plugin.

Wrapping Up
I hope that after this whirlwind tour of wikis and blogs you have come to see how they can be beneficial to help your shop run a smoother ship and provide your users with all the information they might want. Just as there are many different sails to keep your ship sailing, there are many different wiki and blog software packages out there. The right package for you is the one that keeps your users happy and you productive.

Resources for this article: www.linuxjournal.com/article/8832.


10 things that will change the way we live

February 22, 2006

10 major changes that will probably occur within our lifetimes. Ponder this; I’m sure the VCs will.

Forbes.com – Magazine Article

10 Things That Will Change The Way We Live

Fuel Cells

In fuel cells, the energy of a reaction between a fuel, such as liquid hydrogen, and an oxidant, such as liquid oxygen, is converted into electrical energy. Fuel cells will change the global economy, and not just because they will be as big a development in motoring as the internal-combustion engine was. They will also be used as cell-phone batteries and power generators, among other things. And they will eliminate the problem of what to do with used batteries: Theoretically, fuel cells are renewable forever.

Gene Therapy

Although the FDA has not approved any human gene therapy for sale, the potential for using it to correct defective genes responsible for disease development is enormous. Gene therapy works by inserting genes into cell tissue, essentially replacing a defective gene with one that works. So far, researchers have been exploring how gene therapy could be used to combat or eradicate diseases caused by single-gene defects, such as cystic fibrosis, hemophilia, muscular dystrophy and sickle cell anemia. With time, however, it is hoped that it will not only revolutionize the treatment of all disease but will also be able to prevent hereditary diseases, such as Down syndrome and heart disease.

Haptics

Whether people know it or not, haptics has been subtly making inroads into everyday life in the form of vibrating phones, gaming controllers and force-feedback control knobs in cars (BMW‘s iDrive system uses the technology). But the science of haptics has the potential to do much more. Products, such as the CyberForce “whole-hand force feedback system” from Immersion Corporation and SenseAble Technologies, let users interact physically with virtual objects. For instance, by using a sensor-equipped glove and a force-reflecting exoskeleton, you could literally feel the shape, texture and weight of an onscreen 3-D object. Such devices are used now for virtual modeling, medicine and the military, but as costs decrease, haptic interfaces could become valuable communication tools. Using haptics technology, people will be able to shake hands virtually over the Internet, and doctors will have the ability to remotely diagnose and operate on patients.

Internet2

Internet2, or UCAID (University Corporation for Advanced Internet Development), is the next-generation Internet. It is a nonprofit consortium developed by many of the leading universities in the U.S., as well as by companies such as Cisco, Intel and Comcast, in 1996, to deliver video and data at much faster speeds than are possible over the public Internet. The reason is that it is connected to the Abilene national backbone–provided by Qwest Communications–by regional fiber networks, which will soon have a capacity of 10 gigabits per second through the use of optical-networking technologies. This will allow for faster downloads of more complex packets of data and facilitate activities such as peer-to-peer applications, high-definition videoconferencing and, yes, gaming.

LifeStraw

What’s the most precious liquid on earth? If you said oil, you’re wrong. It’s water. Even though more than 70% of the earth’s surface is covered in H2O, many parts of the world suffer from a persistent and crippling shortage of potable drinking water. LifeStraw hopes to change all that. The 10-inch-long, 1-inch-in-diameter device is made by Vestergaard Frandsen S.A. of Lausanne, Switzerland, out of a patented resin that kills bacteria on contact. Its filters remove bacteria, such as salmonella and staphylococcus, from surface water in rivers and lakes. Reusable and, at $3 to $4 each, affordable, it has the potential not only to reduce the outbreak of disease but also to improve living standards and sanitation in many of the world’s poorest regions.

MRAM

MRAM, or Magnetoresistive Random Access Memory, could change the way we work. Researchers at IBM have shown that MRAM can be six times faster than the current industry-standard memory, dynamic RAM (DRAM). It is almost as fast as static RAM (SRAM) and is much faster and suffers less degradation over time than Flash memory. Unlike these technologies, MRAM uses magnetism instead of electrical charges to store data. As a result, it is lower in density and in cost. In December 2005, Sony engineers verified operation of a spin-torque-transfer MRAM in the lab with data-write speeds of two nanoseconds. If adopted as a universal standard, MRAM could have significant military communications applications.

$100 Laptop

If we are to accept that the world economy is now fully dependent upon the information economy, then it stands to reason that those people who are left out of the global information network are doomed to an endless cycle of poverty. The Massachusetts Institute of Technology Media Lab has designed a fully functional laptop computer that can be sold for $100, so that children in poor or developing nations can get access to the Internet. To keep costs down, the laptop will use a $35 dual-mode display (the kind found on cheap DVD players), a 500-megahertz processor, a slimmed-down operating system and only one gigabyte of storage. Users will be able to plug it into a wall outlet or charge it with a crank-driven battery, and it will connect to the Internet via a wireless card. To be sure, these laptops are not going to be playing Quake 4 anytime soon, but they could give disadvantaged kids a shot at taking part in the digital community. MIT hopes to have a working prototype by November 2005 and production units shipping to government education ministries by the end of 2006.

$200 Barrel Of Oil

It’s not an invention, but it will have a dramatic effect on the way everyone lives. Although the predictions range from terrifying to calming, all experts agree that a dramatic rise in the cost of fossil fuel would have a devastating impact not only on the global economy but on global society as well.

VoIP

Voice-over-Internet Protocol lets people make telephone calls over the Internet or any other IP-based network. Because the voice data flows over a general-purpose packet-switched network, instead of dedicated, circuit-switched voice transmission lines, the cost of making telephone calls for both business and residential users is much less than with traditional telcos. The reason it is so cheap is that the high-speed Internet providers essentially bundle VoIP free with Internet access. Another advantage is that it is mobile: All one needs is an Internet connection to make a phone call from anywhere. But there are a few drawbacks–although these are being smoothed over–such as quality and reliability.

WiMAX

WiMAX stands for Worldwide Interoperability for Microwave Access, a long-range, standards-based wireless technology that will effectively allow people to access their phones, computers and the Internet from virtually anywhere. No more need to wait for the cable or phone company to install the “last mile” of pipe to your home. The IEEE 802.16 broadband wireless access standard provides up to 31 miles of linear service area range and allows for connectivity between users without a direct line of sight. This is significant for several reasons: First, it will increase the ease and frequency with which people make wireless connections for work or leisure; second, it will have enormous potential applications in underdeveloped countries–as well as rural areas of the First World–which lack adequate communications infrastructure; and third, no more messy wires.

Technorati Tags: technology, emerging, evolution, futurism


I’d like to place a hold on Library 2.0

December 13, 2005

I was pondering how the face of the library will change over the next few years and I ran into this interesting article. I have cut out the explanation of each trend for brevity; you can find the entire article here.

In 1519 Leonardo da Vinci died and left behind one of the world’s largest collections of art, comprising well over 5,000 drawings, sketches, and paintings, the vast majority of which the general public would not become aware of until over 400 years later.
The largest portion of this collection was left in the hands of Francesco Melzi, a trusted assistant and favorite student of Leonardo. Sixty years later, when Melzi died in 1579, the collection began a lengthy, and often destructive, journey.
In 1630 a sculptor at the court of the King of Spain by the name of Pompeo Leoni began a very sloppy process of rearranging the collections, sorting the artistic drawings from the technical ones with scientific notations. He split up the original manuscripts, cut and pasted pages and created two separate collections. Some pieces were lost.
In 1637 the collections were donated to Biblioteca Ambrosiana, the library in Milan, where they remained until 1796, when Napoleon Bonaparte ordered the manuscripts to be transferred to Paris. Much of the collection “disappeared” for the next 170 years until it was rediscovered in 1966 in the archives of the National Library of Madrid.
Libraries played a significant role in the preservation of the da Vinci collection and we often wonder about other brilliant people in history who didn’t have libraries to preserve their work. Some we will never know about.
Archive of Information
Throughout history the role of the library was to serve as a storehouse, an archive of manuscripts, art, and important documents. The library was the center of information revered by most because each contained the foundational building blocks of information for all humanity.
In medieval times, books were valuable possessions far too expensive for most people to own. As a result, libraries often turned into collections of lecterns with books chained to them.
In 1455 Johann Gutenberg unveiled his printing press to the world by printing copies of the Gutenberg Bible. Later Gutenberg had his printing press repossessed by Johann Fust, the man who had financed his work for the previous 10 years. The sons of Johann Fust were largely responsible for a printing revolution that saw over 500,000 books put into circulation before 1500.
A huge turning point in the evolution of libraries was architected by Andrew Carnegie. Between 1883 and 1929 he provided funding for 2,509 libraries, of which 1,689 were built in the US.
Leading up to today, libraries have consisted of large collections of books and other materials, primarily funded and maintained by cities or other institutions. Collections are often used by people who choose not to, or cannot afford to, purchase books for themselves.
But that definition is changing.
Beginning the Transition
We have transitioned from a time when information was scarce and precious to today, when information is vast, readily available and, in many cases, free.
People who in the past visited libraries to find specific pieces of information are now able to find that information online. The vast majority of people with specific information needs no longer visit libraries. However, others, such as those who read for pleasure, still regularly patronize their local library.
Setting the Stage
We have put together ten key trends that are affecting the development of the next generation library. Rest assured that these are not the only trends, but ones that have been selected to give clear insight into the rapidly changing technologies and equally fast changing mindset of library patrons.
Trend #1 – Communication systems are continually changing the way people access information
Trend #2 – All technology ends. All technologies commonly used today will be replaced by something new.
Trend #3 – We haven’t yet reached the ultimate small particle for storage. But soon.
Trend #4 – Search Technology will become increasingly more complicated
Trend #5 – Time compression is changing the lifestyle of library patrons
Trend #6 – Over time we will be transitioning to a verbal society
Trend #7 – The demand for global information is growing exponentially
Trend #8 – The Stage is being set for a new era of Global Systems
Trend #9 – We are transitioning from a product-based economy to an experience based economy
Trend #10 – Libraries will transition from a center of information to a center of culture
Recommendations for Libraries
Libraries are in a unique position. Since most people have fond memories of their times growing up in libraries, and there are no real “library hater” organizations, most libraries have the luxury of time to reinvent themselves.
The role of a library within a community is changing. The way people interact with the library and the services it offers is also changing. For this reason we have put together a series of recommendations that will allow libraries to arrive at their own best solutions.
1) Evaluate the library experience. Begin the process of testing patrons’ opinions, ideas, and thoughts, and figure out how to get at the heart of the things that matter most in your community. Survey both the community at large and the people who walk through the library doors.
2) Embrace new information technologies. New tech products are being introduced on a daily basis, and the vast majority of people are totally lost when it comes to deciding what to use and what to stay away from. Since no organization has stepped up to take the lead in helping the general public understand the new tech, it becomes a perfect opportunity for libraries. Libraries need to become a resource for, as well as experts in, each of the new technologies.
a. Create a technology advisory board and stay in close communication with them.
b. Recruit tech savvy members of the community to hold monthly discussion panels where the community at large is invited to join in the discussions.
c. Develop a guest lecture series on the new technologies.
3) Preserve the memories of your own communities. While most libraries have become the document archive of their community, the memories of a community span much more than just documents. What did it sound like to drive down Main Street in 1950? What did it smell like to walk into Joe’s Bakery in the early mornings of 1965? Who are the people in these community photos and why were they important? Memories come in many shapes and forms. Don’t let yours disappear.
4) Experiment with creative spaces so the future role of the library can define itself. Since the role of the library 20 years from now is still a mystery, we recommend that libraries put together creative spaces so staff members, library users, and the community at large can experiment and determine what ideas are drawing attention and getting traction. Some possible uses for these creative spaces include:
a. Band practice rooms
b. Podcasting stations
c. Blogger stations
d. Art studios
e. Recording studios
f. Video studios
g. Imagination rooms
h. Theater-drama practice rooms
We have come a long way from the time of da Vinci and the time when books were chained to lecterns. But we’ve only scratched the surface of many more changes to come. Writing the definitive history of modern libraries is a work in progress. Our best advice is to enjoy the journey and relish the wonderment of what tomorrow may bring.

The DaVinci Institute – The Future of Libraries

I cannot agree more with many of these points, especially Trend #10 – Libraries will transition from a center of information to a center of culture. Smart libraries will get involved in creating remote patron resources from mashups of leading online services that integrate and add value to the user experience. Here is a starting list of some services the public is already using by logging in to each site individually. The mashup field is ready to plant, and Yahoo! is busy with Flickr, del.icio.us, and Movable Type; let’s hope our libraries catch on and can even help integrate some premium databases.

The Best Web 2.0 Software of 2005

Category: Social Bookmarking

Best Offering: del.icio.us

del.icio.us

Description: Just acquired by Yahoo!, which already has a social bookmarking service called My Web 2.0, the exact future of this seminal bookmarking site is now a little up in the air. But del.icio.us remains the best, largest, fastest, and most elegant social bookmarking service on the Web. In fact, del.icio.us is the benchmark that all others use. And because del.icio.us appears to take the Web 2.0 ideas pretty seriously, it provides a nice API for others to build new services on top of. As a consequence of this, and because social bookmarking sites make everyone’s data public, witness the amazing array of add-on services (or if you have 15 minutes to spare, look here) that mash up or otherwise reuse del.icio.us functionality and content. If you want access to your bookmarks anywhere you go, along with engaging and satisfying functionality, this is your first stop. I personally can’t live without my tag cloud of del.icio.us bookmarks.
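
As a taste of that API, here is a sketch of two calls against the del.icio.us v1 HTTP interface as it was documented at the time (credentials and tag are placeholders; treat the exact endpoints as illustrative):

# List your recent bookmarks carrying a given tag (HTTP Basic auth)
curl -u username:password "https://api.del.icio.us/v1/posts/recent?tag=web2.0&count=10"

# Add a new bookmark
curl -u username:password "https://api.del.icio.us/v1/posts/add?url=http://example.com/&description=An+example"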

Runners-Up:


Category: Web 2.0 Start Pages

Best Offering: Netvibes

Description: There are a rapidly growing number of Ajax start pages that allow your favorite content to be displayed, rearranged, and viewed dynamically whenever you want. But if the traffic to this blog is any indication (though possibly it isn’t), Netvibes is far and away the most popular one. Available in multiple languages, sporting new integration with Writely, and offering an extremely slick and well-designed interface that provides some of the best DHTML-powered drag-and-drop organization, Netvibes has no major vendor backing, yet it has captured mindshare out of pure excellence. While many of the major Web companies like Microsoft and Google are offering competing products, none of them are yet very good.

Runners-Up:


Category: Online To Do Lists

Best Offering: Voo2do

Description: Ever more of the software we use on a daily basis is moving online, from e-mail to feed readers. To-do list managers are no exception. I’ve used a variety of them, and so far the one that has resonated with me most is Voo2do. A one-person operation run by Shimon Rura, Voo2do uses Ajax sparingly but very effectively to let you create and manage multiple to-do lists. With an API available for you to access or export your data with your own programs, and support for Joel Spolsky’s Painless Software Scheduling method, Voo2do is the embodiment of simple, satisfying software.

Runners-Up:


Category: Peer Production News

Best Offering: digg

Description: While not packed with Ajax, digg frankly doesn’t lack for it. And of course, Ajax is only one of many optional ingredients on the Web 2.0 checklist. The important Web 2.0 capability digg provides is that it successfully harnesses collective intelligence. All news items listed in digg are supplied by its users, who then exert editorial control by clicking on the digg button for each story they like. The home page lists the most popular current stories, all selected by its registered users. And digg’s RSS feed has to be one of the most popular on the Web. Digg has been so successful that Wired magazine has even speculated it could bury Slashdot, which also allows users to submit stories, but doesn’t let them see what stories were submitted or vote on them.

Runners-Up:


Category: Image Storage and Sharing

Best Offering: Flickr

Description: Also acquired by Yahoo! earlier this year, Flickr is the canonical photo/image sharing site par excellence. Sprinkled with just enough Ajax to reduce page loads and make tasks easy, Flickr provides an open API, prepackaged licensing models for your photos, tagging, a variety of community involvement mechanisms, and a vast collection of add-ons and mashups. There are other sites, but none of them compare yet. Flickr is one of the Web 2.0 poster children, and for good reason.

Runners-Up:


Category: 3rd Party Online File Storage

Best Offering: Openomy

Description: As more and more software moves to the Web, having a secure place for your Web-based software to store files such as documents, media, and other data will become essential. There is a burgeoning group of online file storage services, and Openomy is one that I’ve been watching for a while. With 1GB of free file storage and an open API for programmatic access to your tag-based Openomy file system, you have the raw ingredients for secure online storage of your documents wherever you go. There is even a Ruby binding for the API. Expect lots of growth in this space going forward, especially as other Web 2.0 applications allow you to plug in to your online storage service of choice and the desire grows to offload personal data backup to professionals.

Runners-Up:


Category: Blog Filters

Best Offering: Memeorandum.com

Description: Gabe Rivera’s Memeorandum service is a relevance engine that unblinkingly monitors the activity in the blogosphere and appears to point out the most important posts of the day with a deftness that is remarkable. The growing attention scarcity caused by the rivers of information we’re being subjected to in the modern world needs tools that effectively help us cope with it. Blog filters are just one key example of what the future holds for us. Memeorandum covers both the political and technology blogospheres, and hopefully others in the future. There are other blog and news filters out there, but none compare in terms of simplicity, elegance, and satisfying results.

Runners-Up:


Category: Grassroots Use of Web 2.0

Best Offering: Katrina List Network

Description: I covered Katrinalist.net in a detailed blog post a while back, but it remains one of the best examples of grassroots Web 2.0. Katrinalist was an emergent phenomenon that triggered the peer production of vital information in the aftermath of this year’s hurricane disaster in New Orleans. In just a handful of days participants created XML data formats, engineered data aggregation from RSS feeds, and harnessed volunteer efforts on-the-fly to compile survivor data from all over the Web. This led to tens of thousands of survivor reports being aggregated into a single database so that people could easily identify and locate survivors from the Katrinalist Web site. All this despite the fact that the information was distributed in unstructured formats from all over the Web with no prior intent of reuse. A hearty thanks again to David Geilhufe for helping make Katrinalist happen.

Runners-Up:
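
The aggregation pattern itself is simple enough to sketch. The snippet below is my own illustration, not the volunteers' code: the feed URLs are placeholders, and the XML formats they actually defined are not reproduced here. It pulls entries from several RSS feeds into one deduplicated SQLite table using the third-party feedparser library.

```python
import sqlite3
import feedparser   # third-party: pip install feedparser

FEEDS = [
    "http://example.org/shelter-reports.rss",   # placeholder sources
    "http://example.org/forum-postings.rss",
]

db = sqlite3.connect("survivors.db")
db.execute("""CREATE TABLE IF NOT EXISTS reports
              (link TEXT PRIMARY KEY, name TEXT, details TEXT)""")

for url in FEEDS:
    for entry in feedparser.parse(url).entries:
        # INSERT OR IGNORE collapses reports that appear in several feeds.
        db.execute("INSERT OR IGNORE INTO reports VALUES (?, ?, ?)",
                   (entry.get("link", ""), entry.get("title", ""),
                    entry.get("summary", "")))
db.commit()
```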


Category: Web-Based Word Processing

Best Offering: Writely

Description: Easy to set up, fast, free (in beta), and familiar to anyone with even a passing familiarity with MS Word, Writely.com is an effective and easy-to-use online word processor. With its WYSIWYG editor, users can change font and font size, spell-check, and insert images (up to 2 MB). It also offers tagging and version control, both excellent features for any word processor. It is a very useful word processing tool, especially for those who can't afford to buy MS Office. In addition to being a word processor, Writely.com also serves as a collaboration tool: users invite others to collaborate on a document via email. It can also help a user blog and publish. Built with an Ajax user interface, it makes the most of the new capabilities of Web 2.0. It ends, once and for all, any uncertainty about whether productivity tools can and should live online. Writely is the best out there, but just by a nose; the others are very close runners-up.

Runners-Up:


Category: Online Calendars

Best Offering: CalendarHub

Description: Online calendaring is a rapidly growing product category in the Web 2.0 software arena. The fact is that a lack of good, shareable electronic calendars is still a real problem. I'm fond of saying that the software world has vast collections of synchronization utilities and integration capabilities, yet it's incredible that we still can't routinely do simple things like keep our personal, family, and work calendars synchronized. CalendarHub is the best online calendar I've seen so far, with Kiko a close second.

Runners-Up:
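
As a toy illustration of the synchronization gripe above (my own sketch, not CalendarHub's implementation; real tools would speak iCalendar or CalDAV), merging a few calendars into one deduplicated view can be as simple as:

```python
from datetime import datetime

work     = [("2005-12-01 09:00", "Status meeting")]
family   = [("2005-12-01 18:30", "Dinner with parents")]
personal = [("2005-12-01 09:00", "Status meeting"),   # entered twice
            ("2005-12-02 07:00", "Gym")]

def merge(*calendars):
    seen, merged = set(), []
    for calendar in calendars:
        for when, what in calendar:
            if (when, what) not in seen:   # drop exact duplicates
                seen.add((when, what))
                merged.append((datetime.strptime(when, "%Y-%m-%d %H:%M"),
                               what))
    return sorted(merged)

for when, what in merge(work, family, personal):
    print(when, "-", what)
```

The hard part in practice, of course, is recognizing that two differently worded entries are the same event, which is exactly why this category is still wide open.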


Category: Project Management & Team Collaboration

Best Offering: Basecamp

Description: Web 2.0 has terrific social collaboration models for two-way information exchange, like blogs and wikis; open enrichment mechanisms like tagging, ranking, and popularity; and organizing techniques like folksonomies. All of these provide a great backdrop for team collaboration and project management. Surprisingly, there aren't many terrific Web 2.0 project management tools. Part of this is because project management tends to be very specific to the type of project. Fortunately for Web 2.0 companies, this means there isn't a lot of competition from traditional software companies like Microsoft and Primavera, which churn out somewhat mediocre products in the shrinkwrapped software space. This is why 37signals' Basecamp is such a pleasant surprise. It's an excellent team-based project management tool that continues to delight me the more I use it.

Runners-Up:

The Story Continues, However, As It Must!

No one person could accurately list the best Web 2.0 software of 2005. This is the wisdom of crowds bit of Web 2.0. In order to complete this list, I’ll need your help. Please contribute your selections below. Keep in mind that I haven’t worked with many of the terrific Web 2.0 software applications out there but many of you have. There are whole product categories I’m not covering here and I’m glad to keep extending this post if we get lots of feedback. Tell me about social spreadsheets, Web 2.0 project management tools, video versions of Flickr, additional grassroots Web 2.0 events, and whatever else you know of.

The Best Web 2.0 Software of 2005 (web2.wsj2.com)


Technology begins by mimicry

November 13, 2005

I am halfway through The Singularity Is Near by Ray Kurzweil. He talks about the Six Epochs of Evolution:

  • Epoch 1: Physics and chemistry (Information in atomic structures)
  • Epoch 2: Biology (Information as DNA)
  • Epoch 3: Brains (Information as neural patterns)
  • Epoch 4: Technology (Information as hardware and software designs)
  • Epoch 5: Merger of Technology and Human Intelligence (Methods of biology integrated into the human technology base)
  • Epoch 6: The Universe Wakes Up (Patterns of matter and energy become saturated with intelligent processes and knowledge)

Human development begins via mimicry as the infant's brain is being wired (prerational but exponential growth). If this mimicry step in biology is a generic step in evolution, then Ken Thompson is looking in the right direction, and Ray's prediction of impending exponential growth bears attention.

The four action zones are to a bioteam what the four chemical bases (A, T, G and C) are to DNA – their interdependencies and constantly repeating patterns providing the building blocks of the famous double helix structure common to all living things.

[Image: one-slide summary of the 12 bioteam rules across the four action zones]


Leadership Zone: Bioteams treat every team member as a leader

RULE1: Stop Controlling
Communicate information not orders

Traditional teams issue orders and mostly use 2-way communications.

Bioteams provide 'situational information' to the team members, who are trained to judge for themselves what they should do in the best interests of the team. They move exceptionally fast because they mostly use 1-way broadcast communications and only use 2-way communications where it's really needed.
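
In software terms this is a one-way publish/subscribe pattern rather than request/response. A minimal sketch, with invented names, of what "broadcast information, don't issue orders" looks like:

```python
class Bioteam:
    def __init__(self):
        self.members = []

    def join(self, react):
        self.members.append(react)

    def broadcast(self, situation):
        # One-way: situational information goes out; no orders or
        # acknowledgements come back. Each member decides what to do.
        for react in self.members:
            react(situation)

team = Bioteam()
team.join(lambda s: print("Alice, reacting to:", s))
team.join(lambda s: print("Bob, reacting to:", s))
team.broadcast("competitor shipped early")
```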


RULE2: Team Intelligence
Mobilise everyone to look for and manage team threats and opportunities

In traditional teams it's the leader's job to provide most of the "Team Intelligence": information on potential threats to, or opportunities for, the team.

In a Bioteam it is every team member's responsibility to constantly look out for this team intelligence and to make sure it is quickly and effectively communicated to all the other team members who might need to know about it.

RULE3: Permission Granted
Achieve accountability through transparency not permission

Traditional teams protect themselves against member mistakes through establishing layers of permission which must be granted before a team member may take action in certain circumstances. I call these "Permission Structures".

Bioteams have slimmed down these "Permission Structures" to the absolute bare minimum necessary to protect against only the most serious potential mistakes, those which would threaten the team's mission.

Accountability in bioteams is achieved through team ‘transparency systems’ not control systems.


Connectivity Zone: Bioteams connect the team members, partners and networks

RULE4: Always On
Provide 24/7 instant "in-situ" message hotlines for all team members

Traditional teams expect their members to go somewhere, such as their PC, to “get their messages”.

Bioteams ‘take the messages to the team members’ via whichever device suits each individual member best at any particular time in their working day.

RULE5: Symbiosis
Treat external partners as fully trusted team members

Traditional teams pay ‘lip-service’ to team members from external organisations such as customers or suppliers in terms of transparency and trust.

Bioteams pick their partners very carefully but once they have committed to them they treat them identically to their own internal team members in terms of granting them full transparency and trust.

RULE6: Cluster
Nurture the team’s internal and external networks and connections

Traditional Teams don’t think about their networks – they believe it’s the team leader’s job alone to make sure they have all the resources they need.

Bioteams pay a lot of attention to the collective networks and relationships of each team member. This is to ensure they have adequate "strong ties" to get the work done well and be able to 'call in short-notice favours' as needed from external parties.

They also ensure they have sufficient "weak ties" to quickly spot important changes and warning signals from the external market/operational environment.

Execution Zone: Bioteams experiment, co-operate and learn

RULE7: Swarm
Develop consistent autonomous team member behaviours

Traditional teams focus on team member individuality as a means of achieving creativity and innovation but neglect the hidden power of consistent member behaviours.

Bioteams have discovered that for a team to be really effective they must first put in place the foundations by ensuring there are a basic core set of team member behaviours which can be guaranteed to be executed consistently by all members at all times.

Bioteam members take an interest in anything which might affect the ultimate success of their project, whether it's within their defined project role or not.

RULE8: Tit-for-Tat
Teach team members effective biological personal co-operation strategies

Traditional teams try to play "Win-Win" and "Collaborate" but don't actually have any practical strategies or tactics for achieving this.

Traditional teams are not really interested in the real, often raw, basic and undeclared, motivations of each team member.

Bioteams realise that "Win-Win" is an outcome, not a strategy, and use proven personal collaboration strategies to create the conditions for it to happen.

Bioteam personal collaboration strategies also address the “what’s in it for me” question for each team member.
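
The rule's name comes straight from game theory: in Robert Axelrod's iterated prisoner's dilemma tournaments, the simple tit-for-tat strategy (cooperate first, then mirror the partner's previous move) proved remarkably robust. A minimal sketch of that strategy:

```python
def tit_for_tat(their_history):
    # Cooperate on the first move, then mirror the partner's last move.
    return "C" if not their_history else their_history[-1]

def always_defect(their_history):
    return "D"

# Standard prisoner's dilemma payoffs: (my points, their points).
PAYOFF = {("C", "C"): (3, 3), ("C", "D"): (0, 5),
          ("D", "C"): (5, 0), ("D", "D"): (1, 1)}

def play(strategy_a, strategy_b, rounds=10):
    history_a, history_b, score_a, score_b = [], [], 0, 0
    for _ in range(rounds):
        move_a = strategy_a(history_b)   # each sees the other's past moves
        move_b = strategy_b(history_a)
        points_a, points_b = PAYOFF[(move_a, move_b)]
        score_a += points_a
        score_b += points_b
        history_a.append(move_a)
        history_b.append(move_b)
    return score_a, score_b

print(play(tit_for_tat, always_defect))  # retaliation caps the exploitation
```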

RULE9: Genetic Algorithms
Learn through experimentation, mutation and team review

Traditional teams believe that analysis is the main way to get things right. Consequently they engage in extensive planning, design and preparation before trying out new things or releasing new products to their customers.

Bioteams believe that live, controlled experimentation is the only way to get things right and that most things won't work out as planned anyway, no matter how well they are analysed and designed.

Therefore they quickly try out numerous alternative actions in parallel, in small safe ways, to find out what works best, and then they build on and adapt the most promising results.
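
RULE9 borrows its name from a real optimisation technique, and the loop it describes (run many small experiments in parallel, review the results, build on the best) maps directly onto it. A minimal genetic algorithm over bit-strings, as a sketch:

```python
import random

def fitness(candidate):
    return sum(candidate)   # toy objective: maximise the number of ones

def mutate(candidate, rate=0.1):
    # Flip each bit with a small probability: the "experimentation" step.
    return [1 - bit if random.random() < rate else bit for bit in candidate]

# Thirty random 20-bit candidates: parallel experiments, not one grand plan.
population = [[random.randint(0, 1) for _ in range(20)] for _ in range(30)]

for generation in range(50):
    population.sort(key=fitness, reverse=True)   # the "team review"
    survivors = population[:10]                  # keep the most promising
    population = survivors + [mutate(random.choice(survivors))
                              for _ in range(20)]

best = max(population, key=fitness)
print(fitness(best), "of 20 bits set after 50 generations")
```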


Organisation Zone: Bioteams establish sustainable self-organisation

RULE10: Autopoiesis (aka “Self-Organising Networks”)
Define the team in terms of ‘network transformations’ – not outputs

Traditional teams define their goals and roles in terms of the outputs and activities they are expected to produce – i.e. inanimate objects.

Bioteams define their goals and roles in terms of the transformations they intend to make in the people and partners they will engage with – i.e. living things.

RULE11: Porous Membranes
Develop team boundaries which are open to energy but closed to waste

In a traditional team the leader selects the members and the team effectively becomes "sealed" at the pre-ordained "right size" in terms of members very early in its lifecycle. There is a big focus on full-time members as the team "product engines".

In a bioteam the members select the members and recognise that the “right team size” will only emerge over time no matter what the plan says.

So they keep looking for useful new members throughout the whole of their team’s lifetime – particularly for part-time members, advisors, experts, “jungle-guides” and external allies.

RULE12: Emerge
Scale naturally through nature’s universal growth and decay cycles

Traditional teams grow as quickly as they can to the agreed size, as per the agreed project schedule.

Bioteams are aware that growth is not something that can be managed or controlled. The team leaders and members treat the bioteam as a "living thing" in itself and watch for and exploit natural opportunities for its growth.

The secret DNA of high-performing virtual teams – The Bumble Bee

Technorati Tags: , , , , ,