Off the Top: Statistics Entries
Things here are a little quiet as I have been in writing mode as well as pitching new work. I have been blogging work-related items over at Personal InfoCloud, but I am likely only going to be posting summaries of those pieces here from now on, rather than the full posts. I am doing this to concentrate work-related posts on a platform that has commenting available. I am still running my own blogging tool here at vanderwal.net, which I wrote in 2001; I turned off its comments in 2006 after growing tired of dealing with comment spam.
The following were recently posted over at Personal InfoCloud:
SharePoint 2007: Gateway Drug to Enterprise Social Tools
SharePoint 2007: Gateway Drug to Enterprise Social Tools focuses on the many discussions I have had with clients, potential clients, and others from organizations sharing their views of and frustrations with Microsoft SharePoint as a means of bringing solid social software into the workplace. This post has been brewing for about two years and is now finally posted.
Optimizing Tagging UI for People & Search
Optimizing Tagging UI for People and Search focuses on the lessons learned and the usability research others and I have done on the various input interfaces for tagging, particularly tagging with multi-term tags (tags with more than one word). The popular tools have inhibited the adoption of tagging with poor tagging interaction design and poor patterns for entering tags that make sense to the people entering them.
LinkedIn: Social Interaction Design Lessons Learned (not to follow)
I have a two-part post on LinkedIn's social interaction design. LinkedIn: Social Interaction Design Lessons Learned (not to follow) - 1 of 2 looks at what LinkedIn has done well in the past and built on top of. Many people have expressed that the new social interactions on LinkedIn have decreased the value of the service for them.
The second part, LinkedIn: Social Interaction Design Lessons Learned (not to follow) - 2 of 2, looks at the social interaction that has been added to LinkedIn in the last 18 months or so and what lessons those of us who use the service and pay attention to social interaction design have learned. This piece also lists ways forward from what is in place currently.
Lee Rainie of the Pew Internet and American Life Project announced the release of the Pew Internet Project Report on Tagging in America. The report also includes an extensive interview with David Weinberger on the subject of tagging. The most interesting parts of this report are the percentages of people in America who tag (including those who add categories). Based on their survey, which randomly selected and spoke with 2,373 adults, 28% of Americans online have added tags or categories. The survey found 7% of the respondents tag or categorize daily.
I am really happy with the report, as it looks at the numbers from a use perspective. Up to this point I have been using tagging service provider numbers (few are made public) along with Alexa hit reports across many services, taking that total and dividing it by the Nielsen estimate of total people on the web (approximately 750 million). This approach suggested about 0.85% of all people on the web are tagging (it does not include tagging on blogs, as that is more ad hoc categorization, but that is a long post to explain, or a discussion over a beer or two).
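For the curious, that back-of-envelope math can be sketched out. The per-service counts below are made-up placeholders (the real provider numbers are mostly not public); only the roughly 750 million total web users comes from the Nielsen figure above:

```python
# Back-of-envelope estimate of the share of web users who tag.
# The per-service tagger counts are hypothetical placeholders;
# only the ~750 million total web users figure is from the
# Nielsen estimate cited in the text.

estimated_taggers_per_service = {
    "service_a": 2_000_000,   # hypothetical
    "service_b": 1_500_000,   # hypothetical
    "service_c": 2_875_000,   # hypothetical
}

total_web_users = 750_000_000  # Nielsen estimate

total_taggers = sum(estimated_taggers_per_service.values())
share = total_taggers / total_web_users * 100

print(f"{share:.2f}% of web users tagging")  # → 0.85% of web users tagging
```

Swap in real provider numbers and Alexa-derived counts and the same division yields the global percentage.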
The difference between the percentages in the Pew report and the numbers I backed into is that Pew's is just an American view, while mine was looking at things globally. Pew also counts both tags and categories, and many systems have categories. I am really comfortable with the finding that 7% on the web are tagging or categorizing daily, and I will likely use that number in future presentations. The 28% number is really surprising, but as a measure of one-time use it is accurate. This represents a much larger user base than I thought, though it also includes categories along with tagging.
Separating Tagging and Categories
The Pew Report on Tagging combines categories and tagging. While it would be optimal to separate the two, explaining the difference between them to a regular (non-geek) person in America would be difficult. Asking whether somebody has used certain functionality on a service, on one of the 130 or so social bookmarking tools, or on one of the many hundreds of products that include tagging would negatively impact the results. Combining the terms tagging and categories in a research question makes for a question that is more easily answered yes or no.
The Pew Report provides a starting place for future research, which can hopefully delve into the subject with a little more clarity, separating tagging from categories.
Tagging and Race
If one looks at tagging as a means to refind information, and as a way of adding context through a person's own vocabulary and social terminology, then looking at various social groups is a simple way to start validating this (a much better approach is to ascertain why somebody is adding a tag). One simple lens on different social structures is race. The breakdown of who tags by race can support the argument that people who tag are adding missing language terms, if the assumption is made that the content is missing metadata or was provided by somebody not of that race.
The Pew Report indicates the following tagging breakdown by race:
- 26% of White, non-Hispanic respondents
- 36% of Black, non-Hispanic respondents
- 33% of English-speaking Hispanic respondents
These higher rates of tagging among people who are not white seem to support the idea that those whose vocabulary and terminology are not represented will tag to ease their refinding of the information. When things are in familiar terms they are easier to find, and having the ability to tag from one's own context eases refindability. The report does not dive into this, but it is a really good subject for future research.
[I initially posted this at Personal InfoCloud :: Pew Research on Tagging, which has comments open.]
Dave Weinberger points out gross errors InformationWeek made when graphically comparing perceived problems with Windows and Linux. The error is that the Windows graphic uses a scale topping out at 80 percent, while the Linux graphic tops out at 40 percent. When you realize this, the differences in perception become huge.
Microsoft shows nearly 80 percent of those surveyed had concerns about software quality and vulnerabilities, while Linux had less than 25 percent. More than 60 percent felt the cost of ownership is too high with Microsoft, while far less than 5 percent had the same concern with Linux. The perceived problems with Linux revolve around the lack of a complete and fully integrated software environment (40 percent), accountability if problems arise (above 35 percent), and the lack of a clear product road map (35 percent). Each of the Linux perceived problems, once you spend time looking into them, is not really a problem, but more the lack of a company with a large marketing budget. I am hoping that Novell and IBM can really start making headway in this area. The quality of Linux products is far higher than Microsoft's, and for nearly every product that Microsoft pushes there is at least an equal product in the Linux community.
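As a rough illustration of how the mismatched scales distort the comparison, here is a small sketch; the bar-rendering function is my own toy, and the values are the approximate survey numbers quoted above:

```python
def bar(value, axis_max, width=40):
    """Render a text bar sized as value relative to the axis maximum,
    the way a chart sizes bars against its own scale."""
    return "#" * round(value / axis_max * width)

# As printed: Windows concern (~80%) on an 80%-max axis versus
# Linux concern (~25%) on a 40%-max axis -- the Linux bar appears
# more than half as long as the Windows bar.
print(bar(80, 80))  # 40 characters
print(bar(25, 40))  # 25 characters

# On a shared 80%-max axis, the real gap shows:
print(bar(25, 80))  # 12 characters
```

Same numbers, very different visual story, which is exactly the error being pointed out.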
Then again there is Apple too.
Fans of Moneyball will like the Washington Post story on the NBA whiz kid executive. The focus of the article is the San Antonio Spurs' Sam Presti, age 27, who is applying MBA tactics to the NBA. Yes, quantitative analysis to mitigate risk and control cost is behind the NBA version of Moneyball, just as it is in Major League Baseball.
Boxes and Arrows is currently running two wonderful articles. The first is Report Review: Nielsen/Norman Group's Usability Return on Investment by Peter Merholz and Scott Hirsch. The second is Web Traffic Analytics and User Experience by Fran Diamond.
Go read, I will be back shortly.