Keynotes
- Luis von Ahn, Carnegie Mellon University
- Sinan Aral, New York University
- danah boyd, Microsoft Research
- Jon Kleinberg, Cornell University
- Sonia Livingstone, London School of Economics
- Siva Vaidhyanathan, University of Virginia
Invited Panelists
- Ed Chi, Google
- Chris Diehl, Jive
- Christian Posse, LinkedIn
- JP Rangaswami, Salesforce
- Raghu Ramakrishnan, Yahoo Research
- Neel Sundaresan, eBay
Other ACM Conferences
- 2011 in Koblenz, Germany
- 2010 in Raleigh, North Carolina
- 2009 in Athens, Greece
Past Web Science Conferences
- Presentations from Koblenz, Germany
Videos
- Hypertext 2012
Contact
Willem Pieterson
Conference Manager / Treasurer
websci12@gmail.com
+1-(847)-532-0624
Mapping & Clouding: Employing Digital Methods I & II
Two workshops will be organized during the Web Science conference around the topic ‘Mapping and Clouding: Employing Digital Methods’. The first (M&C I) focuses on the Issue Crawler; the second (M&C II) focuses on the Lippmannian Device.
M&C I: Thursday, June 21, half-day workshop (morning); M&C II: Friday, June 22, half-day workshop (morning)
Mapping & Clouding: Employing Digital Methods I
Thursday June 21, Half-day workshop (morning)
The Digital Methods workshop focuses on mapping website networks with the Issue Crawler. The Issue Crawler, online since 2001, is Web network and visualization software that works in a browser. It consists of crawlers, databases, analysis engines, and visualization modules. The software relies on co-link analysis, a scientometric sampling or network demarcation technique based on citation analysis, adapted for the Web.

Enter a set of URLs into the software, and the URLs are crawled, the outlinks are captured, and the co-links are retained, in one, two, or three iterations of the procedure, as selected by the user. (There are also snowball and inter-actor crawling methods built into the software.) The results are analyzed for centrality measures and visualized as a directed graph, showing site inter-linkings (nodes and lines with arrows). The file format of the graph is scalable vector graphics (SVG), which may also be saved in a variety of other formats, including PNG and PDF. The online SVG graphic is interactive: the user can click the URLs behind the nodes and may turn links as well as domains on and off. The purpose of the interactivity is to provide a single, graphical space for users to explore specific inter-linkings between sites (who links to whom, and who does not?) as well as to spend time reading the content of the pages in the network, as an alternative to search engine space (and ranked lists).

The original purpose of the software is ‘issue network’ analysis for social and political theory, and a literature has developed around the software. The Issue Crawler is also employed, methodologically, for dynamic URL sampling, i.e., building out a seed list of URLs for related sites. Among other applications, dynamic URL sampling techniques have been employed in the study of Internet censorship.
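The co-link step described above can be sketched roughly as follows. This is a minimal illustration of the idea (retain only pages that receive links from at least two seeds), not the Issue Crawler's actual implementation; the seed list, crawl data, and `threshold` parameter are hypothetical stand-ins.

```python
from collections import defaultdict

def colink_iteration(seeds, outlinks, threshold=2):
    """One iteration of co-link analysis: return pages linked to by
    at least `threshold` distinct seed sites."""
    inlink_sources = defaultdict(set)
    for seed in seeds:
        for target in outlinks.get(seed, []):
            if target != seed:  # ignore self-links
                inlink_sources[target].add(seed)
    return {page for page, sources in inlink_sources.items()
            if len(sources) >= threshold}

# Hypothetical crawl results: site -> outlinks found on its pages.
outlinks = {
    "a.org": ["x.org", "y.org", "z.org"],
    "b.org": ["x.org", "y.org"],
    "c.org": ["y.org", "w.org"],
}

network = colink_iteration(["a.org", "b.org", "c.org"], outlinks)
# x.org (linked by a and b) and y.org (linked by a, b, and c) survive;
# z.org and w.org, each with a single inlink source, are dropped.
```

Running the procedure for a second or third iteration, as the software allows, would feed the retained pages back in as the new seed set.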
The workshop provides an introduction to the Issue Crawler as well as its allied tools, including the actor profiler, and export features to Gephi and others. See www.
Preparatory reading
R. Rogers (2010). “Mapping Public Web Space with the Issuecrawler,” in: Claire Brossard and Bernard Reber (eds.), Digital Cognitive Technologies: Epistemology and Knowledge Society. London: Wiley, 115-126.
Relevant Websites
- Issue Crawler, www.issuecrawler.net/
- Govcom.org Foundation, www.govcom.org/
- Digital Methods Initiative, www.digitalmethods.net/
- Mapping Controversies (EU 7th) Project, www.mappingcontroversies.net/
- EMAPS (EU 7th) Project, www.emapsproject.com/blog/
Mapping & Clouding: Employing Digital Methods II
Friday June 22, Half-day workshop (morning)
This Digital Methods workshop concerns ‘clouding’ the resonance of issue mentions across websites. It concentrates in particular on using and interpreting the Lippmannian Device, a tool developed by Rogers and colleagues in the context of the Mapping Controversies project led by Bruno Latour (www.
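The clouding idea can be sketched as follows: count how often an issue term resonates on each source and turn the counts into relative weights (e.g., font sizes) for a cloud rendering. This is a hedged illustration only; the page texts and the linear size scaling are assumptions, and the actual Lippmannian Device works differently (it queries per-site result counts rather than local text).

```python
def resonance_cloud(pages, term, min_size=10, max_size=48):
    """Map each source to a cloud weight proportional to its
    mentions of `term` (hypothetical scaling scheme)."""
    counts = {url: text.lower().count(term.lower())
              for url, text in pages.items()}
    peak = max(counts.values()) or 1  # avoid division by zero
    return {url: min_size + (max_size - min_size) * n // peak
            for url, n in counts.items()}

# Hypothetical stand-ins for scraped site content.
pages = {
    "ngo.org": "climate climate climate policy",
    "gov.example": "climate report",
    "blog.example": "weather today",
}

sizes = resonance_cloud(pages, "climate")
# The most resonant source gets max_size; sources with no
# mentions fall back to min_size.
```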
Preparatory reading
R. Rogers (2010). “Internet Research: The Question of Method,” Journal of Information Technology and Politics, 7(2/3): 241-260, www.govcom.org/
R. Rogers (2009). The End of the Virtual: Digital Methods, Amsterdam: Amsterdam University Press, govcom.org/
Workshop organizer:
Richard Rogers, PhD, is University Professor and holds the Chair in New Media & Digital Culture at the University of Amsterdam. He is Director of Govcom.org, the group responsible for the Issue Crawler and other info-political tools, and of the Digital Methods Initiative, which reworks method for Internet research. Among other works, Rogers is the author of Information Politics on the Web (MIT Press, 2004), awarded best book of 2005 by the American Society for Information Science and Technology (ASIS&T). His latest book, Digital Methods, is to be published by MIT Press.