Tagging, the practice of attaching a descriptive word or phrase to a piece of online content for the purpose of linking it to other related content, has become a mainstream activity. A vision for the next generation of tagging is deep tagging, which would allow people to create a direct link to a small part of a larger piece of media, such as an image or a video. Others who search for those tags could retrieve the specific content they describe, with the promise of facilitating just-in-time learning and creating new possibilities for research and scholarly work.
Real-time video annotation is already quite common, but it currently requires specialized tools that match the time of the annotation to the point in the clip where the reference occurs. If the anticipated capability for anyone to easily add and search for such tags is developed, images, video, and audio clips would become as easy to find as text-based materials. As more rich media clips are released online, having a way to easily reference a particular segment of a clip could facilitate the creation of thematic resource collections.
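At its core, a deep tag is a label attached to a time range inside a clip, serialized so that it can be linked and searched. As a minimal sketch (the class, URLs, and tag names here are illustrative, not taken from any product mentioned in this report), a segment reference could be modelled with the `#t=start,end` temporal-fragment convention later standardized by the W3C Media Fragments URI specification:

```python
# Minimal sketch of a "deep tag": a descriptive label attached to a
# time range inside a media clip, serialized as a URL with a temporal
# fragment. The #t=start,end syntax follows the W3C Media Fragments
# convention; the example URL and tag are hypothetical.

from dataclasses import dataclass

@dataclass
class DeepTag:
    tag: str        # descriptive label, e.g. "mitosis"
    media_url: str  # URL of the full clip
    start: float    # segment start, in seconds
    end: float      # segment end, in seconds

    def fragment_url(self) -> str:
        """Return a direct link to just this segment of the clip."""
        return f"{self.media_url}#t={self.start:g},{self.end:g}"

tag = DeepTag("mitosis", "http://example.edu/bio101-lecture.mp4", 754, 812)
print(tag.fragment_url())
# http://example.edu/bio101-lecture.mp4#t=754,812
```

A browser or player that understands temporal fragments can seek directly to the tagged segment, which is the behavior the deep-tagging tools described below implemented with proprietary mechanisms.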
Based on the age of the literature we were able to find, work in deep tagging appears to have stalled.
Relevance for Teaching, Learning & Creative Expression
- Deep tagging could increase the granularity of time-based media, allowing parts of media clips to be more easily remixed, linked, and reused.
- Many disciplines could benefit from video and audio libraries that are as easy to search and tag as text-based resources.
- Tagging within video and audio clips could facilitate the organization and description of rich media in social software environments and enable users to co-create content by annotating the media.
Although the technology to support deep tagging has been in development since 2006, we were not able to find any educational examples of deep tagging (repositories, course materials, etc.). The following examples are software tools that can tag within video or other media.
- Gotuit makes software to manage video libraries by deep tagging: http://www.gotuit.com
- EveryZing provides a software solution for indexing, searching, and connecting multimedia content to search engines: http://www.everyzing.com
For Further Reading
All the Cool Kids Are Deep Tagging
(Michael Arrington, TechCrunch, 1 October 2006.) Written when deep tagging was first introduced, this post discusses deep tagging and websites that offer it.
Deep Tag It
(Michelle Lentz, WriteTechnology, 2 March 2007.) This blog post describes deep tagging, suggests ways to use tagged video, and describes a deep tagging product by Veotag.
Video Search Catches Up with Video Tagging
(Jeremy Lockhorn, ClickZ, 29 January 2007.) This article describes video tagging and outlines why it is useful, as well as identifying a few websites that offer video tagging.
Sandbox Discussion (July-August 2008)
Tagging, the practice of attaching a descriptive word or phrase to a piece of online content for the purpose of linking it to other related content, has become a mainstream activity. The next generation of tagging, tagging within rich media, will allow people to create a direct link to a small part of a larger piece of media. Others who search for those tags will be able to retrieve specific content that is relevant to them, facilitating just-in-time learning and creating new possibilities for research and scholarly work.
Why is this topic relevant to teaching, learning or creative expression?
- Large media files can currently be referenced only in their entirety. Being able to link directly to a "clip" within an audio, video, or (Flash?) file could open doors not only for connecting directly to relevant information, but perhaps also for a suite of tools that allow remixing or archiving of media clips.
- Tagging can help better organize and annotate information in a social software environment.
- Enables searching within media (e.g., audio, video) – users can go directly to the topics they want, similar to a book index
- Allows user to co-create content via annotation of media
- Makes using media as an instructional tool more efficient – to date, streaming media is more time-consuming to 'read' than print media
- Allows time-based media to be more granular, which facilitates reuse online – parts and fragments of larger works can easily be reassembled into different contexts, just as we now do by tagging small things like blog posts
- Allows for the development of new academic forms – for example, video and audio libraries that individuals can tag so that the material can be remixed and viewed or used on the basis of those tags
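The "book index" idea above can be sketched as an inverted index from tags to clip segments. This is a toy illustration under stated assumptions – the class name, clip filenames, and tags are all hypothetical, not drawn from any project listed here:

```python
# Sketch of the "index within media" idea: an inverted index mapping
# each tag to the clip segments that carry it, so a search jumps
# straight to the relevant timecode rather than the whole file.
# All clip names and tags below are illustrative.

from collections import defaultdict

class SegmentIndex:
    def __init__(self):
        # tag -> list of (clip, start_seconds, end_seconds)
        self._index = defaultdict(list)

    def add(self, tag, clip, start, end):
        """Record that a segment of a clip carries this tag."""
        self._index[tag.lower()].append((clip, start, end))

    def search(self, tag):
        """Return the tagged segments, like looking up a book index."""
        return self._index.get(tag.lower(), [])

idx = SegmentIndex()
idx.add("photosynthesis", "bio-lecture-03.mp4", 120, 245)
idx.add("photosynthesis", "field-trip.mp4", 30, 95)
idx.add("respiration", "bio-lecture-03.mp4", 250, 400)

for clip, start, end in idx.search("photosynthesis"):
    print(f"{clip} {start}-{end}s")
```

Because each tag resolves to segments rather than whole files, the same structure supports the remixing scenario above: a thematic collection is just the list of segments returned for a set of tags.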
Please list links to local or international projects that are experimenting with or implementing this technology.
- MIT Lecture Browser http://web.sls.csail.mit.edu/lectures/
- MIT develops lecture search engine to aid students http://web.mit.edu/newsoffice/2007/lectures-tt1107.html
- IT Conversations link to excerpt http://www.gardnercampbell.net/blog1/?p=610
- EveryZing "EveryZing provides next-generation universal search technology and brings the benefits of search engine optimization (SEO) to online audio and video content. By aggregating structured and unstructured digital content for faster, easier discovery, EveryZing improves the search experience and drives user consumption of multimedia content." http://search.everyzing.com
- Re-ingesting Flickr tags from the Commons back into the Powerhouse Museum collection OPAC (link)
- Videodefunct is an Australian online video project that relies on tagging to build new relations between clips; see http://videodefunct.net/prototypes/
Please provide links to any local or international reports, papers, or articles that either help define the topic, or that provide detailed information about it.
- Share Video Online: Link Deep Into Any Video Clip With Motionbox - Video Intro http://www.masternewmedia.org/news/2006/09/13/share_video_online_link_deep.htm
- Tagging 101 http://news.zdnet.com/2422-13569_22-153233.html
- Video Tagging Gets Cool (2006)
- Video Search Catches up with Video Tagging
- Share Only the Interesting Portions of YouTube Videos
- All the Cool Kids are Deep Tagging
- Deep Tag It
Please add any other information that may be helpful to the staff as they write up this topic.
- CommentPress allows commenting on large text documents at the paragraph level, but not sure if tagging is a part of this [RSS]