WO2004109428A2 - System and method for indicating an annotation for a document - Google Patents

System and method for indicating an annotation for a document

Info

Publication number
WO2004109428A2
Authority
WO
WIPO (PCT)
Prior art keywords
document
annotation
server
annotations
window
Application number
PCT/US2004/012098
Other languages
French (fr)
Other versions
WO2004109428A8 (en)
Inventor
Rami Caspi
Original Assignee
Siemens Communications, Inc.
Application filed by Siemens Communications, Inc.
Publication of WO2004109428A2
Publication of WO2004109428A8

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06Q INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q 10/00 Administration; Management
    • G06Q 10/10 Office automation; Time management
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 40/00 Handling natural language data
    • G06F 40/10 Text processing
    • G06F 40/166 Editing, e.g. inserting or deleting
    • G06F 40/169 Annotation, e.g. comment data or footnotes

Definitions

  • the computers 122a-122n may be personal computers implementing the Windows XP™ operating system and, thus, the Windows Messenger™ instant messaging system.
  • the computers 122a-122n may include telephony and other multimedia messaging capability using, for example, peripheral cameras, Web cams, microphones and speakers (not shown) or peripheral telephony handsets 124, such as the Optipoint™ handset, available from Siemens Corporation.
  • one or more of the computers may be implemented as wireless telephones, digital telephones, or personal digital assistants (PDAs).
  • the computers may include one or more controllers 129, such as Pentium™-type microprocessors, and storage 131 for applications and other programs.
  • the computers 122a-122n may implement interaction services 128a-128n according to some embodiments.
  • the interaction services 128a-128n may allow for interworking of phone, buddy list, instant messaging, presence, collaboration, calendar and other applications.
  • the interaction services 128 may allow access to the collaboration system or module 114 and the action prompt module 115 of the server 104.
  • FIG. 2 is a logical diagram illustrating a particular embodiment of a collaboration server 104.
  • the server 104 includes a plurality of application modules 200 and a communication broker (CB) module 201.
  • One or more of the application modules 200 and the communication broker module 201 may include an inference engine, i.e., a rules- or heuristics-based artificial intelligence engine, for implementing functions in some embodiments.
  • the server 104 provides interfaces, such as APIs (application programming interfaces), to SIP phones 220 and gateways/interworking units 222.
  • the broker module 201 includes a basic services module 214, an advanced services module 216, an automation module 212, and a toolkit module 218.
  • the automation module 212 implements an automation framework for ISVs (independent software vendors) that allows products, software, etc. provided by such ISVs to be used with or created for the server 104.
  • the basic services module 214 functions to implement, for example, phone support, PBX interfaces, call features and management, as well as Windows Messaging™ software and RTC add-ins, when necessary.
  • the phone support features allow maintenance of and access to buddy lists and provide presence status.
  • the advanced services module 216 implements functions such as presence, multipoint control unit or multi-channel conferencing unit (MCU), recording, and the like.
  • MCU functions are used for voice conferencing and support ad hoc and dynamic conference creation from a buddy list following the SIP conferencing model for ad hoc conferences and collaboration sessions.
  • support for G.711 and G.723.1 codecs is provided.
  • the MCU can distribute media processing over multiple servers using the MEGACO protocol.
  • an MCU may provide the ability for participants to set up ad hoc voice, data, or multimedia conferencing or collaboration sessions.
  • different client devices may establish channels to the MCU and the server 104, the channels carrying voice, audio, video and/or other data from and to participants via their associated client devices.
  • more than one participant may be participating in the conference via the same client device.
  • multiple participants may be using a telephone (e.g., the telephone 126a) located in a conference room to participate in the conference.
  • a participant may be using one client device (e.g., a computer) or multiple devices (e.g., a computer and a telephone) to participate in the conference.
  • the Real-Time Transport Protocol (RTP) and the Real Time Control Protocol (RTCP) may be used to facilitate or manage communications or data exchanges between the client devices for the participants in the conference.
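  • By way of illustration only (this sketch is not part of the original disclosure), an RTP packet carrying G.711 audio begins with the 12-byte fixed header defined in RFC 3550, which can be packed in Python as follows:

      import struct

      def rtp_header(seq, timestamp, ssrc, payload_type=0, marker=False):
          """Pack the 12-byte RTP fixed header (RFC 3550); payload type 0 is
          PCMU, i.e., G.711 mu-law, one of the codecs named above."""
          byte0 = 2 << 6  # version 2, no padding, no extension, zero CSRCs
          byte1 = (int(marker) << 7) | (payload_type & 0x7F)
          return struct.pack("!BBHII", byte0, byte1, seq & 0xFFFF,
                             timestamp & 0xFFFFFFFF, ssrc & 0xFFFFFFFF)

      # 20 ms of 8 kHz G.711 advances the timestamp by 160 samples per packet.
      hdr = rtp_header(seq=1, timestamp=160, ssrc=0x1234ABCD)
      assert len(hdr) == 12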
  • an MCU may include a mixer application or logical function that provides the audio, video, voice, etc. data to the different participants.
  • the MCU may handle or manage establishing the calls in and out to the different participants and establish different channels with the client devices used by the participants.
  • the server 104 may include, have access to, or be in communication with additional applications or functions that establish a list of participants in the conference as well as identify the participants speaking at a given moment during the conference.
  • Presence features provide device context for both SIP registered devices and user-defined non-SIP devices.
  • Various user contexts such as In Meeting, On Vacation, In the Office, etc., can be provided for.
  • voice, e-mail, and instant messaging availability may be provided across the user's devices.
  • the presence feature enables real time call control using presence information, e.g., to choose a destination based on the presence of a user's device(s).
  • various components have a central repository for presence information and for changing and querying presence information.
  • the presence module provides a user interface for presenting the user with presence information.
  • the broker module 201 may include the ComResponse™ platform, available from Siemens Information and Communication Networks, Inc.
  • the ComResponse™ platform features include speech recognition, speech-to-text, and text-to-speech, and allow for the creation of scripts for applications.
  • the speech recognition and speech-to-text features may be used by the collaboration summarization unit 114 and the action prompt module 115.
  • real time call control is provided by a SIP API 220 associated with the basic services module 214. That is, calls can be intercepted in progress and real time actions performed on them, including directing those calls to alternate destinations based on rules and/or other stimuli.
  • the SIP API 220 also provides call progress monitoring capabilities and reports the status of such calls to interested applications.
  • the SIP API 220 also provides for call control from the user interface.
  • the toolkit module 218 may provide tools, APIs, scripting language, interfaces, software modules, libraries, software drivers, objects, etc. that may be used by software developers or programmers to build or integrate additional or complementary applications.
  • the application modules include a collaboration module 202, an interaction center module 204, a mobility module 206, an interworking services module 208, a collaboration summarization module 114, and an action prompt module 115.
  • the collaboration module 202 allows for creation, modification or deletion of a collaboration session for a group of participants or other users.
  • the collaboration module 202 may further allow for invoking a voice conference from any client device.
  • the collaboration module 202 can launch a multi-media conferencing package, such as the WebEx™ package. It is noted that the multi-media conferencing can be handled by other products, applications, devices, etc.
  • the interaction center 204 provides a telephony interface for both subscribers and guests. Subscriber access functions include calendar access and voicemail and e-mail access.
  • the calendar access allows the subscriber to accept, decline, or modify appointments, as well as block out particular times.
  • the voicemail and e-mail access allows the subscriber to access and sort messages.
  • the guest access feature allows the guest access to voicemail for leaving messages and calendar functions for scheduling, canceling, and modifying appointments with subscribers. Further, the guest access feature allows a guest user to access specific data meant for them, e.g., receiving e-mail and fax back, etc.
  • the mobility module 206 provides for message forwarding and "one number" access across media, and message "morphing" across media for the subscriber. Further, various applications can send notification messages to a variety of destinations, such as e-mails, instant messages, pagers, and the like. In addition, a user can set rules that the mobility module 206 uses to define media handling, such as e-mail, voice and instant messaging handling. Such rules specify data and associated actions. For example, a rule could be defined to say "If I'm traveling, and I get a voicemail or e-mail marked Urgent, then page me."
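  • For illustration only, such a rule might be represented as data plus a condition function; the Python below is a minimal sketch under that assumption, and the field and rule names are hypothetical:

      from dataclasses import dataclass
      from typing import Callable, List

      @dataclass
      class Message:
          medium: str   # "voicemail", "e-mail", "instant message", ...
          urgent: bool

      @dataclass
      class Rule:
          condition: Callable[[Message, dict], bool]
          action: str   # e.g., "page"

      # "If I'm traveling, and I get a voicemail or e-mail marked Urgent, then page me."
      traveling_rule = Rule(
          condition=lambda m, ctx: (ctx.get("status") == "traveling"
                                    and m.medium in ("voicemail", "e-mail")
                                    and m.urgent),
          action="page")

      def handle(message: Message, context: dict, rules: List[Rule]) -> List[str]:
          """Return the actions of every rule whose condition matches."""
          return [r.action for r in rules if r.condition(message, context)]

      actions = handle(Message("voicemail", True), {"status": "traveling"},
                       [traveling_rule])  # -> ["page"]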
  • the collaboration summarization module 114 is used to identify or highlight portions of a multimedia conference and configure the portions sequentially for later playback.
  • the portions may be stored or identified based on recording cues either preset or settable by one or more of the participants in the conference, such as a moderator.
  • the recording cues may be based on vocalized keywords identified by the voice recognition unit of the ComResponse™ module, or may be invoked by special controls or video or whiteboarding or other identifiers.
  • the action prompt module 115 similarly allows a user to set action cues, which cause the launch of an action prompt window at the user's associated client device 122.
  • the client devices 122 can then perform various functions in accordance with the action cues.
  • referring now to FIG. 3, a diagram of a graphical user interface 300 used in some embodiments is shown.
  • a graphical user interface 300 may be implemented on one or more of the client devices (e.g., the computers 122a-122n).
  • the graphical user interface 300 may interact with the interactive services unit 128 to control collaboration sessions.
  • shown are a collaboration interface 302, a phone interface 304, and a buddy list 306. It is noted that other functional interfaces may be provided. According to particular embodiments, certain of the interfaces may be based on, be similar to, or interwork with, those provided by Microsoft Windows Messenger™ or Outlook™ software.
  • the buddy list 306 is used to set up instant messaging calls and/or multimedia conferences.
  • the phone interface 304 is used to make calls, e.g., by typing in a phone number, and also allows invocation of supplementary service functions such as transfer, forward, etc.
  • the collaboration interface 302 allows for viewing the parties to a conference or collaboration 302a and the type of media involved. It is noted that, while illustrated in the context of personal computers 122, similar interfaces may be provided on telephones, cellular telephones, or PDAs. During a conference or collaboration, participants in the conference or collaboration may access or view shared documents or presentations, communicate with each other via audio, voice, data and/or video channels, etc.
  • a representative window 320 is illustrated that may allow a user to access documents associated with a collaboration effort or group and to access annotations associated with documents that are themselves associated with the collaboration effort or group.
  • the window 320 includes two primary portions 322 and 324 that provide information regarding the five participants in the collaboration and documents associated with the collaboration.
  • the window 320 may be displayed on a client device used by a person associated with a collaboration.
  • the window portion 322 is similar to the window 302a described previously above.
  • the window 322 also includes icons or other identifiers 326, 328, 330, 332, 334, each being associated with a different participant.
  • Each of the icons 326, 328, 330, 332, 334 may be different (e.g., visually distinct or having a different color, shape, fill pattern, flashing rate, size, etc.) to indicate its relationship with a particular participant, as will be discussed in more detail below.
  • the icons act as identifiers for specific people with regard to annotations made by the people to one or more documents.
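  • As a minimal illustrative sketch (not part of the original disclosure), the mapping of participants to visually distinct icons might look like the following; the palette of colors is an assumption:

      PALETTE = ["red", "blue", "green", "orange", "purple"]  # any distinct styles

      def assign_icons(participants):
          """Give each participant a visually distinct icon color, in joining
          order, so each annotation can be traced back to its maker."""
          if len(participants) > len(PALETTE):
              raise ValueError("more participants than distinct icons")
          return {person: PALETTE[i] for i, person in enumerate(participants)}

      icons = assign_icons(["fred@largecompany.com",
                            "dave@conglomerate.com",
                            "susan@independentsmall.com"])
      # icons["fred@largecompany.com"] -> "red", and so on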
  • the window portion 324 includes information regarding documents that are being used by the five participants and the number of annotations already made to each of the documents. For example, the window portion 324 indicates that the document entitled “Contact List” has two associated annotations and that the document entitled “Preliminary Budget” has eleven associated annotations.
  • a user of the window 320 may be able to access or open a document by selecting it or clicking on it in the window 320.
  • the user of the window 320 may be able to access or open one or more annotations associated with a document by selecting or clicking on an annotation number associated with the document.
  • clicking on a document name or annotation number, or clicking on a link or button 336 entitled "Annotation Tool” might result in display of an annotation tool window 340, as illustrated in FIG. 5.
  • the annotation tool window 340 may allow a user to access or create annotations associated with a document.
  • the window 340 may be displayed on a client device used by a person associated with a collaborative effort.
  • the window 340 may include a text block 342 in which the user may type the name of the document (e.g., "Preliminary Budget") of interest. Alternatively, clicking on or selecting the document on the window 320 may cause the document to be displayed automatically in the text block 342.
  • the window 340 may display the number of annotations associated with the document in text block 344 and the number of different annotators for the document in text block 346.
  • the window 340 also may include one or more buttons 348, 350, 352, 354 that allow a user to initiate different functions or features. For example, selecting or clicking on the button 348 may allow the user to view information regarding annotations made to the document "Preliminary Budget". Selecting or clicking on the button 350 may allow the user to record or add an annotation that is associated with the document "Preliminary Budget”.
  • Selecting or clicking on the button 352 may allow a user to listen to previously recorded annotations associated with the document. For example, selecting the button 352 may cause all of the previously recorded annotations for the document to be played in chronological order, in chronological order by individual, or in accordance with some other rule or procedure.
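  • For illustration, the playback orderings described above might be computed as sketched below; the note records and field names are assumptions of this sketch:

      from datetime import datetime

      def playback_order(notes, by_individual=False):
          """Order annotations for sequential playback: strictly chronological,
          or chronological within each annotator (one possible rule)."""
          if by_individual:
              return sorted(notes, key=lambda n: (n["person"], n["created"]))
          return sorted(notes, key=lambda n: n["created"])

      notes = [
          {"person": "fred@largecompany.com", "created": datetime(2003, 6, 2, 9, 0)},
          {"person": "dave@conglomerate.com", "created": datetime(2003, 6, 1, 16, 30)},
      ]
      queue = playback_order(notes)  # dave's earlier annotation plays first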
  • Selecting or clicking on the button 354 may cause the text blocks 342, 344, 346 to be emptied and a prompt to be displayed asking the user to enter a new document name in the text block 342.
  • selecting or clicking on the button 350 may cause a window 360 to be displayed that allows the user to record an annotation for the document "Preliminary Budget", as illustrated in FIG. 6.
  • the window 360 may be displayed on a client device (e.g., the computer 122a) used by a person associated with a collaboration.
  • the window 360 may include a text block 362 in which the user can indicate who is making the annotation.
  • the text block 362 may include a pull down menu that the user can access via box 364. When used, the pull down menu may have a list of all people having the ability to add such an annotation, which may include the people indicated in the window 322. By selecting or clicking button 366, the user can begin recording the annotation.
  • the window 360 may be associated with a particular microphone or other input device on a client device (e.g., computer, telephone) displaying the window 360 that the user can use to provide the audible annotation.
  • the user can stop the recording by selecting or clicking button 368. Selecting or clicking button 370 will cause the just recorded annotation to be replayed while selecting or clicking button 372 will cause the just recorded annotation to be canceled or deleted.
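  • As an illustrative sketch of the recording step (not part of the original disclosure), captured audio frames could be written out with Python's standard wave module; actual microphone capture (the hypothetical capture_frames()) is platform-specific and assumed here:

      import wave

      def save_annotation(path, frames, rate=8000, sample_width=2):
          """Write captured audio frames out as a mono .WAV annotation file."""
          with wave.open(path, "wb") as wav:
              wav.setnchannels(1)             # mono voice annotation
              wav.setsampwidth(sample_width)  # 2 bytes -> 16-bit samples
              wav.setframerate(rate)          # 8 kHz telephone-quality audio
              wav.writeframes(frames)

      # save_annotation("budget_note1.wav", capture_frames())  # capture_frames() is hypothetical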
  • an annotation for a document may be stored as a .WAV file and associated with the document or stored as part of the document file.
  • the annotations for a document may be stored as separate files from the document. Each annotation may be stored or kept as a separate file, thereby allowing the annotations to be accessed and listened to separately and independently.
  • An annotation file also may include information (e.g., icon, icon color, speaker name or other identifier, creation date/time, document identifier, client device identifier, etc.) that associates the annotation with a specific speaker, document, client device, etc.
  • Playing of an annotation may involve retrieval of the appropriate annotation file and delivery of the annotation file to a client device, which can then play the annotation using software operating on the client device.
  • Annotation files may be stored in the same locations as their associated documents, which may be managed or provided by the server 104, an application or module forming part of the server 104, or some other application, device, etc.
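  • For illustration only, the associating information listed above might be kept in a small JSON sidecar file next to the .WAV annotation; this file layout and the function name are assumptions of the sketch:

      import json
      from datetime import datetime, timezone

      def write_sidecar(wav_path, person, icon, document, client_device):
          """Record the associating information (speaker, icon, creation time,
          document, client device) in a JSON file kept next to the .WAV file."""
          meta = {"annotation_file": wav_path,
                  "person": person,
                  "icon": icon,
                  "document": document,
                  "client_device": client_device,
                  "created": datetime.now(timezone.utc).isoformat()}
          sidecar = wav_path + ".json"
          with open(sidecar, "w") as f:
              json.dump(meta, f, indent=2)
          return sidecar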
  • a window 380 is illustrated that may allow a user to access annotations associated with a document 381 (e.g., the document entitled "Preliminary Budget”).
  • the document 381 and the window 380 may be displayed by a client device associated with the user if the user clicks on the document entitled "Preliminary Budget” in the window 320 or on the button 348 in the window 340.
  • the document 381 may be displayed in or by a conventional word processor, spreadsheet, document manager, or other software application.
  • the displayed document 381 may include icons that indicate annotations to the document 381, such as the icons 382, 384, 386, 388, 390, 392, 394.
  • the different colors, shapes, fill patterns, flashing rates, sizes, etc. of the icons displayed in or with the document 381 may indicate the providers of the annotations associated with the icons.
  • the annotation icons 382, 388, 392, 394 are associated with the user identified by email address "fred@largecompany.com".
  • this user has recorded four annotations that are associated with the document "Preliminary Budget" in the window 380.
  • the relative positions of the icons 382, 388, 392, 394 in the window 380 or the document 381 may indicate the general or specific subject matter of the annotations.
  • the annotation icon 384 is associated with the user identified by email address "dave@conglomerate.com" and the annotation icons 386, 390 are associated with the user identified by email address "susan@independentsmall.com". Clicking or selecting any of the icons displayed in the window 380 may cause the corresponding voice annotation to be played, delivered to a specific client device, etc.
  • the annotation icons in the window 380 also may have associated with them information regarding the dates/times of the creation of their associated annotations. As a result, moving a cursor over an icon in the window 380 may allow display in the window 380 of the creation date/time of the annotation associated with the icon.
  • a window 400 may allow a user to access annotations associated with a document 401 (e.g., the document entitled "Preliminary Budget”).
  • the document 401 and the window 400 may be displayed by a client device associated with the user if the user clicks on the document entitled "Preliminary Budget” in the window 320 or on the button 348 in the window 340.
  • the document 401 may be displayed in or by a conventional word processor, spreadsheet, document manager, or other software application.
  • the displayed document 401 may include icons that indicate annotations to the document exist, such as the icons 402 and 404. The different colors, shapes, fill patterns, flashing rates, sizes, etc. of the icons 402 and 404 displayed in or with the document 401 may indicate the providers of the annotations associated with the icons.
  • the annotation icon 402 is associated with the user identified by email address "fred@largecompany.com”.
  • the annotation icon 404 is associated with the user identified by email address "dave@conglomerate.com”.
  • the icons 402 and 404 indicate that specific users have provided annotations to the document 401.
  • Arrows 406, 408 indicate that the user may view specific annotations for the respective annotators identified by "fred@largecompany.com" and "dave@conglomerate.com". For example, now referring to FIG. 10, a user of the window 400 selecting or clicking on the arrow 406 may cause drop down menu 410 to appear and selecting or clicking on arrow 408 may cause drop down menu 412 to appear.
  • the drop down menu 410 indicates that the annotator identified by "fred@largecompany.com” has made or recorded four annotations associated with the document 401 and the drop down menu 412 indicates that the annotator identified by "dave@conglomerate.com” has made one annotation associated with the document 401.
  • the user may listen to any of the four annotations indicated in the drop down menu 410 by selecting or clicking on the iteration numbers "1", "2", "3" or "4" in the drop down menu 410.
  • the user may listen to the annotation indicated in the drop down menu 412 by selecting or clicking on the iteration number "1" in the drop down menu 412. Clicking on or selecting any of the iteration numbers in the drop down menus 410 or 412 may cause the annotation associated with the iteration number to be played, delivered to a specific client device, etc.
  • as annotations are made, the appropriate iteration number may be added to the appropriate drop down menu or a drop down menu may be added as needed. Iteration numbers for annotations also may have associated with them information regarding the dates/times of the creation of the annotations. As a result, moving a cursor over an iteration number may allow display in the window 400 of the creation date/time of the annotation associated with the iteration number.
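  • As a minimal sketch of how the per-annotator iteration numbers of drop down menus 410 and 412 might be computed (illustrative only; the note records are the same assumed shape as above):

      from collections import defaultdict

      def iteration_menus(notes):
          """Group annotations by annotator and number each annotator's
          annotations 1, 2, 3, ... in creation order."""
          grouped = defaultdict(list)
          for note in sorted(notes, key=lambda n: n["created"]):
              grouped[note["person"]].append(note)
          return {person: {i + 1: note for i, note in enumerate(items)}
                  for person, items in grouped.items()}

      # iteration_menus(notes)["fred@largecompany.com"][3] would be fred's third annotation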
  • a window 420 is illustrated that may allow a user to access annotations associated with a document 421 (e.g., the document entitled "Preliminary Budget”).
  • the document 421 and the window 420 may be displayed by a client device associated with the user if the user clicks on the document entitled "Preliminary Budget” in the window 320 or on the button 348 in the window 340.
  • the document 421 may be displayed in or by a conventional word processor, spreadsheet, document manager, or other software application.
  • the displayed document 421 may include icons that indicate annotations to the document exist, such as the icons 424 and 426.
  • the icon 424 is not associated with any particular annotator and may indicate only that annotations to the document 421 exist. Selecting or clicking on the icon 426 may cause a drop down menu 428 to be displayed, as indicated in FIG. 11.
  • the drop down menu 428 may include a number of icons, each associated with a particular speaker.
  • the relative positions, colors, fill patterns, etc. of the icons in the drop down menu 428 may indicate the order in which annotations were added and by which annotator. Clicking on or selecting any of the icons in the drop down menu 428 may cause the annotation associated with the icon to be played, delivered to a specific client device, etc.
  • the icons in the drop down menu 428 also may have associated with them information regarding the dates/times of the creation of the annotations. As a result, moving a cursor over an icon in the drop down menu 428 may allow display in the window 420 of the creation date/time of the annotation associated with the icon.
  • referring now to FIG. 12, a flow chart 450 is shown which represents the operation of a first embodiment of a method.
  • the particular arrangement of elements in the flow chart 450 is not meant to imply a fixed order to the elements; embodiments can be practiced in any order that is practicable.
  • some or all of the elements of the method 450 may be performed or completed by the server 104 or another device or application, as will be discussed in more detail below.
  • Processing begins at 452 during which the server 104 associates an audible annotation with a document.
  • the document may be accessible to or otherwise associated with multiple people as part of a collaboration effort, conference, etc.
  • the server 104 may receive the annotation from a client device (e.g., the computer 122a) as an annotation file recorded by the client device via use of an annotation tool (e.g., the annotation tool windows 340 and 360). In other embodiments, the server 104 may record the annotation directly or receive data indicative of the annotation from another source. The server 104 may store the annotation with, as part of, separate from, or in the same location as the document.
  • the server 104 associates a person with the audible annotation.
  • the person may be the person who recorded the annotation, the person associated with the client device from which the server 104 received the annotation, a person chosen from a list of people associated with the document or who have the right or ability to create an annotation for the document, or a person chosen via some other means.
  • the server 104 may have or have access to voice or speaker recognition applications or other technology through which the server 104 can identify the speaker of a voice annotation.
  • 454 may include requesting, establishing, or otherwise determining the identity of the person.
  • the server 104 establishes an identifier for the person.
  • the identifier may be a specific icon, icon color, name, code, etc.
  • a document may be associated with multiple people (e.g., all of the people who have access to a particular document). Each of the people may be assigned a different or distinct identifier during an implementation of 456. In such embodiments, 456 may occur prior to 452 and/or 454.
  • the server 104 provides data indicative of the annotation.
  • the server 104 may provide the annotation upon receiving a request for the annotation or the document.
  • the server 104 may provide or display an icon or other identifier in conjunction with the document to indicate that the annotation exists or is otherwise available.
  • the icon or other identifier may be displayed in, on, or as part of the document itself, as part of a window or toolbar used with the document, or in some other fashion.
  • the server 104 provides data indicative of the identifier established during 456.
  • the server 104 may provide the identifier upon receiving a request for the annotation or the document.
  • the server 104 may provide or display an icon or other identifier in conjunction with the document to indicate that the annotation exists or is otherwise available and/or to indicate that the annotation is associated with or related to the specific person.
  • the icon or other identifier may be displayed in or as part of the document itself, as part of a window or toolbar used with the document, or in some other fashion.
  • Data provided by the server 104 may be indicative of the icon associated with the specific person and/or the annotation.
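  • For illustration only, elements 452 through 460 might be realized server-side roughly as sketched below; the class and method names are assumptions, not part of the original disclosure:

      class AnnotationServer:
          """Minimal stand-in for server 104 (names are assumptions)."""
          def __init__(self, identifiers):
              self.identifiers = identifiers   # person -> icon or other identifier (456)
              self.store = {}                  # document -> list of (person, audio)

          def associate(self, document, audio, person):
              """452/454: associate an audible annotation with the document
              and a person with the annotation."""
              self.store.setdefault(document, []).append((person, audio))
              return self.identifiers[person]  # 456: identifier established for the person

          def provide(self, document):
              """458/460: provide data indicative of each annotation and of
              the identifier of the person who made it."""
              return [(self.identifiers[person], audio)
                      for person, audio in self.store.get(document, [])]

      srv = AnnotationServer({"fred@largecompany.com": "red"})
      srv.associate("Preliminary Budget", "note1.wav", "fred@largecompany.com")
      print(srv.provide("Preliminary Budget"))  # [("red", "note1.wav")]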
  • referring now to FIG. 13, a flow chart 470 is shown which represents the operation of a second embodiment of a method.
  • the particular arrangement of elements in the flow chart 470 is not meant to imply a fixed order to the elements; embodiments can be practiced in any order that is practicable.
  • some or all of the elements of the method 470 may be performed or completed by the server 104 or another device or application, as will be discussed in more detail below.
  • Processing begins at 472 during which the server 104 associates a plurality of people with a document.
  • the people may be participants in a conference or on-line or off-line collaborative effort, each having permission to access the document, make or provide annotations for the document, etc.
  • the list of people or data indicative of the people may be received from an application, client device, or other device.
  • the list of people may be established by a person setting up a conference, collaborative session or effort, etc.
  • the server 104 associates a different identifier to each of the plurality of people associated with the document.
  • the server 104 may associate different icons, icon shapes, icon colors, names, codes, etc. to each of the people.
  • 474 may be or include the server 104 receiving data indicative of the identifiers to be associated with the people from another application, device, etc.
  • the server 104 associates an audible annotation made by one of the plurality of people with the document.
  • the server 104 may receive the annotation from a client device (e.g., the computer 122a) as an annotation file recorded by the client device via use of an annotation tool (e.g., the annotation tool windows 340 and 360).
  • the server 104 may record the annotation directly or receive data indicative of the annotation from another source.
  • the server 104 may store the annotation with, as part of, separate from, or in the same location as the document.
  • the server 104 associates the identifier of the person who made the annotation with the document.
  • the server 104 may store the identifier with, as part of, separate from, or in the same location as the document and/or the annotation.
  • the server 104 also may display the identifier in or with the document, as part of a window or toolbar used with the document, or in some other fashion.
  • the server 104 may receive a request for a document and/or annotation, provide a document and one or more annotations, icons, identifiers, etc. associated with the document, etc.
  • the server 104 can comprise a single device or computer, a networked set or group of devices or computers, a workstation, mainframe or host computer, etc., and, in some embodiments, may include some or all of the components described above in regards to FIG. 1. In some embodiments, the server 104 may be adapted to implement one or more elements of the methods disclosed herein.
  • the server 104 may include a processor, microchip, central processing unit, or computer 550 that is in communication with or otherwise uses or includes one or more communication ports 552 for communicating with user devices and/or other devices.
  • the processor 550 may be or include some or all of the controller 101 previously discussed above. In some embodiments, the processor 550 may be operative to implement one or more of the elements of the methods disclosed herein.
  • Communication ports may include such things as local area network adapters, wireless communication devices, Bluetooth technology, etc.
  • the server 104 also may include an internal clock element 554 to maintain an accurate time and date for the server 104, create time stamps for communications received or sent by the server 104, etc.
  • the server 104 may include one or more output devices 556 such as a printer, infrared or other transmitter, antenna, audio speaker, display screen or monitor (e.g., the monitor 400), text to speech converter, etc., as well as one or more input devices 558 such as a bar code reader or other optical scanner, infrared or other receiver, antenna, magnetic stripe reader, image scanner, roller ball, touch pad, joystick, touch screen, microphone, computer keyboard, computer mouse, etc.
  • the server 104 may include a memory or data storage device 560 (which may be or include the memory 103 previously discussed above) to store information, software, databases, documents, communications, device drivers, etc.
  • the memory or data storage device 560 preferably comprises an appropriate combination of magnetic, optical and/or semiconductor memory, and may include, for example, Read-Only Memory (ROM), Random Access Memory (RAM), a tape drive, flash memory, a floppy disk drive, a Zip™ disk drive, a compact disc and/or a hard disk.
  • the server 104 also may include separate ROM 562 and RAM 564.
  • the processor 550 and the data storage device 560 in the server 104 each may be, for example: (i) located entirely within a single computer or other computing device; or (ii) connected to each other by a remote communication medium, such as a serial port cable, telephone line or radio frequency transceiver.
  • the server 104 may comprise one or more computers that are connected to a remote server computer for maintaining databases.
  • a conventional personal computer or workstation with sufficient memory and processing capability may be used as the server 104.
  • the server 104 operates as or includes a Web server for an Internet environment.
  • the server 104 may be capable of high volume transaction processing, performing a significant number of mathematical calculations in processing communications and database searches.
  • a Pentium™ microprocessor, such as the Pentium III™ or IV™ microprocessor manufactured by Intel Corporation, may be used for the processor 550. Equivalent processors are available from Motorola, Inc., AMD, or Sun Microsystems, Inc.
  • the processor 550 also may comprise one or more microprocessors, computers, computer systems, etc.
  • Software may be resident and operating or operational on the server 104.
  • the software may be stored on the data storage device 560 and may include a control program 566 for operating the server, databases, etc.
  • the control program 566 may control the processor 550.
  • the processor 550 preferably performs instructions of the control program 566, and thereby operates in accordance with the present invention, and particularly in accordance with the methods described in detail herein.
  • the control program 566 may be stored in a compressed, uncompiled and/or encrypted format.
  • the control program 566 furthermore includes program elements that may be necessary, such as an operating system, a database management system and device drivers for allowing the processor 550 to interface with peripheral devices, databases, etc. Appropriate program elements are known to those skilled in the art, and need not be described in detail herein.
  • the server 104 also may include or store information regarding users, user or client devices, conferences, collaborations, annotations, documents, communications, etc.
  • information regarding one or more collaborations may be stored in a conference information database 568 for use by the server 104 or another device or entity.
  • information regarding one or more users (e.g., participants in a collaboration effort) also may be stored for use by the server 104 or another device or entity.
  • information regarding one or more annotations may be stored in an annotation information database 572 for use by the server 104 or another device or entity.
  • some or all of one or more of the databases may be stored or mirrored remotely from the server 104.
  • the instructions of the control program may be read into a main memory from another computer-readable medium, such as from the ROM 562 to the RAM 564. Execution of sequences of the instructions in the control program causes the processor 550 to perform the process elements described herein.
  • hard-wired circuitry may be used in place of, or in combination with, software instructions for implementation of some or all of the methods described herein. Thus, embodiments are not limited to any specific combination of hardware and software.
  • the processor 550, communication port 552, clock 554, output device 556, input device 558, data storage device 560, ROM 562, and RAM 564 may communicate or be connected directly or indirectly in a variety of ways.
  • the processor 550, communication port 552, clock 554, output device 556, input device 558, data storage device 560, ROM 562, and RAM 564 may be connected via a bus 574.
  • the methods described herein may be embodied as a computer program developed using an object oriented language that allows the modeling of complex systems with modular objects to create abstractions that are representative of real world, physical objects and their interrelationships.
  • the invention as described herein could be implemented in many different ways using a wide range of programming techniques as well as general-purpose hardware systems or dedicated controllers.
  • many, if not all, of the elements for the methods described above are optional or can be combined or performed in one or more alternative orders or sequences without departing from the scope of the present invention and the claims should not be construed as being limited to any particular order or sequence, unless specifically indicated.
  • Each of the methods described above can be performed on a single computer, computer system, microprocessor, etc.
  • two or more of the elements in each of the methods described above could be performed on two or more different computers, computer systems, microprocessors, etc., some or all of which may be locally or remotely configured.
  • the methods can be implemented in any sort or implementation of computer software, program, sets of instructions, code, ASIC, or specially designed chips, logic gates, or other hardware structured to directly effect or implement such software, programs, sets of instructions or code.
  • the computer software, program, sets of instructions or code can be storable, writeable, or savable on any computer usable or readable media or other program storage device or media such as a floppy or other magnetic or optical disk, magnetic or optical tape, CD-ROM, DVD, punch cards, paper tape, hard disk drive, Zip™ disk, flash or optical memory card, microprocessor, solid state memory device, RAM, EPROM, or ROM.

Abstract

Embodiments provide a system, method, apparatus, means, and computer program code that allow multiple annotations to a document to be created and that distinguish between the annotations made by different people. The people may view documents (380, 400, 420), exchange ideas and messages, etc. via a server or conference/collaboration system at different times and/or without being in direct communication with each other. In such an off-line collaboration mode, the people may want to listen to, view, or add annotations regarding one or more documents (380, 400, 420). The methods and systems described herein allow users to follow the trail of annotations regarding a document (380, 400, 420) and to distinguish between the voice or other audible annotations created by other people.

Description

SYSTEM AND METHOD FOR INDICATING AN ANNOTATION
FOR A DOCUMENT
FIELD OF THE INVENTION
The present invention relates to telecommunications systems and, in particular, to an improved system and method for indicating an annotation for a document.
BACKGROUND
The development of various voice over IP protocols such as the H.323 Recommendation and the Session Initiation Protocol (SIP) has led to increased interest in multimedia conferencing. In such conferencing, typically, a more or less central server or other device manages the conference and maintains the various communications paths to computers or other client devices being used by parties to participate in the conference. Parties to the conference may be able to communicate via voice and/or video through the server and their client devices.
Instant messaging can provide an added dimension to multimedia conferences. In addition to allowing text chatting, instant messaging systems such as the Microsoft Windows Messenger™ system can allow for transfer of files, document sharing and collaboration, collaborative whiteboarding, and even voice and video. A complete multimedia conference can involve multiple voice and video streams, the transfer of many files, and much marking-up of documents and whiteboarding.
Participants in the multimedia conference or collaboration effort as well as other users may want to provide a voice, text, graphical, or other type of annotation regarding a document used in the conference or collaboration, their thoughts regarding the conference or collaboration effort, etc. The annotation may be provided back to other participants in the conference or collaboration effort for further review. As such, there is a need for a system and method for allowing multiple annotations to co-exist while distinguishing between the annotations made by different participants or other users.
SUMMARY
Embodiments provide a system, method, apparatus, means, and computer program code for allowing multiple annotations for documents used in a conference or collaboration session to co-exist and for distinguishing between the annotations made by different participants in the conference or collaboration and/or other users. As part of a collaborative effort, people may view documents, exchange ideas and messages, etc. via a server or conference/collaboration system at different times and/or without being in direct communication with each other. In such an off-line collaboration mode, the people may want to listen to, view, or add annotations regarding one or more documents. The systems, methods, apparatus, means, and computer program code described herein allow users to follow the trail of annotations regarding a document and to distinguish between the voice or other audible annotations created by other people.
Additional objects, advantages, and novel features of the invention shall be set forth in part in the description that follows, and in part will become apparent to those skilled in the art upon examination of the following or may be learned by the practice of the invention.
In some embodiments, a method for associating an annotation with a document may include associating an audible annotation with a document; associating a person with the audible annotation; establishing an identifier associated with the person; providing data indicative of the annotation; and providing data indicative of the identifier. In some other embodiments, a method for associating an annotation with a document may include associating a plurality of people with a document; associating a different identifier to each of the plurality of people; associating an audible annotation made by a first of the plurality of people to the document; and associating the first of the plurality of people's respective identifier to the document. In some embodiments, either or both of the methods may include receiving a request for a document or annotation, providing a document or annotation, receiving data indicative of a document or annotation, storing or displaying a document or annotation, and/or associating distinct icons or other identifiers with one or more people associated with a document, who have access to the document, or are allowed to record or provide an annotation for a document. Other embodiments may include means, systems, computer code, etc. for implementing some or all of the elements of the methods described herein.
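For illustration only, the second of these methods might be sketched in Python roughly as follows; the function names, the dictionary-based session record, and the icon strings are assumptions of this sketch, not part of the original disclosure:

    def setup(document, people, icons):
        """Associate a plurality of people with a document and assign each
        person a different identifier (icons is assumed to hold at least as
        many distinct styles as there are people)."""
        if len(icons) < len(people):
            raise ValueError("need a distinct identifier per person")
        return {"document": document,
                "identifiers": dict(zip(people, icons)),
                "annotations": []}

    def add_annotation(session, person, audio_path):
        """Associate an audible annotation made by one of the people, along
        with that person's identifier, with the document."""
        session["annotations"].append({"person": person,
                                       "identifier": session["identifiers"][person],
                                       "audio": audio_path})

    session = setup("Preliminary Budget",
                    ["fred@largecompany.com", "dave@conglomerate.com"],
                    ["red circle", "blue square"])
    add_annotation(session, "fred@largecompany.com", "note1.wav")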
With these and other advantages and features of the invention that will become hereinafter apparent, the nature of the invention may be more clearly understood by reference to the following detailed description of the invention, the appended claims and to the several drawings attached herein.
BRIEF DESCRIPTION OF THE DRAWINGS
The accompanying drawings, which are incorporated in and form a part of the specification, illustrate the embodiments, and together with the descriptions serve to explain the principles of the invention.
FIG. 1 is a diagram of a conference system according to some embodiments;
FIG. 2 is a diagram illustrating a conference collaboration system according to some embodiments;
FIG. 3 is another diagram illustrating a conference collaboration system according to some embodiments;
FIG. 4 is a diagram illustrating a graphical user interface according to some embodiments;
FIG. 5 is a diagram illustrating another graphical user interface according to some embodiments;
FIG. 6 is a diagram illustrating another graphical user interface according to some embodiments;
FIG. 7 is a diagram illustrating another graphical user interface according to some embodiments;
FIG. 8 is a diagram illustrating another graphical user interface according to some embodiments;
FIG. 9 is a diagram illustrating another graphical user interface according to some embodiments;
FIG. 10 is a diagram illustrating another graphical user interface according to some embodiments;
FIG. 11 is a diagram illustrating another graphical user interface according to some embodiments;
FIG. 12 is a flowchart of a method in accordance with some embodiments;
FIG. 13 is another flowchart of a method in accordance with some embodiments; and
FIG. 14 is a block diagram of possible components that may be used in some embodiments of the server of FIG. 1.
DETAILED DESCRIPTION
Applicant has recognized that there is a market opportunity for systems, means, computer code, and methods that allow multiple annotations for a document to co-exist and that distinguish between the annotations made by different participants in a collaborative effort regarding or involving the document.
During a conference or collaborative effort/session, different participants may be in communication with a server or conference system via client devices (e.g., computers, telephones). The server or conference system may facilitate communication between the participants, sharing or accessing of documents, etc. The conference may be part of a collaborative effort involving the participants. In some situations, one or more of the participants may communicate in an off-line collaboration mode, wherein the participants view documents, exchange ideas and messages, etc. via the server or conference system at different times and/or without being in direct communication with each other. In such an off-line collaboration mode, the participants or other users may want to add listen to, view, or add voice or other audible annotations regarding the conference or collaborative effort. For example, a participant may want to record or otherwise add an annotation to a document being shared or prepared as part of the collaborative effort. As another example, a participant may want to listen to one or more voice annotations regarding the document previously made by other participants. The methods and systems described herein allow, for example, users to follow the trail of voice annotations regarding a document and to distinguish between the voice annotations created by other participants in an off-line collaboration session.
Referring now to FIG. 1, a diagram of an exemplary telecommunications or conference system 100 used in some embodiments is shown. As shown, the system 100 may include a local area network (LAN) 102. The LAN 102 may be implemented using a TCP/IP network and may implement voice or multimedia over IP using, for example, the Session Initiation Protocol (SIP). Operably coupled to the local area network 102 is a server 104. The server 104 may include one or more controllers 101, which may be embodied as one or more microprocessors, and memory 103 for storing application programs and data. The controller 101 may implement an instant messaging system 106. The instant messaging system may be embodied as Microsoft Windows Messenger™ software or other instant messaging system. Thus, in some embodiments, the instant messaging system 106 implements the Microsoft .Net™ environment 108 and the Real Time Communications protocol (RTC) 110.
In addition, in some embodiments, a collaboration system 114 may be provided, which may be part of an interactive suite of applications 112, run by controller 101 , as will be described in greater detail below. In addition, an action prompt module 115 may be provided, which detects occurrences of action cues and causes action prompt windows to be launched at the client devices 122. The collaboration system 114 may allow users of the system to become participants in a conference or collaboration session.
Also coupled to the LAN 102 is a gateway 116 which may be implemented as a gateway to a private branch exchange (PBX), the public switched telephone network (PSTN) 118, or any of a variety of other networks, such as a wireless or cellular network. In addition, one or more LAN telephones 120a-120n and one or more computers 122a-122n may be operably coupled to the LAN 102. In some embodiments, one or more other types of networks may be used for communication between the server 104, computers 122a-122n, telephones 120a-120n, the gateway 116, etc. For example, in some embodiments, a communications network might be or include the Internet, the World Wide Web, or some other public or private computer, cable, telephone, client/server, peer-to-peer, or communications network or intranet. In some embodiments, a communications network also can include other public and/or private wide area networks, local area networks, wireless networks, data communication networks or connections, intranets, routers, satellite links, microwave links, cellular or telephone networks, radio links, fiber optic transmission lines, ISDN lines, T1 lines, DSL connections, etc. Moreover, as used herein, communications include those enabled by wired or wireless technology. Also, in some embodiments, one or more client devices (e.g., the computers 122a-122n) may be connected directly to the server 104.
The computers 122a-122n may be personal computers implementing the Windows XP™ operating system and thus, Windows Messenger™ instant messenger system. In addition, the computers 122a-122n may include telephony and other multimedia messaging capability using, for example, peripheral cameras, Web cams, microphones and speakers (not shown) or peripheral telephony handsets 124, such as the Optipoint™ handset, available from Siemens Corporation. In other embodiments, one or more of the computers may be implemented as wireless telephones, digital telephones, or personal digital assistants (PDAs). Thus, the figures are exemplary only. As shown with reference to computer 122a, the computers may include one or more controllers 129, such as Pentium™ type microprocessors, and storage 131 for applications and other programs.
Finally, the computers 122a-122n may implement interaction services 128a-128n according to some embodiments. The interaction services 128a-128n may allow for interworking of phone, buddy list, instant messaging, presence, collaboration, calendar and other applications. In addition, the interaction services 128 may allow access to the collaboration system or module 114 and the action prompt module 115 of the server 104.
Turning now to FIG. 2, a functional model diagram illustrating the collaboration system 114 is shown. More particularly, FIG. 2 is a logical diagram illustrating a particular embodiment of a collaboration server 104. The server 104 includes a plurality of application modules 200 and a communication broker (CB) module 201. One or more of the application modules and communication broker module 201 may include an inference engine, i.e., a rules or heuristics based artificial intelligence engine for implementing functions in some embodiments. In addition, the server 104 provides interfaces, such as APIs (application programming interfaces) to SIP phones 220 and gateways/interworking units 222.
According to the embodiment illustrated, the broker module 201 includes a basic services module 214, an advanced services module 216, an automation module 212, and a toolkit module 218. The automation module 212 implements an automation framework for ISVs (independent software vendors) that allows products, software, etc. provided by such ISVs to be used with or created for the server 104.
The basic services module 214 functions to implement, for example, phone support, PBX interfaces, call features and management, as well as Windows Messaging™ software and RTC add-ins, when necessary. The phone support features allow maintenance of and access to buddy lists and provide presence status.
The advanced services module 216 implements functions such as presence, multipoint control unit or multi-channel conferencing unit (MCU), recording, and the like. MCU functions are used for voice conferencing and support ad hoc and dynamic conference creation from a buddy list following the SIP conferencing model for ad hoc conferences and collaboration sessions. In certain embodiments, support for G.711 and G.723.1 codecs is provided. Further, in some embodiments, the MCU can distribute media processing over multiple servers using the MEGACO protocol. In some embodiments, an MCU may provide the ability for participants to set up ad hoc voice, data, or multimedia conferencing or collaboration sessions. During such sessions, different client devices (e.g., the computers 122a-122n) may establish channels to the MCU and the server 104, the channels carrying voice, audio, video and/or other data from and to participants via their associated client devices. In some cases, more than one participant may be participating in the conference via the same client device. For example, multiple participants may be using a telephone (e.g., the telephone 126a) located in a conference room to participate in the conference. Also, in some cases, a participant may be using one client device (e.g., a computer) or multiple devices (e.g., a computer and a telephone) to participate in the conference. The Real-Time Transport Protocol (RTP) and the Real Time Control Protocol (RTCP) may be used to facilitate or manage communications or data exchanges between the client devices for the participants in the conference.
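To make the channel bookkeeping concrete, the sketch below models one possible in-memory view of a session in which each client device holds a channel to the MCU and a single channel may carry several participants; the class and method names are illustrative assumptions, not taken from the patent.

```python
class ConferenceSession:
    """Hypothetical sketch: tracks which participants are reachable
    over which client-device channel. One device (e.g., a shared
    conference-room phone) may carry several participants."""

    def __init__(self, session_id):
        self.session_id = session_id
        self.channels = {}  # device_id -> list of participant names

    def open_channel(self, device_id, participants):
        # a client device establishes a channel to the MCU/server
        self.channels.setdefault(device_id, []).extend(participants)

    def participant_list(self):
        # flatten all channels into the conference-wide participant list
        return [p for names in self.channels.values() for p in names]
```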
In some embodiments an MCU may include a mixer application or logical function that provides the audio, video, voice, etc. data to the different participants. The MCU may handle or manage establishing the calls in and out to the different participants and establish different channels with the client devices used by the participants. The server 104 may include, have access to, or be in communication with additional applications or functions that establish a list of participants in the conference as well as identify the participants speaking at a given moment during the conference.
Presence features provide device context for both SIP registered devices and user-defined non-SIP devices. Various user contexts, such as In Meeting, On Vacation, In the Office, etc., can be provided for. In addition, voice, e-mail, and instant messaging availability may be provided across the user's devices. The presence feature enables real time call control using presence information, e.g., to choose a destination based on the presence of a user's device(s). In addition, various components have a central repository for presence information and for changing and querying presence information. In addition, the presence module provides a user interface for presenting the user with presence information.
In addition, the broker module 201 may include the ComResponse™ platform, available from Siemens Information and Communication Networks, Inc. The ComResponse™ platform features include speech recognition, speech-to-text, and text-to-speech, and allows for creation of scripts for applications. The speech recognition and speech-to-text features may be used by the collaboration summarization unit 114 and the action prompt module 115.
In addition, real time call control is provided by a SIP API 220 associated with the basic services module 214. That is, calls can be intercepted in progress and real time actions performed on them, including directing those calls to alternate destinations based on rules and/or other stimuli. The SIP API 220 also provides call progress monitoring capabilities and reports the status of such calls to interested applications. The SIP API 220 also provides for call control from the user interface.
The toolkit module 218 may provide tools, APIs, scripting language, interfaces, software modules, libraries, software drivers, objects, etc. that may be used by software developers or programmers to build or integrate additional or complementary applications.
According to the embodiment illustrated, the application modules include a collaboration module 202, an interaction center module 204, a mobility module 206, an interworking services module 208, a collaboration summarization module 114, and an action prompt module 115.
The collaboration module 202 allows for creation, modification or deletion of a collaboration session for a group of participants or other users. The collaboration module 202 may further allow for invoking a voice conference from any client device. In addition, the collaboration module 202 can launch a multi-media conferencing package, such as the WebEx™ package. It is noted that the multi-media conferencing can be handled by other products, applications, devices, etc.
The interaction center 204 provides a telephony interface for both subscribers and guests. Subscriber access functions include calendar access and voicemail and e-mail access. The calendar access allows the subscriber to accept, decline, or modify appointments, as well as block out particular times. The voicemail and e-mail access allows the subscriber to access and sort messages.
Similarly, the guest access feature allows the guest access to voicemail for leaving messages and calendar functions for scheduling, canceling, and modifying appointments with subscribers. Further, the guest access feature allows a guest user to access specific data meant for them, e.g., receiving e-mail and fax back, etc.
The mobility module 206 provides for message forwarding and "one number" access across media, and message "morphing" across media for the subscriber. Further, various applications can send notification messages to a variety of destinations, such as e-mails, instant messages, pagers, and the like. In addition, a user can set rules that the mobility module 206 uses to define media handling, such as e-mail, voice and instant messaging handling. Such rules specify data and associated actions. For example, a rule could be defined to say "If I'm traveling, and I get a voicemail or e-mail marked Urgent, then page me."
Further, the collaboration summarization module 114 is used to identify or highlight portions of a multimedia conference and configure the portions sequentially for later playback. The portions may be stored or identified based on recording cues either preset or settable by one or more of the participants in the conference, such as a moderator. The recording cues may be based on vocalized keywords identified by the voice recognition unit of the ComResponse™ module, or may be invoked by special controls or video or whiteboarding or other identifiers.
The action prompt module 115 similarly allows a user to set action cues, which cause the launch of an action prompt window at the user's associated client device 122. In response, the client devices 122 can then perform various functions in accordance with the action cues.
Now referring to FIG. 3, a diagram of a graphical user interface 300 used in some embodiments is shown. In particular, shown are a variety of windows for invoking various functions. Such a graphical user interface 300 may be implemented on one or more of the client devices (e.g., the computers 122a-122n). Thus, the graphical user interface 300 may interact with the interactive services unit 128 to control collaboration sessions.
Shown are a collaboration interface 302, a phone interface 304, and a buddy list 306. It is noted that other functional interfaces may be provided. According to particular embodiments, certain of the interfaces may be based on, be similar to, or interwork with, those provided by Microsoft Windows Messenger™ or Outlook™ software.
The buddy list 306 is used to set up instant messaging calls and/or multimedia conferences. The phone interface 304 is used to make calls, e.g., by typing in a phone number, and also allows invocation of supplementary service functions such as transfer, forward, etc. The collaboration interface 302 allows for viewing the parties to a conference or collaboration 302a and the type of media involved. It is noted that, while illustrated in the context of personal computers 122, similar interfaces may be provided on telephones, cellular telephones, or PDAs. During a conference or collaboration, participants in the conference or collaboration may access or view shared documents or presentations, communicate with each other via audio, voice, data and/or video channels, etc.
Now referring to FIG. 4, a representative window 320 is illustrated that may allow a user to access documents associated with a collaboration effort or group and to access annotations associated with documents that are themselves associated with the collaboration effort or group. For example, the window 320 includes two primary portions 322 and 324 that provide information regarding the five participants in the collaboration and documents associated with the collaboration. The window 320 may be displayed on a client device used by a person associated with a collaboration. The window portion 322 is similar to the window 302a described previously above. In addition to listing, by email address, the participants that are associated with the collaboration, the window portion 322 also includes icons or other identifiers 326, 328, 330, 332, 334, each being associated with a different participant. Each of the icons 326, 328, 330, 332, 334 may be different (e.g., visually distinct or having a different color, shape, fill pattern, flashing rate, size, etc.) to indicate its relationship with a particular participant, as will be discussed in more detail below. Thus, the icons act as identifiers for specific people with regard to annotations made by the people to one or more documents.
The window portion 324 includes information regarding documents that are being used by the five participants and the number of annotations already made to each of the documents. For example, the window portion 324 indicates that the document entitled "Contact List" has two associated annotations and that the document entitled "Preliminary Budget" has eleven associated annotations. In some embodiments, a user of the window 320 may be able to access or open a document by selecting it or clicking on it in the window 320. Similarly, in some embodiments, the user of the window 320 may be able to access or open one or more annotations associated with a document by selecting or clicking on an annotation number associated with the document. Alternatively, clicking on a document name or annotation number, or clicking on a link or button 336 entitled "Annotation Tool" might result in display of an annotation tool window 340, as illustrated in FIG. 5.
Now referring to FIG. 5, the annotation tool window 340 may allow a user to access or create annotations associated with a document. The window 340 may be displayed on a client device used by a person associated with a collaborative effort. The window 340 may include a text block 342 in which the user may type the name of the document (e.g., "Preliminary Budget") of interest. Alternatively, clicking on or selecting the document in the window 320 may cause the document name to be displayed automatically in the text block 342. Once the document name is known, the window 340 may display the number of annotations associated with the document in text block 344 and the number of different annotators for the document in text block 346.
In some embodiments, the window 340 also may include one or more buttons 348, 350, 352, 354 that allow a user to initiate different functions or features. For example, selecting or clicking on the button 348 may allow the user to view information regarding annotations made to the document "Preliminary Budget". Selecting or clicking on the button 350 may allow the user to record or add an annotation that is associated with the document "Preliminary Budget".
Selecting or clicking on the button 352 may allow a user to listen to previously recorded annotations associated with the document. For example, selecting the button 352 may cause all of the previously recorded annotations for the document to be played in chronological order, in chronological order by individual, or in accordance with some other rule or procedure.
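A minimal sketch of these playback rules is shown below, assuming each annotation is represented as a (speaker, created, wav_path) tuple; this representation is a simplification for illustration only.

```python
def playback_sequence(annotations, by_individual=False):
    """Order annotations for playback: plain chronological order, or
    chronological order grouped by individual annotator."""
    if by_individual:
        # group each speaker's annotations together, oldest first
        return sorted(annotations, key=lambda a: (a[0], a[1]))
    return sorted(annotations, key=lambda a: a[1])  # oldest first overall
```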
Selecting or clicking on the button 354 may cause the text blocks 342, 344, 346 to be emptied and a prompt to be displayed asking the user to enter a new document name in the text block 342.
When a user clicks on or selects the button 350, a window 360 may be displayed that allows the user to record an annotation for the document "Preliminary Budget", as illustrated in FIG. 6. The window 360 may be displayed on a client device (e.g., the computer 122a) used by a person associated with a collaboration. The window 360 may include a text block 362 in which the user can indicate who is making the annotation. The text block 362 may include a pull down menu that the user can access via box 364. When used, the pull down menu may have a list of all people having the ability to add such an annotation, which may include the people indicated in the window portion 322. By selecting or clicking button 366, the user can begin recording the annotation. The window 360 may be associated with a particular microphone or other input device on a client device (e.g., computer, telephone) displaying the window 360 that the user can use to provide the audible annotation. The user can stop the recording by selecting or clicking button 368. Selecting or clicking button 370 will cause the just recorded annotation to be replayed, while selecting or clicking button 372 will cause the just recorded annotation to be canceled or deleted.
In some embodiments, an annotation for a document may be stored as a .WAV file that is associated with the document or stored as part of the document file. In some embodiments, the annotations for a document may be stored as files separate from the document. Each annotation may be stored or kept as a separate file, thereby allowing the annotations to be accessed and listened to separately and independently. An annotation file also may include information (e.g., icon, icon color, speaker name or other identifier, creation date/time, document identifier, client device identifier, etc.) that associates the annotation with a specific speaker, document, client device, etc. Playing of an annotation may involve retrieval of the appropriate annotation file and delivery of the annotation file to a client device, which can then play the annotation using software operating on the client device. Annotation files may be stored in the same locations as their associated documents, which may be managed or provided by the server 104, an application or module forming part of the server 104, or some other application, device, etc.
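The per-annotation bookkeeping described above might be represented as in the following sketch; the field names are assumptions for illustration, since no particular schema is prescribed.

```python
from dataclasses import dataclass, field
from datetime import datetime

@dataclass
class AnnotationRecord:
    """One annotation: a separate .WAV file plus the information that
    ties it to a speaker, document, and client device."""
    wav_path: str        # separate file, so annotations play independently
    document_id: str     # document the annotation is associated with
    speaker: str         # e.g., an email address as shown in FIG. 4
    icon: str            # distinct icon/color identifying the speaker
    device_id: str       # client device that recorded the annotation
    created: datetime = field(default_factory=datetime.now)
```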
Now referring to FIG. 7, a window 380 is illustrated that may allow a user to access annotations associated with a document 381 (e.g., the document entitled "Preliminary Budget"). For example, the document 381 and the window 380 may be displayed by a client device associated with the user if the user clicks on the document entitled "Preliminary Budget" in the window 320 or on the button 348 in the window 340.
In some embodiments, the document 381 may be displayed in or by a conventional word processor, spreadsheet, document manager, or other software application. The displayed document 381 may include icons that indicate annotations to the document 381, such as the icons 382, 384, 386, 388, 390, 392, 394. The different colors, shapes, fill patterns, flashing rates, sizes, etc. of the icons displayed in or with the document 381 may indicate the providers of the annotations associated with the icons. For example, as illustrated in the window 320 of FIG. 4, the annotation icons 382, 388, 392, 394 are associated with the user identified by email address "fred@largecompany.com". Thus, this user has recorded four annotations that are associated with the document "Preliminary Budget" in the window 380. The relative positions of the icons 382, 388, 392, 394 in the window 380 or the document 381 may indicate the general or specific subject matter of the annotations. Similarly, the annotation icon 384 is associated with the user identified by email address "dave@conglomerate.com" and the annotation icons 386, 390 are associated with the user identified by email address "susan@independentsmall.com". Clicking or selecting any of the icons displayed in the window 380 may cause the corresponding voice annotation to be played, delivered to a specific client device, etc. The annotation icons in the window 380 also may have associated with them information regarding the dates/times of the creation of their associated annotations. As a result, moving a cursor over an icon in the window 380 may allow display in the window 380 of the creation date/time of the annotation associated with the icon.
Now referring to FIG. 8, a window 400 is illustrated that may allow a user to access annotations associated with a document 401 (e.g., the document entitled "Preliminary Budget"). For example, the document 401 and the window 400 may be displayed by a client device associated with the user if the user clicks on the document entitled "Preliminary Budget" in the window 320 or on the button 348 in the window 340. In some embodiments, the document 401 may be displayed in or by a conventional word processor, spreadsheet, document manager, or other software application. The displayed document 401 may include icons that indicate annotations to the document exist, such as the icons 402 and 404. The different colors, shapes, fill patterns, flashing rates, sizes, etc. of the icons 402 and 404 displayed in or with the document 401 may indicate the providers of the annotations associated with the icons. For example, as illustrated in the window 400 of FIG. 8, the annotation icon 402 is associated with the user identified by email address "fred@largecompany.com". Similarly, the annotation icon 404 is associated with the user identified by email address "dave@conglomerate.com". Thus, the icons 402 and 404 indicate that specific users have provided annotations to the document 401. Arrows 406, 408 indicate that the user may view specific annotations for the respective annotators identified by "fred@largecompany.com" and "dave@conglomerate.com". For example, now referring to FIG. 9, a user of the window 400 selecting or clicking on the arrow 406 may cause drop down menu 410 to appear and selecting or clicking on arrow 408 may cause drop down menu 412 to appear. The drop down menu 410 indicates that the annotator identified by "fred@largecompany.com" has made or recorded four annotations associated with the document 401 and the drop down menu 412 indicates that the annotator identified by "dave@conglomerate.com" has made one annotation associated with the document 401. The user may listen to any of the four annotations indicated in the drop down menu 410 by selecting or clicking on the iteration numbers "1", "2", "3" or "4" in the drop down menu 410. Similarly, the user may listen to the annotation indicated in the drop down menu 412 by selecting or clicking on the iteration number "1" in the drop down menu 412. Clicking on or selecting any of the iteration numbers in the drop down menus 410 or 412 may cause the annotation associated with the iteration number to be played, delivered to a specific client device, etc.
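The per-annotator drop down menus of FIG. 9 could be driven by a grouping such as the one sketched below, which reuses the hypothetical AnnotationRecord above; an iteration number is simply the 1-based position of an annotation in its annotator's creation-ordered list.

```python
from collections import defaultdict

def annotations_by_annotator(annotations):
    """Group one document's annotations per annotator and number each
    annotator's annotations 1, 2, 3, ... in creation order."""
    grouped = defaultdict(list)
    for ann in sorted(annotations, key=lambda a: a.created):
        grouped[ann.speaker].append(ann)
    return {speaker: list(enumerate(anns, start=1))
            for speaker, anns in grouped.items()}
```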
Each time an annotation is created and associated with the document 401, the appropriate iteration number may be added to the appropriate drop down menu, or a drop down menu may be added as needed. Iteration numbers for annotations also may have associated with them information regarding the dates/times of the creation of the annotations. As a result, moving a cursor over an iteration number may allow display in the window 400 of the creation date/time of the annotation associated with the iteration number.
Now referring to FIG. 10, a window 420 is illustrated that may allow a user to access annotations associated with a document 421 (e.g., the document entitled "Preliminary Budget"). For example, the document 421 and the window 420 may be displayed by a client device associated with the user if the user clicks on the document entitled "Preliminary Budget" in the window 320 or on the button 348 in the window 340.
In some embodiments, the document 421 may be displayed in or by a conventional word processor, spreadsheet, document manager, or other software application. The displayed document 421 may include icons that indicate annotations to the document exist, such as the icons 424 and 426. The icon 424 is not associated with any particular annotator and may indicate only that annotations to the document 421 exist. Selecting or clicking on the icon 426 may cause a drop down menu 428 to be displayed, as indicated in FIG. 11.
The drop down menu 428 may include a number of icons, each associated with a particular speaker. The relative positions, colors, fill patterns, etc. of the icons in the drop down menu 428 may indicate the order in which annotations were added and by which annotator. Clicking on or selecting any of the icons in the drop down menu 428 may cause the annotation associated with the icon to be played, delivered to a specific client device, etc. The icons in the drop down menu 428 also may have associated with them information regarding the dates/times of the creation of the annotations. As a result, moving a cursor over an icon in the drop down menu 428 may allow display in the window 420 of the creation date/time of the annotation associated with the icon.
Process Description
Reference is now made to FIG. 12, where a flow chart 450 is shown which represents the operation of a first embodiment of a method. The particular arrangement of elements in the flow chart 450 is not meant to imply a fixed order to the elements; embodiments can be practiced in any order that is practicable. In some embodiments, some or all of the elements of the method 450 may be performed or completed by the server 104 or another device or application, as will be discussed in more detail below.
Processing begins at 452 during which the server 104 associates an audible annotation with a document. The document may be accessible to or otherwise associated with multiple people as part of a collaboration effort, conference, etc.
In some embodiments, the server 104 may receive the annotation from a client device (e.g., the computer 122a) as an annotation file recorded by the client device via use of an annotation tool (e.g., the annotation tool windows 340 and 360). In other embodiments, the server 104 may record the annotation directly or receive data indicative of the annotation from another source. The server 104 may store the annotation with, as part of, separate from, or in the same location as the document.
During 454, the server 104 associates a person with the audible annotation. In some embodiments, the person may be the person who recorded the annotation, the person associated with the client device from which the server 104 received the annotation, a person chosen from a list of people associated with the document or who have the right or ability to create an annotation for the document, or a person chosen via some other means. For example, in some embodiments the server 104 may have or have access to voice or speaker recognition applications or other technology through which the server 104 can identify the speaker of a voice annotation. In some embodiments, 454 may include requesting, establishing, or otherwise determining the identity of the person.
During 456, the server 104 establishes an identifier for the person. The identifier may be a specific icon, icon color, name, code, etc. In some embodiments, a document may be associated with multiple people (e.g., all of the people who have access to a particular document). Each of the people may be assigned a different or distinct identifier during an implementation of 456. In such embodiments, 456 may occur prior to 452 and/or 454.
During 458, the server 104 provides data indicative of the annotation. For example, the server 104 may provide the annotation upon receiving a request for the annotation or the document. As another example, the server 104 may provide or display an icon or other identifier in conjunction with the document to indicate that the annotation exists or is otherwise available. The icon or other identifier may be displayed in, on, or as part of the document itself, as part of a window or toolbar used with the document, or in some other fashion.
During 460, the server 104 provides data indicative of the identifier established during 456. For example, the server 104 may provide the identifier upon receiving a request for the annotation or the document. As another example, the server 104 may provide or display an icon or other identifier in conjunction with the document to indicate that the annotation exists or is otherwise available and/or to indicate that the annotation is associated with or related to the specific person. The icon or other identifier may be displayed in or as part of the document itself, as part of a window or toolbar used with the document, or in some other fashion. Data provided by the server 104 may be indicative of the icon associated with the specific person and/or the annotation.
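Taken together, 452 through 460 might be realized as in the following server-side sketch, a minimal in-memory illustration whose class and method names are assumptions rather than the patent's own:

```python
class AnnotationServer:
    """Hypothetical sketch of the FIG. 12 flow (452-460)."""

    def __init__(self):
        self.store = {}        # document_id -> list of annotation dicts
        self.identifiers = {}  # person -> icon or other identifier

    def associate_annotation(self, document_id, wav_path, person):
        # 452/454: tie the audible annotation to the document and person
        self.store.setdefault(document_id, []).append(
            {"wav": wav_path, "person": person})

    def establish_identifier(self, person, icon):
        # 456: give the person a distinct identifier (icon, color, etc.)
        self.identifiers[person] = icon

    def data_for_document(self, document_id):
        # 458/460: provide data indicative of the annotations and of
        # each annotator's identifier
        return [{"wav": a["wav"], "person": a["person"],
                 "icon": self.identifiers.get(a["person"])}
                for a in self.store.get(document_id, [])]
```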
Reference is now made to FIG. 13, where a flow chart 470 is shown which represents the operation of a second embodiment of a method. The particular arrangement of elements in the flow chart 470 is not meant to imply a fixed order to the elements; embodiments can be practiced in any order that is practicable. In some embodiments, some or all of the elements of the method 470 may be performed or completed by the server 104 or another device or application, as will be discussed in more detail below.
Processing begins at 472 during which the server 104 associates a plurality of people with a document. For example, the people may be participants in a conference or on-line or off-line collaborative effort, each having permission to access the document, make or provide annotations for the document, etc. The list of people or data indicative of the people may be received from an application, client device, or other device. As another example, the list of people may be established by a person setting up a conference, collaborative session or effort, etc.
During 474, the server 104 associates a different identifier to each of the plurality of people associated with the document. For example, the server 104 may associate different icons, icon shapes, icon colors, names, codes, etc. to each of the people. In some embodiments, 474 may be or include the server 104 receiving data indicative of the identifiers to be associated with the people from another application, device, etc.
During 476, the server 104 associates an audible annotation made by one of the plurality of people with the document. For example, in some embodiments, the server 104 may receive the annotation from a client device (e.g., the computer 122a) as an annotation file recorded by the client device via use of an annotation tool (e.g., the annotation tool windows 340 and 360). In other embodiments, the server 104 may record the annotation directly or receive data indicative of the annotation from another source. The server 104 may store the annotation with, as part of, separate from, or in the same location as the document.
During 478, the server 104 associates the identifier of the person who made the annotation with the document. For example, the server 104 may store the identifier with, as part of, separate from, or in the same location as the document and/or the annotation. The server 104 also may display the identifier in or with the document, as part of a window or toolbar used with the document, or in some other fashion.
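Reusing the hypothetical AnnotationServer sketched above for FIG. 12, the FIG. 13 flow might look as follows, with indexed icon names standing in for whatever distinct identifiers an implementation assigns:

```python
def setup_collaboration(server, document_id, people):
    # 472/474: associate the plurality of people with the document and
    # give each person a different identifier (here an indexed icon name)
    server.store.setdefault(document_id, [])  # register the document
    for i, person in enumerate(people):
        server.establish_identifier(person, f"icon-{i}")

def record_annotation(server, document_id, wav_path, person):
    # 476/478: associate the annotation, and thereby the person's
    # identifier, with the document
    server.associate_annotation(document_id, wav_path, person)
```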
In some embodiments, the server 104 may receive a request for a document and/or annotation, provide a document and one or more annotations, icons, identifiers, etc. associated with the document, etc.
Server
Now referring to FIG. 14, a representative block diagram of a server or controller 104 is illustrated. The server 104 can comprise a single device or computer, a networked set or group of devices or computers, a workstation, mainframe or host computer, etc., and, in some embodiments, may include some or all of the components described above in regards to FIG. 1. In some embodiments, the server 104 may be adapted to implement one or more elements of the methods disclosed herein.
The server 104 may include a processor, microchip, central processing unit, or computer 550 that is in communication with or otherwise uses or includes one or more communication ports 552 for communicating with user devices and/or other devices. The processor 550 may be or include some or all of the controller 101 previously discussed above. In some embodiments, the processor 550 may be operative to implement one or more of the elements of the methods disclosed herein.
Communication ports may include such things as local area network adapters, wireless communication devices, Bluetooth technology, etc. The server 104 also may include an internal clock element 554 to maintain an accurate time and date for the server 104, create time stamps for communications received or sent by the server 104, etc.
If desired, the server 104 may include one or more output devices 556 such as a printer, infrared or other transmitter, antenna, audio speaker, display screen or monitor (e.g., the monitor 400), text to speech converter, etc., as well as one or more input devices 558 such as a bar code reader or other optical scanner, infrared or other receiver, antenna, magnetic stripe reader, image scanner, roller ball, touch pad, joystick, touch screen, microphone, computer keyboard, computer mouse, etc.
In addition to the above, the server 104 may include a memory or data storage device 560 (which may be or include the memory 103 previously discussed above) to store information, software, databases, documents, communications, device drivers, etc. The memory or data storage device 560 preferably comprises an appropriate combination of magnetic, optical and/or semiconductor memory, and may include, for example, Read-Only Memory (ROM), Random Access Memory (RAM), a tape drive, flash memory, a floppy disk drive, a Zip™ disk drive, a compact disc and/or a hard disk. The server 104 also may include separate ROM 562 and RAM 564.
The processor 550 and the data storage device 560 in the server 104 each may be, for example: (i) located entirely within a single computer or other computing device; or (ii) connected to each other by a remote communication medium, such as a serial port cable, telephone line or radio frequency transceiver. In one embodiment, the server 104 may comprise one or more computers that are connected to a remote server computer for maintaining databases.
A conventional personal computer or workstation with sufficient memory and processing capability may be used as the server 104. In one embodiment, the server 104 operates as or includes a Web server for an Internet environment. The server 104 may be capable of high volume transaction processing, performing a significant number of mathematical calculations in processing communications and database searches. A Pentium™ microprocessor such as the Pentium III™ or IV™ microprocessor, manufactured by Intel Corporation, may be used for the processor 550. Equivalent processors are available from Motorola, Inc., AMD, or Sun Microsystems, Inc. The processor 550 also may comprise one or more microprocessors, computers, computer systems, etc.
Software may be resident and operating or operational on the server 104. The software may be stored on the data storage device 560 and may include a control program 566 for operating the server, databases, etc. The control program 566 may control the processor 550. The processor 550 preferably performs instructions of the control program 566, and thereby operates in accordance with the present invention, and particularly in accordance with the methods described in detail herein. The control program 566 may be stored in a compressed, uncompiled and/or encrypted format. The control program 566 furthermore includes program elements that may be necessary, such as an operating system, a database management system and device drivers for allowing the processor 550 to interface with peripheral devices, databases, etc. Appropriate program elements are known to those skilled in the art, and need not be described in detail herein.
The server 104 also may include or store information regarding users, user or client devices, conferences, collaborations, annotations, documents, communications, etc. For example, information regarding one or more collaborations may be stored in a conference information database 568 for use by the server 104 or another device or entity. Information regarding one or more users (e.g., participants in a collaboration effort) may be stored in a user information database 570 for use by the server 104 or another device or entity and information regarding one or more annotations may be stored in an annotation information database 572 for use by the server 104 or another device or entity. In some embodiments, some or all of one or more of the databases may be stored or mirrored remotely from the server 104.
In some embodiments, the instructions of the control program may be read into a main memory from another computer-readable medium, such as from the ROM 562 to the RAM 564. Execution of sequences of the instructions in the control program causes the processor 550 to perform the process elements described herein. In alternative embodiments, hard-wired circuitry may be used in place of, or in combination with, software instructions for implementation of some or all of the methods described herein. Thus, embodiments are not limited to any specific combination of hardware and software.
The processor 550, communication port 552, clock 554, output device 556, input device 558, data storage device 560, ROM 562, and RAM 564 may communicate or be connected directly or indirectly in a variety of ways. For example, the processor 550, communication port 552, clock 554, output device 556, input device 558, data storage device 560, ROM 562, and RAM 564 may be connected via a bus 574.
While specific implementations and hardware configurations for the server 104 have been illustrated, it should be noted that other implementations and hardware configurations are possible and that no specific implementation or hardware configuration is needed. Thus, not all of the components illustrated in FIG. 14 may be needed for the server 104 implementing the methods disclosed herein.
The methods described herein may be embodied as a computer program developed using an object oriented language that allows the modeling of complex systems with modular objects to create abstractions that are representative of real world, physical objects and their interrelationships. However, it would be understood by one of ordinary skill in the art that the invention as described herein could be implemented in many different ways using a wide range of programming techniques as well as general-purpose hardware systems or dedicated controllers. In addition, many, if not all, of the elements for the methods described above are optional or can be combined or performed in one or more alternative orders or sequences without departing from the scope of the present invention and the claims should not be construed as being limited to any particular order or sequence, unless specifically indicated.
Each of the methods described above can be performed on a single computer, computer system, microprocessor, etc. In addition, two or more of the elements in each of the methods described above could be performed on two or more different computers, computer systems, microprocessors, etc., some or all of which may be locally or remotely configured. The methods can be implemented in any sort or implementation of computer software, program, sets of instructions, code, ASIC, or specially designed chips, logic gates, or other hardware structured to directly effect or implement such software, programs, sets of instructions or code. The computer software, program, sets of instructions or code can be storable, writeable, or savable on any computer usable or readable media or other program storage device or media such as a floppy or other magnetic or optical disk, magnetic or optical tape, CD-ROM, DVD, punch cards, paper tape, hard disk drive, Zip™ disk, flash or optical memory card, microprocessor, solid state memory device, RAM, EPROM, or ROM.
Although the present invention has been described with respect to various embodiments thereof, those skilled in the art will note that various substitutions may be made to those embodiments described herein without departing from the spirit and scope of the present invention. The invention described in the above detailed description is not intended to be limited to the specific form set forth herein, but is intended to cover such alternatives, modifications and equivalents as can reasonably be included within the spirit and scope of the appended claims.
The words "comprise," "comprises," "comprising," "include," "including," and "includes" when used in this specification and in the following claims are intended to specify the presence of stated features, elements, integers, components, or steps, but they do not preclude the presence or addition of one or more other features, elements, integers, components, steps, or groups thereof.

Claims

WHAT IS CLAIMED IS:
1. A method for facilitating granting of a permission regarding a stored element (110, 116), comprising: receiving data indicative of an entity; receiving data indicative of a stored element (110, 116); receiving data indicative of a permission to be provided to said entity regarding said stored element (110, 116); and providing said permission to said entity regarding said stored element (110, 116).
2. The method of claim 1, wherein said receiving data indicative of an entity includes receiving an indication of a selection of said entity by a user.
3. The method of claim 2, wherein said receiving data indicative of a stored element (110, 116) includes receiving an indication of a selection of said stored element (110, 116) by said user.
4. The method of claim 3, wherein said receiving data indicative of a permission to be provided to said entity regarding said stored element (110, 116) includes receiving data indicative of a selection of said permission by said user.
5. The method of claim 1, further comprising: displaying a list of entities, said list including said entity.
6. The method of claim 5, wherein said receiving data indicative of said entity includes receiving an indication of a selection of an entity from said list.
7. The method of claim 1, further comprising: determining that said entity is not a valid place to store said stored element (110, 116).
8. The method of claim 1, further comprising: providing data to said entity indicative of said permission.
9. The method of claim 8, further comprising: providing data to said entity indicative of said stored element (110, 116).
10. A system for facilitating granting of a permission to an entity regarding a stored element (110, 116), comprising: a memory (260); a communication port (252); and a processor (250) connected to said memory (260) and said communication port (252), said processor (250) being operative to: receive data indicative of an entity; receive data indicative of a stored element (110, 116); receive data indicative of a permission to be provided to said entity regarding said stored element (110, 116); and provide said permission to said entity regarding said stored element (110, 116).
Families Citing this family (136)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8352400B2 (en) 1991-12-23 2013-01-08 Hoffberg Steven M Adaptive pattern recognition based controller apparatus and method and human-factored interface therefore
US7966078B2 (en) 1999-02-01 2011-06-21 Steven Hoffberg Network media appliance system and method
US20030050927A1 (en) * 2001-09-07 2003-03-13 Araha, Inc. System and method for location, understanding and assimilation of digital documents through abstract indicia
US20050018828A1 (en) * 2003-07-25 2005-01-27 Siemens Information And Communication Networks, Inc. System and method for indicating a speaker during a conference
CN100555264C (en) * 2003-10-21 2009-10-28 国际商业机器公司 The annotate method of electronic document, device and system
US20050125717A1 (en) * 2003-10-29 2005-06-09 Tsakhi Segal System and method for off-line synchronized capturing and reviewing notes and presentations
US20050114357A1 (en) * 2003-11-20 2005-05-26 Rathinavelu Chengalvarayan Collaborative media indexing system and method
US7111230B2 (en) * 2003-12-22 2006-09-19 Pitney Bowes Inc. System and method for annotating documents
US8442331B2 (en) 2004-02-15 2013-05-14 Google Inc. Capturing text from rendered documents using supplemental information
US7707039B2 (en) 2004-02-15 2010-04-27 Exbiblio B.V. Automatic modification of web pages
US20060041484A1 (en) 2004-04-01 2006-02-23 King Martin T Methods and systems for initiating application processes by data capture from rendered documents
US10635723B2 (en) 2004-02-15 2020-04-28 Google Llc Search engines and systems with handheld document data capture devices
US7812860B2 (en) 2004-04-01 2010-10-12 Exbiblio B.V. Handheld device for capturing text from both a document printed on paper and a document displayed on a dynamic display device
DE602005013938D1 (en) * 2004-03-29 2009-05-28 Philips Intellectual Property METHOD FOR CONTROLLING MULTIPLE APPLICATIONS AND DIALOG MANAGEMENT SYSTEM
US9116890B2 (en) 2004-04-01 2015-08-25 Google Inc. Triggering actions in response to optically or acoustically capturing keywords from a rendered document
US7990556B2 (en) 2004-12-03 2011-08-02 Google Inc. Association of a portable scanner with input/output and storage devices
US7894670B2 (en) 2004-04-01 2011-02-22 Exbiblio B.V. Triggering actions in response to optically or acoustically capturing keywords from a rendered document
US8146156B2 (en) 2004-04-01 2012-03-27 Google Inc. Archive of text captures from rendered documents
US20070300142A1 (en) 2005-04-01 2007-12-27 King Martin T Contextual dynamic advertising based upon captured rendered text
US20060081714A1 (en) 2004-08-23 2006-04-20 King Martin T Portable scanning device
US9143638B2 (en) 2004-04-01 2015-09-22 Google Inc. Data capture from rendered documents using handheld device
US20080313172A1 (en) 2004-12-03 2008-12-18 King Martin T Determining actions involving captured information and electronic content associated with rendered documents
US8793162B2 (en) 2004-04-01 2014-07-29 Google Inc. Adding information or functionality to a rendered document via association with an electronic counterpart
US8621349B2 (en) 2004-04-01 2013-12-31 Google Inc. Publishing techniques for adding value to a rendered document
US20060098900A1 (en) 2004-09-27 2006-05-11 King Martin T Secure data gathering from rendered documents
US8713418B2 (en) 2004-04-12 2014-04-29 Google Inc. Adding value to a rendered document
US8788592B1 (en) * 2004-04-15 2014-07-22 Oracle America, Inc. System and method for customizable e-mail message notes
US9460346B2 (en) 2004-04-19 2016-10-04 Google Inc. Handheld device for capturing text from both a document printed on paper and a document displayed on a dynamic display device
US8874504B2 (en) 2004-12-03 2014-10-28 Google Inc. Processing techniques for visual capture data from a rendered document
US8620083B2 (en) 2004-12-03 2013-12-31 Google Inc. Method and system for character recognition
US8489624B2 (en) 2004-05-17 2013-07-16 Google, Inc. Processing techniques for text capture from a rendered document
US7624188B2 (en) * 2004-05-03 2009-11-24 Nokia Corporation Apparatus and method to provide conference data sharing between user agent conference participants
US20060031234A1 (en) * 2004-05-21 2006-02-09 Brodi Beartusk Systems and methods for a collaborative group chat
US20050262075A1 (en) 2004-05-21 2005-11-24 Bea Systems, Inc. Systems and methods for collaboration shared state management
US8346620B2 (en) 2004-07-19 2013-01-01 Google Inc. Automatic modification of web pages
US20070118794A1 (en) * 2004-09-08 2007-05-24 Josef Hollander Shared annotation system and method
US20060075449A1 (en) * 2004-09-24 2006-04-06 Cisco Technology, Inc. Distributed architecture for digital program insertion in video streams delivered over packet networks
US9021456B2 (en) * 2004-12-22 2015-04-28 International Business Machines Corporation Using collaborative annotations to specify real-time process flows and system constraints
EP1675373A1 (en) * 2004-12-23 2006-06-28 Alcatel Conferencing system for exchanging information between subscriber devices via a network unit using different modes
JP5053550B2 (en) * 2005-02-01 2012-10-17 キヤノン株式会社 Document processing apparatus and method and document processing system
US20060212509A1 (en) * 2005-03-21 2006-09-21 International Business Machines Corporation Profile driven method for enabling annotation of World Wide Web resources
US20060218004A1 (en) * 2005-03-23 2006-09-28 Dworkin Ross E On-line slide kit creation and collaboration system
US7734631B2 (en) * 2005-04-25 2010-06-08 Microsoft Corporation Associating information with an electronic document
US8010894B2 (en) * 2005-05-18 2011-08-30 Microsoft Corporation Memory optimizing for re-ordering user edits
US7610545B2 (en) * 2005-06-06 2009-10-27 Bea Systems, Inc. Annotations for tracking provenance
US20070005697A1 (en) * 2005-06-29 2007-01-04 Eric Yuan Methods and apparatuses for detecting content corresponding to a collaboration session
US20070005699A1 (en) * 2005-06-29 2007-01-04 Eric Yuan Methods and apparatuses for recording a collaboration session
US7945621B2 (en) 2005-06-29 2011-05-17 Webex Communications, Inc. Methods and apparatuses for recording and viewing a collaboration session
US7680047B2 (en) * 2005-11-22 2010-03-16 Cisco Technology, Inc. Maximum transmission unit tuning mechanism for a real-time transport protocol stream
US20070124507A1 (en) * 2005-11-28 2007-05-31 Sap Ag Systems and methods of processing annotations and multimodal user inputs
US7877486B2 (en) * 2005-12-08 2011-01-25 International Business Machines Corporation Auto-establishment of a voice channel of access to a session for a composite service from a visual channel of access to the session for the composite service
US7890635B2 (en) * 2005-12-08 2011-02-15 International Business Machines Corporation Selective view synchronization for composite services delivery
US20070136793A1 (en) * 2005-12-08 2007-06-14 International Business Machines Corporation Secure access to a common session in a composite services delivery environment
US7792971B2 (en) 2005-12-08 2010-09-07 International Business Machines Corporation Visual channel refresh rate control for composite services delivery
US10332071B2 (en) 2005-12-08 2019-06-25 International Business Machines Corporation Solution for adding context to a text exchange modality during interactions with a composite services application
US20070133512A1 (en) * 2005-12-08 2007-06-14 International Business Machines Corporation Composite services enablement of visual navigation into a call center
US7809838B2 (en) * 2005-12-08 2010-10-05 International Business Machines Corporation Managing concurrent data updates in a composite services delivery system
US8005934B2 (en) * 2005-12-08 2011-08-23 International Business Machines Corporation Channel presence in a composite services enablement environment
US8189563B2 (en) * 2005-12-08 2012-05-29 International Business Machines Corporation View coordination for callers in a composite services enablement environment
US20070136449A1 (en) * 2005-12-08 2007-06-14 International Business Machines Corporation Update notification for peer views in a composite services delivery environment
US7818432B2 (en) * 2005-12-08 2010-10-19 International Business Machines Corporation Seamless reflection of model updates in a visual page for a visual channel in a composite services delivery system
US20070133511A1 (en) * 2005-12-08 2007-06-14 International Business Machines Corporation Composite services delivery utilizing lightweight messaging
US20070147355A1 (en) * 2005-12-08 2007-06-28 International Business Machines Corporation Composite services generation tool
US20070136421A1 (en) * 2005-12-08 2007-06-14 International Business Machines Corporation Synchronized view state for composite services delivery
US7827288B2 (en) * 2005-12-08 2010-11-02 International Business Machines Corporation Model autocompletion for composite services synchronization
US20070133769A1 (en) * 2005-12-08 2007-06-14 International Business Machines Corporation Voice navigation of a visual view for a session in a composite services enablement environment
US20070132834A1 (en) * 2005-12-08 2007-06-14 International Business Machines Corporation Speech disambiguation in a composite services enablement environment
US20070133773A1 (en) * 2005-12-08 2007-06-14 International Business Machines Corporation Composite services delivery
US8259923B2 (en) 2007-02-28 2012-09-04 International Business Machines Corporation Implementing a contact center using open standards and non-proprietary components
US11093898B2 (en) 2005-12-08 2021-08-17 International Business Machines Corporation Solution for adding context to a text exchange modality during interactions with a composite services application
US20070133509A1 (en) * 2005-12-08 2007-06-14 International Business Machines Corporation Initiating voice access to a session from a visual access channel to the session in a composite services delivery system
US20080027782A1 (en) * 2006-04-07 2008-01-31 Juliana Freire Managing provenance of the evolutionary development of workflows
US20080040181A1 (en) * 2006-04-07 2008-02-14 The University Of Utah Research Foundation Managing provenance for an evolutionary workflow process in a collaborative environment
US8060391B2 (en) * 2006-04-07 2011-11-15 The University Of Utah Research Foundation Analogy based workflow identification
US20070266092A1 (en) * 2006-05-10 2007-11-15 Schweitzer Edmund O. III Conferencing system with automatic identification of speaker
US8326927B2 (en) * 2006-05-23 2012-12-04 Cisco Technology, Inc. Method and apparatus for inviting non-rich media endpoints to join a conference sidebar session
JP4946189B2 (en) * 2006-06-13 2012-06-06 Fuji Xerox Co., Ltd. Annotation information distribution program and annotation information distribution apparatus
US7934160B2 (en) * 2006-07-31 2011-04-26 Litrell Bros. Limited Liability Company Slide kit creation and collaboration system with multimedia interface
US8358763B2 (en) * 2006-08-21 2013-01-22 Cisco Technology, Inc. Camping on a conference or telephony port
EP2067119A2 (en) 2006-09-08 2009-06-10 Exbiblio B.V. Optical scanners, such as hand-held optical scanners
WO2008031625A2 (en) 2006-09-15 2008-03-20 Exbiblio B.V. Capture and display of annotations in paper and electronic documents
US7847815B2 (en) * 2006-10-11 2010-12-07 Cisco Technology, Inc. Interaction based on facial recognition of conference participants
US8121277B2 (en) * 2006-12-12 2012-02-21 Cisco Technology, Inc. Catch-up playback in a conferencing system
US8594305B2 (en) * 2006-12-22 2013-11-26 International Business Machines Corporation Enhancing contact centers with dialog contracts
US9247056B2 (en) 2007-02-28 2016-01-26 International Business Machines Corporation Identifying contact center agents based upon biometric characteristics of an agent's speech
US9055150B2 (en) 2007-02-28 2015-06-09 International Business Machines Corporation Skills based routing in a standards based contact center using a presence server and expertise specific watchers
US8924844B2 (en) * 2007-03-13 2014-12-30 Visual Cues LLC Object annotation
US20080227076A1 (en) * 2007-03-13 2008-09-18 Byron Johnson Progress monitor and method of doing the same
US20080235597A1 (en) * 2007-03-19 2008-09-25 Mor Schlesinger Systems and methods of data integration for creating custom books
US7937663B2 (en) 2007-06-29 2011-05-03 Microsoft Corporation Integrated collaborative user interface for a document editor program
US7941399B2 (en) 2007-11-09 2011-05-10 Microsoft Corporation Collaborative authoring
US8825758B2 (en) 2007-12-14 2014-09-02 Microsoft Corporation Collaborative authoring modes
US9965638B2 (en) * 2008-01-28 2018-05-08 Adobe Systems Incorporated Rights application within document-based conferencing
US8184141B2 (en) * 2008-02-04 2012-05-22 Siemens Enterprise Communications, Inc. Method and apparatus for face recognition enhanced video mixing
US20090217196A1 (en) * 2008-02-21 2009-08-27 Globalenglish Corporation Web-Based Tool for Collaborative, Social Learning
US8612469B2 (en) * 2008-02-21 2013-12-17 Globalenglish Corporation Network-accessible collaborative annotation tool
US20090222742A1 (en) * 2008-03-03 2009-09-03 Cisco Technology, Inc. Context sensitive collaboration environment
WO2009120921A1 (en) * 2008-03-27 2009-10-01 Knowledge Athletes, Inc. Virtual learning
US8531447B2 (en) 2008-04-03 2013-09-10 Cisco Technology, Inc. Reactive virtual environment
US8352870B2 (en) 2008-04-28 2013-01-08 Microsoft Corporation Conflict resolution
US8095595B2 (en) * 2008-04-30 2012-01-10 Cisco Technology, Inc. Summarization of immersive collaboration environment
US8825594B2 (en) 2008-05-08 2014-09-02 Microsoft Corporation Caching infrastructure
US8190633B2 (en) * 2008-06-16 2012-05-29 The University Of Utah Research Foundation Enabling provenance management for pre-existing applications
US10127231B2 (en) * 2008-07-22 2018-11-13 At&T Intellectual Property I, L.P. System and method for rich media annotation
US20100070881A1 (en) * 2008-09-12 2010-03-18 At&T Intellectual Property I, L.P. Project facilitation and collaboration application
US8751559B2 (en) * 2008-09-16 2014-06-10 Microsoft Corporation Balanced routing of questions to experts
US8892630B1 (en) 2008-09-29 2014-11-18 Amazon Technologies, Inc. Facilitating discussion group formation and interaction
US8924863B2 (en) * 2008-09-30 2014-12-30 Lenovo (Singapore) Pte. Ltd. Collaborative web navigation using document object model (DOM) based document references
US8706685B1 (en) 2008-10-29 2014-04-22 Amazon Technologies, Inc. Organizing collaborative annotations
US9083600B1 (en) * 2008-10-29 2015-07-14 Amazon Technologies, Inc. Providing presence information within digital items
JP5369702B2 (en) * 2009-01-23 2013-12-18 Seiko Epson Corp. Shared information display device, shared information display method, and computer program
DE202010018601U1 (en) 2009-02-18 2018-04-30 Google LLC (organized under the laws of the State of Delaware) Automatically collecting information, such as gathering information using a document recognizing device
US9195739B2 (en) * 2009-02-20 2015-11-24 Microsoft Technology Licensing, Llc Identifying a discussion topic based on user interest information
WO2010105245A2 (en) 2009-03-12 2010-09-16 Exbiblio B.V. Automatically providing content associated with captured information, such as information captured in real-time
US8447066B2 (en) 2009-03-12 2013-05-21 Google Inc. Performing actions based on capturing information from rendered documents, such as documents under copyright
US20100325557A1 (en) * 2009-06-17 2010-12-23 Agostino Sibillo Annotation of aggregated content, systems and methods
US9081799B2 (en) 2009-12-04 2015-07-14 Google Inc. Using gestalt information to identify locations in printed information
US9323784B2 (en) 2009-12-09 2016-04-26 Google Inc. Image search using text-based elements within the contents of images
US9118612B2 (en) 2010-12-15 2015-08-25 Microsoft Technology Licensing, Llc Meeting-specific state indicators
US9383888B2 (en) * 2010-12-15 2016-07-05 Microsoft Technology Licensing, Llc Optimized joint document review
US9864612B2 (en) 2010-12-23 2018-01-09 Microsoft Technology Licensing, Llc Techniques to customize a user interface for different displays
US9251130B1 (en) 2011-03-31 2016-02-02 Amazon Technologies, Inc. Tagging annotations of electronic books
US8682973B2 (en) 2011-10-05 2014-03-25 Microsoft Corporation Multi-user and multi-device collaboration
US9043939B2 (en) * 2012-10-26 2015-05-26 International Business Machines Corporation Accessing information during a teleconferencing event
US9483753B2 (en) * 2013-01-05 2016-11-01 Hewlett-Packard Development Company, L.P. Integrating document related communication with a document
US8892679B1 (en) 2013-09-13 2014-11-18 Box, Inc. Mobile device, methods and user interfaces thereof in a mobile device platform featuring multifunctional access and engagement in a collaborative environment provided by a cloud-based platform
US9704137B2 (en) * 2013-09-13 2017-07-11 Box, Inc. Simultaneous editing/accessing of content by collaborator invitation through a web-based or mobile application to a cloud-based collaboration platform
US10866931B2 (en) 2013-10-22 2020-12-15 Box, Inc. Desktop application for accessing a cloud collaboration platform
US9753921B1 (en) 2015-03-05 2017-09-05 Dropbox, Inc. Comment management in shared documents
US10366490B2 (en) * 2017-03-27 2019-07-30 Siemens Healthcare Gmbh Highly integrated annotation and segmentation system for medical imaging
US10657954B2 (en) * 2018-02-20 2020-05-19 Dropbox, Inc. Meeting audio capture and transcription in a collaborative document context
US11488602B2 (en) 2018-02-20 2022-11-01 Dropbox, Inc. Meeting transcription using custom lexicons based on document history
US10467335B2 (en) 2018-02-20 2019-11-05 Dropbox, Inc. Automated outline generation of captured meeting audio in a collaborative document context
CN111144074 (en) * 2018-11-05 2022-04-22 Tencent Technology (Shenzhen) Co., Ltd. Document collaboration method and apparatus, computer-readable storage medium, and computer device
US11689379B2 (en) 2019-06-24 2023-06-27 Dropbox, Inc. Generating customized meeting insights based on user interactions and meeting media
US11074400B2 (en) * 2019-09-30 2021-07-27 Dropbox, Inc. Collaborative in-line content item annotations

Family Cites Families (21)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5671428A (en) * 1991-08-28 1997-09-23 Kabushiki Kaisha Toshiba Collaborative document processing system with version and comment management
IT1256823B (en) * 1992-05-14 1995-12-21 Olivetti & Co. SpA Portable calculator with verbal notes
US5649104A (en) * 1993-03-19 1997-07-15 NCR Corporation System for allowing user of any computer to draw image over that generated by the host computer and replicating the drawn image to other computers
US5502727A (en) * 1993-04-20 1996-03-26 AT&T Corp. Image and audio communication system having graphical annotation capability
US5537526A (en) * 1993-11-12 1996-07-16 Taligent, Inc. Method and apparatus for processing a display document utilizing a system level document framework
US5623681A (en) * 1993-11-19 1997-04-22 Waverley Holdings, Inc. Method and apparatus for synchronizing, displaying and manipulating text and image documents
US5768607A (en) * 1994-09-30 1998-06-16 Intel Corporation Method and apparatus for freehand annotation and drawings incorporating sound and for compressing and synchronizing sound
US5826025A (en) * 1995-09-08 1998-10-20 Sun Microsystems, Inc. System for annotation overlay proxy configured to retrieve associated overlays associated with a document request from annotation directory created from list of overlay groups
US5838313A (en) * 1995-11-20 1998-11-17 Siemens Corporate Research, Inc. Multimedia-based reporting system with recording and playback of dynamic annotation
US6041335A (en) * 1997-02-10 2000-03-21 Merritt; Charles R. Method of annotating a primary image with an image and for transmitting the annotated primary image
US6279014B1 (en) * 1997-09-15 2001-08-21 Xerox Corporation Method and system for organizing documents based upon annotations in context
US6546405B2 (en) * 1997-10-23 2003-04-08 Microsoft Corporation Annotating temporally-dimensioned multimedia content
US6119147A (en) * 1998-07-28 2000-09-12 Fuji Xerox Co., Ltd. Method and system for computer-mediated, multi-modal, asynchronous meetings in a virtual space
US6230171B1 (en) * 1998-08-29 2001-05-08 International Business Machines Corporation Markup system for shared HTML documents
US6687878B1 (en) * 1999-03-15 2004-02-03 Real Time Image Ltd. Synchronizing/updating local client notes with annotations previously made by other clients in a notes database
US6859909B1 (en) * 2000-03-07 2005-02-22 Microsoft Corporation System and method for annotating web-based documents
US20020116420A1 (en) * 2000-09-28 2002-08-22 Allam Scott Gerald Method and apparatus for displaying and viewing electronic information
US20020099552A1 (en) * 2001-01-25 2002-07-25 Darryl Rubin Annotating electronic information with audio clips
US7103848B2 (en) * 2001-09-13 2006-09-05 International Business Machines Corporation Handheld electronic book reader with annotation and usage tracking capabilities
US20040034832A1 (en) * 2001-10-19 2004-02-19 Xerox Corporation Method and apparatus for forward annotating documents
US20040267693A1 (en) * 2003-06-30 2004-12-30 Darryn Lowe Method and system for evaluating the suitability of metadata

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
No Search *

Also Published As

Publication number Publication date
WO2004109428A8 (en) 2005-03-17
US20040250201A1 (en) 2004-12-09
US7257769B2 (en) 2007-08-14

Similar Documents

Publication Publication Date Title
US7257769B2 (en) System and method for indicating an annotation for a document
US6914519B2 (en) System and method for muting alarms during a conference
EP2064857B1 (en) Apparatus and method for automatic conference initiation
US20050018828A1 (en) System and method for indicating a speaker during a conference
US7184531B2 (en) System and method for authorizing a party to join a conference
US7756923B2 (en) System and method for intelligent multimedia conference collaboration summarization
US7917582B2 (en) Method and apparatus for autocorrelation of instant messages
US7450696B2 (en) Knowledge management, capture and modeling tool for multi-modal communications
US7545758B2 (en) System and method for collaboration summarization playback
US7813488B2 (en) System and method for providing information regarding an identity's media availability
US8321794B2 (en) Rich conference invitations with context
US7130403B2 (en) System and method for enhanced multimedia conference collaboration
JP5615922B2 (en) Mashups and presence found on the phone
Yankelovich et al. Meeting central: making distributed meetings more effective
EP1661024B1 (en) Method and system for providing conferencing services
US8707186B2 (en) Conference recap and recording
US7248684B2 (en) System and method for processing conference collaboration records
US20060234735A1 (en) Presence-enabled mobile access
CN101455033A (en) User presence aggregation at a server
US20050071271A1 (en) System and method for providing information regarding an identity's true availability
EP1429528B1 (en) System and method for collaboration summarization playback
JP2003296257A (en) Network conference system

Legal Events

Date Code Title Description
AK Designated states

Kind code of ref document: A2

Designated state(s): AE AG AL AM AT AU AZ BA BB BG BR BW BY BZ CA CH CN CO CR CU CZ DE DK DM DZ EC EE EG ES FI GB GD GE GH GM HR HU ID IL IN IS JP KE KG KP KR KZ LC LK LR LS LT LU LV MA MD MG MK MN MW MX MZ NA NI NO NZ OM PG PH PL PT RO RU SC SD SE SG SK SL SY TJ TM TN TR TT TZ UA UG US UZ VC VN YU ZA ZM ZW

AL Designated countries for regional patents

Kind code of ref document: A2

Designated state(s): BW GH GM KE LS MW MZ SD SL SZ TZ UG ZM ZW AM AZ BY KG KZ MD RU TJ TM AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HU IE IT LU MC NL PL PT RO SE SI SK TR BF BJ CF CG CI CM GA GN GQ GW ML MR NE SN TD TG

121 EP: The EPO has been informed by WIPO that EP was designated in this application
D17 Declaration under Article 17(2)(a)
122 EP: PCT application non-entry in European phase