WO1992021095A1 - Animation - Google Patents


Info

Publication number
WO1992021095A1
Authority
WO
WIPO (PCT)
Prior art keywords
subset
line
lines
data
curve
Application number
PCT/GB1992/000927
Other languages
French (fr)
Inventor
Andrew Louis Charles Berend
Mark Jonathan Williams
Michael John Brocklehurst
Gavin Timothy Jones
Stuart Philip Hawkins
Original Assignee
Cambridge Animation Systems Limited
Priority claimed from GB909026120A (GB9026120D0)
Priority claimed from GB919100632A (GB9100632D0)
Priority claimed from GB919102125A (GB9102125D0)
Priority claimed from GB9110945A (GB2256118A)
Priority claimed from GB9117409A (GB2258790A)
Application filed by Cambridge Animation Systems Limited
Priority to JP4510508A (JPH06507742A)
Publication of WO1992021095A1


Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 13/00 Animation
    • G06T 13/80 2D [Two Dimensional] animation, e.g. using sprites
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 11/00 2D [Two Dimensional] image generation
    • G06T 11/001 Texturing; Colouring; Generation of texture or colour

Definitions

  • This invention relates to apparatus for, and a method of, producing a sequence of images defining an animated sequence, such as a cartoon.
  • cartoons are manually drawn as a sequence of frames which, when played in succession at relatively high speed, form a moving picture (typical frame rates are 24, 25 or 30 frames per second, although sometimes each frame is repeated twice). Even for a short sequence, many thousands of frames thus need to be drawn by hand, and production of the hand-drawn frames requires large teams of skilled animators and assistants. Almost all cartoon animation today is still produced in this way.
  • the key frames produced by the senior animator are scanned in some manner by input means into an image processor such as a programmed computer, where an internal representation of each is stored.
  • Corresponding points, lines or other parts of two key frames are identified manually, and in-between images are generated by producing a sequence of frames in each of which a similar set of points, lines or parts is generated by interpolation between those of two adjacent stored key frames. The remainder of the frame between the identified points or parts is then generated.
  • In the above-described steps of conventional cartoon animation production, the senior animator often draws only a rough guide to the shape of the character for each key frame.
  • the rough shape includes only essential outlines, and may be drawn in blue pencil. By comparing a succession of such blue pencil drawings, the animator can construct the overall motion of the character.
  • an assistant adds the detail features (for example facial features) to each frame (key frames and in between frames) and produces a black line image including all the detail required.
  • the image including the blue pencil lines and the fully dark lines is then copied using a copier which does not reproduce the blue lines.
  • a host of lines for each character may be shown in each key frame, on the editing display used by the animator to create the key frames, and this large number of lines needs to be edited each time a new key frame is produced.
  • This can be inconvenient, and wasteful of the valuable time of the animator.
  • we provide an animation system in which the frames to be edited are displayed as line images for the animator to edit, in which each line image comprises a first subset of lines which are displayed in a first manner and are manipulable for editing, and a second subset of lines which are displayed in a different manner or are not displayed, in which the animation system changes the positions of the second subset of lines in accordance with changes to the positions of the first subset.
  • the second subset of lines may be selectively not displayed, to reduce the complexity of the image for editing.
  • each line of the second subset is linked to at least one line of the first subset so that movements of that line of the first subset lead to specified movements of the corresponding linked line of the second subset.
  • the first subset of lines can be viewed as corresponding to the "blue pencil image" produced by animators, and the second subset as corresponding to the detail added by the assistant or "clean-up artist".
  • the invention reduces the work involved in animation by moving the detail lines with the "blue pencil" lines, so that the amount of editing necessary to add the detail lines to each keyframe is greatly reduced.
  • the animator can therefore edit one keyframe to create a new keyframe, by merely changing the curves of the first subset.
  • all keyframes are derived from a common template frame, which includes all lines which will be used in the keyframe, and the animator can copy the template and edit the lines of the first subset to create a new keyframe.
  • the present invention provides apparatus for generating an animated sequence of pictures, which comprises means for storing data defining a plurality of pictures as, for each picture, data defining a plurality of lines, the apparatus storing first line data defining the positions of a first subset of said lines within the picture, second line data defining the positions of a second subset of lines in the picture, and link data indicating a correspondence between a second line and at least one first line, the apparatus further comprising means for amending the first line data, and for generating the second line data in dependence upon the amendments to the first line data and on the link data.
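A minimal sketch of this link-data idea, assuming a simple per-point offset as the "specified movement"; the class and field names, and the offset scheme itself, are illustrative rather than taken from the patent:

```python
# Illustrative sketch: a secondary ("black pencil") curve stores a link to a
# primary ("blue pencil") curve plus a per-point offset; editing the primary
# curve repositions the linked secondary points automatically.

from dataclasses import dataclass, field

@dataclass
class Curve:
    points: list                     # [(x, y), ...] control point positions

@dataclass
class LinkedCurve(Curve):
    parent: Curve = None             # primary curve this line is attached to
    offsets: list = field(default_factory=list)   # per-point (dx, dy) offsets

    def follow_parent(self):
        """Regenerate this curve's points from the parent plus stored offsets."""
        self.points = [(px + dx, py + dy)
                       for (px, py), (dx, dy) in zip(self.parent.points,
                                                     self.offsets)]

primary = Curve(points=[(0.0, 0.0), (10.0, 0.0)])
detail = LinkedCurve(points=[], parent=primary, offsets=[(1.0, 2.0), (1.0, 2.0)])
primary.points[0] = (5.0, 5.0)   # the animator edits the "blue pencil" line
detail.follow_parent()           # the "black pencil" line moves with it
```

Here editing the primary curve and calling follow_parent() plays the role of the system repositioning the second subset of lines in accordance with changes to the first.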
  • Other preferred aspects and embodiments of the invention are as described or claimed hereafter, with advantages that will be apparent from the following: singly or in combination, these embodiments enable the provision of an automatic animation system for character animation which is fast and straightforward to use.
  • Figs. 1a-e illustrate curve approximations
  • Figs. 2a and 2b illustrate the effect of varying the control variables used in parametric cubic curves
  • Fig. 3 is a block diagram of apparatus according to one embodiment of the invention.
  • Fig. 4 is a block diagram of apparatus according to a further embodiment of the invention
  • Fig. 5 is a block diagram of apparatus according to yet further embodiments of the invention.
  • Fig. 6a illustrates the information stored in a memory of the apparatus of these embodiments to represent the curve shown in Fig. 6b;
  • Fig. 7 is a block diagram schematically illustrating the operation of these embodiments in generating a display;
  • Fig. 8 is a flow diagram schematically illustrating the operation of the apparatus of Fig. 7;
  • Fig. 9 is a block diagram schematically illustrating the operation of editing the data shown in Fig. 6a;
  • Fig. 10 is a flow diagram schematically showing the process of operation of the apparatus of Fig. 9;
  • Fig. 11 is a flow diagram showing schematically the sequence of operations undertaken by a user of the apparatus of the above embodiments;
  • Fig. 12 is a block diagram indicating schematically the manner in which data is stored within the memory as a linked list
  • Figs. 13a-13e provide greater details of the data stored within the elements of Fig. 12;
  • Fig. 14 shows in greater detail the arrangement of data stored in the memory corresponding to a displayed picture;
  • Fig. 15 shows schematically the arrangement of the information of Fig. 14 within the memory as a linked list
  • Fig. 16 is a block diagram illustrating schematically the contents of an image store in the above embodiments
  • Fig. 17 shows schematically the arrangement of display areas of a display device in the above embodiments
  • Fig. 18 shows the appearance of related displays in the display areas
  • Figs. 19a-19c show alternative display formats
  • Fig. 20 illustrates the appearance of the display during editing
  • Fig. 21 shows schematically the appearance of a further display area in the above embodiments
  • Fig. 22 shows schematically a method of interpolating animated frames in one embodiment of the invention
  • Fig. 23 shows schematically a part of the method of Fig. 22;
  • Fig. 24 shows schematically a method of converting a key frame to an interpolated frame in that embodiment
  • Fig. 25 shows schematically a method of deleting a key frame in that embodiment
  • Fig. 26 shows schematically a method of moving a key frame in that embodiment
  • Figs. 27a-d show schematically the results displayed on the monitor corresponding to the operations of Figs. 22-26;
  • Figs. 28a-d show schematically the effects on the monitor 160 of converting between a point on a curve and a curve control point;
  • Figs. 29a-d show corresponding amendments to the contents of the memory 121;
  • Fig. 30 shows schematically a method of preparing frames for interpolation or addition
  • Fig. 31 shows schematically a method of adding frames together to produce a composite frame
  • Fig. 32 shows schematically a method of deriving curve point positions in interpolated or added frames
  • Fig. 33 shows illustratively the functional elements of the invention, provided in this embodiment by a single processor
  • Fig. 34 shows schematically the appearance of connected lines on the display in an embodiment of the invention
  • Fig. 35 shows schematically the arrangement of data within the memory 120 corresponding to the connected lines of Fig. 34;
  • Fig. 36 shows schematically a method of generating the display of Fig. 34;
  • Fig. 37 corresponds to Fig. 34 and shows schematically the effect of a first method according to Fig. 36;
  • Fig. 38 shows schematically the effect of a second method according to Fig. 36;
  • Fig. 39 shows schematically the appearance of a line connected to two others on a display in a further embodiment, before and after the process of Fig. 36;
  • Fig. 40 shows in greater detail a part of the process of Fig. 36 producing the effect shown in Fig. 37;
  • Fig. 41 shows in greater detail a part of the process of Fig. 36 generating the display shown in Fig. 39;
  • Fig. 42 shows the display produced by an alternative method to that of Fig. 41, according to a further embodiment of the invention.
  • Fig. 43 shows the effect of a yet further method, according to an alternative embodiment of the invention
  • Fig. 44 shows the display produced by a yet further embodiment of the invention employing a method alternative to that of Fig. 41;
  • Fig. 45 illustrates a display comprising a line connected to multiple other lines
  • Fig. 46 illustrates a display provided by a yet further embodiment of the invention
  • Fig. 47A illustrates the display produced in a first mode according to a further aspect of the invention
  • Fig. 47B illustrates the display produced by a second mode according to that aspect
  • Fig. 48 is a flow diagram showing schematically the sequence of operations performed by the aspect of the invention illustrated in Fig. 47.
  • In Fig. 1A, a fairly smooth freehand curve is shown.
  • As shown in Fig. 1B, one way of representing the curve would be to draw a series of straight line segments, meeting at points.
  • the number of straight line segments has to be large, as illustrated in Fig. 1C, before the simulation is at all convincing.
  • the curve may be represented as a series of curve segments running between points. If, as in Fig. 1D, adjacent curve segments have the same slope at the point at which they join, the curve can be made smooth.
  • the slope of the curved segment is also fixed or predetermined so that each segment can be matched to its neighbours to provide a continuous curve if desired.
  • the shape of the curve between the end points is partially dictated by the slopes at the end points, but also by a further item of information at each point which is conveniently visualised as the length of a tangent vector at each point.
  • the curve between the two points may be thought of as being clamped at its end points, at fixed slopes thereat, whilst each tangent vector exercises a pull on the direction of the curve which is proportional to its length, so that if the tangent vector is long the curve tends to follow the tangent over much of its length.
  • in the Hermite form, the data used to define a curve segment are the coordinates of the end points, the slope of the tangent vector at each end point, and the length of each tangent vector.
  • in the Bezier form, the data used to define a curve segment are the coordinates of the end points and the coordinates of the ends of each tangent vector. Conversion between the Hermite and Bezier formats is merely a matter of polar-to-rectangular conversion, and vice versa.
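For reference, the standard cubic forms being described can be written as follows; this is a reconstruction in conventional notation (the symbols P0..P3 and T0, T1 are assumed, not the patent's own):

```latex
% cubic Bezier segment: end points P_0, P_3; tangent end points P_1, P_2
P(t) = (1-t)^3 P_0 + 3t(1-t)^2 P_1 + 3t^2(1-t) P_2 + t^3 P_3,
\qquad 0 \le t \le 1
% equivalent Hermite end tangent vectors, whose slope and length are the
% polar form of the rectangular offsets P_1 - P_0 and P_3 - P_2:
T_0 = 3(P_1 - P_0), \qquad T_1 = 3(P_3 - P_2)
```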
  • Fig. 2A shows the effect of varying the magnitude or lengths of the tangent vectors, whilst keeping their angle constant. It will be seen that the effect is to "pull" the curve towards the tangent vector, more or less strongly depending on the length of the tangent vector.
  • Fig. 2B shows the effect of varying the angle of the tangent vector whilst keeping its magnitude fixed.
  • a smooth curve is defined by a number of such end points, and two adjacent such segments will share a common end point. If the curve is to be smooth, the tangent angles defined at the end point in relation to each curve segment will be equal, although the tangent vector lengths will in general not be.
  • a 3-dimensional cubic spline is specified if each point has 3-dimensional coordinates; corresponding equations in z to those above are derivable by inspection.
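A short sketch of both representations in code, under the assumption that tangent angles are stored in radians (the function names are illustrative):

```python
import math

def hermite_to_bezier_end(x, y, angle, length):
    """Rectangular tangent end point (Bezier form) from the Hermite
    slope/length pair stored at a control point (x, y)."""
    return x + length * math.cos(angle), y + length * math.sin(angle)

def bezier_point(p0, p1, p2, p3, t):
    """Point on the cubic Bezier segment at parameter t in [0, 1]."""
    s = 1.0 - t
    return tuple(s**3 * a + 3*t*s**2 * b + 3*t**2*s * c + t**3 * d
                 for a, b, c, d in zip(p0, p1, p2, p3))
```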
  • apparatus comprises a computer 100 comprising a central processing unit 110, a memory device 120 for storing the program sequence for the CPU 110 and providing working read/write memory, a frame store 130 comprising a series of memory locations each associated with, or mapped to, a point in an image to be generated or processed, and an input/output controller 140 providing input and output ports for reading from and writing to external devices, all intercoupled through common parallel data and address buses 150.
  • a monitor 160 is connected to the computer 100 and its display updated from the frame store 130 under control of the CPU 110.
  • At least one user input device 170a, 170b is provided; typically a keyboard 170b for inputting commands or control signals for controlling peripheral operations such as starting, finishing and storing the results of an image generation or image processing operation, and a position sensitive input device 170a such as, in combination, a stylus and digitising tablet, or a "mouse", or a touch sensitive screen on the monitor 160, or a "trackerball" device or a joystick.
  • a cursor symbol is generated by the computer 100 for display on the monitor 160 in dependence upon the signal from the position sensitive input device 170a to allow a user to inspect an image on the monitor 160 and select or designate a point or region of the image during image generation or processing.
  • a mass storage device 180 such as, for instance, a hard disk device is preferably provided as a long term image store, since the amount of data associated with a single image stored as a frame at an acceptable resolution is high.
  • the mass storage device 180 also or alternatively comprises a removable medium storage device such as a floppy disk drive or a high capacity tape drive, to allow data to be transferred into and out from the computer 100.
  • a printer 190 for producing a permanent visual output record of the image generated.
  • the output may be provided on a transparency or on a sheet of paper.
  • a film recorder 196 and/or a video recorder 197, and means for generating a suitably formatted moving picture output comprising a succession of frames, are also preferably provided.
  • a picture input device 195 such as a scanner for scanning an image on, for example, a slide, and inputting a corresponding video signal to the computer 100 may also be provided.
  • an animation system in one embodiment of the invention comprises the computer 100 of Fig. 3 providing an animator's workstation, and arranged to execute three different stored sequences so as to comprise an interpolator 101, a replayer 103 and a renderer 105.
  • the interpolator 101, with which the present invention is chiefly concerned, is arranged to generate sequences of image frames.
  • the replayer 103 is arranged to recall a stored image sequence previously created, and generate a display of the sequence as a moving image on the animator's workstation monitor 160.
  • the renderer 105 is arranged to colour the image, and may also affect the way in which the lines are represented (for example, their thickness).
  • the renderer 105 preferably operates as disclosed in our earlier application 9110945.4, our earlier PCT application PCT/GB91/02124, and our PCT application PCT/GB927-8 filed on the same day as the present application and claiming the same priorities (Agent's Reference 5130799).
  • a plurality of workstations 110a-110c allowing different users to develop different parts of a given animated sequence, sharing a common mass storage (file server) unit 180 such as a disk drive unit with a controlling processor, and connected thereto via a local area network (LAN) such as Ethernet.
  • processors 110d-110f for performing the rendering operation, interconnected to the workstations 110a-110c via the local area network.
  • This enables use to be made of simpler computers for the animator workstations 110a-110c; they may, for example, lack maths coprocessor devices and/or sophisticated graphics engines.
  • the same processors may act either as rendering processors or workstations, depending on demand.
  • control means may be provided for determining the current processing load on each processor and for allocating rendering or workstation tasks to processors connected to the network so as to manage (e.g. balance) the processing load.
  • NeXTCUBE computer including the NeXTdimension colour board, available from NeXT Computer, Inc., USA.
  • This arrangement provides direct formatted outputs for connection to a videocassette recorder or other video storage device, and accepts video input signals. Further, it includes means for compressing images for storage on a disk store 180, and for decompressing such stored images for display.
  • display frames consisting of line drawings of objects, are created and/or edited with reference to stored control point data (preferably data stored in the Bezier format referred to above).
  • a stored representation of a display frame comprises a plurality of control points which define line segments which make up a line representation.
  • the memory 120 includes a working memory area 121 to which data may be written (for example, a random access memory area).
  • an image displayed on the monitor 160 includes at least one line A, which is drawn as a cubic curve defined by three control points A1, A2, A3.
  • Corresponding image frame data representing the line image is stored within a frame table 122 within the working memory 121, as a series of curves (curve 1, curve 2 etc) each of which is defined by a series of control points (point 1, point 2, point 3).
  • Each control point is represented by data comprising positional data (xi, yi) representing the position within the area of the display of that control point, and tangent data (xei, yei, xfi, yfi) defining two tangent end points associated with the curved segments on either side of the control point.
  • the tangent extent point data (xei, yei, xfi, yfi) are stored as position data X, Y defining the position of the tangent end point. It would also be possible to store instead the x,y offsets from the control point position.
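A hedged sketch of one control point record of the frame table 122 as just described; the field names are illustrative, but the six stored coordinates per point mirror the description:

```python
from dataclasses import dataclass

@dataclass
class ControlPoint:
    x: float           # control point position
    y: float
    xe: float          # tangent end point for the preceding segment
    ye: float
    xf: float          # tangent end point for the following segment
    yf: float

# frame table 122: a list of curves, each curve a list of ControlPoint records
FrameTable = list[list[ControlPoint]]
```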
  • the monitor 160 is usually of the raster scanned type and consequently expects a raster scanned image, which is supplied from a memory mapped image store 130 as discussed above. Accordingly, it is necessary to provide a line display generating means 111 arranged to read the stored data representing the curve segments making up the frame, and generate corresponding raster image data comprising a plurality of pixels for storage in the image store 130. Each pixel need only comprise a single data bit or a small number of bits, if the display is monochrome black/white.
  • the line display generator 111 shown in Fig. 7 is accordingly arranged to access the memory 122 to read the stored data, and the image store 130 to write pixel data. As shown in Fig. 8, it calculates intervening point positions, and sets those memory locations within the image store 130 which correspond to pixels lying on the curve to "dark" and all those which do not to "bright". The contents of the image store 130 are then displayed on the monitor 160.
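A simplified version of such a rasterisation pass, reusing the bezier_point helper sketched earlier; the nested list stands in for the memory-mapped image store 130, and the sampling density is an assumption:

```python
def rasterise(segments, width, height, steps=200):
    """Monochrome raster pass: mark pixels lying on each curve segment
    "dark" (1) in a buffer initialised to "bright" (0)."""
    buffer = [[0] * width for _ in range(height)]
    for p0, p1, p2, p3 in segments:              # each segment: four (x, y) tuples
        for i in range(steps + 1):
            x, y = bezier_point(p0, p1, p2, p3, i / steps)
            xi, yi = int(round(x)), int(round(y))
            if 0 <= xi < width and 0 <= yi < height:
                buffer[yi][xi] = 1               # pixel lies on the curve
    return buffer
```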
  • the line display generating means 111 comprises the CPU 110 operating under control of the programme stored in a programme store area in the memory 120.
  • the line display generating means 111 may comprise the CPU 110 operating under the "PostScript" display command language provided within the operating system. The manner in which some basic operations are performed by the above apparatus will now be discussed.

EDITING A FRAME
  • the preferred embodiments of the invention provide means for enabling a user to edit a frame. Editing a frame may involve either modifying the trajectory of existing lines or (more rarely) adding new lines. It is therefore necessary both to amend the data held in the frame table 122, and desirably to amend the image data in the image store 130 so as to enable the user to view the effects of the change. It is found that the best way of providing the user with means for amending the frame data stored in the table 122 is to allow him to employ a position sensitive input device 170a, so as to appear to directly amend the displayed representation of the frame on the screen monitor 160.
  • a user manipulates the position sensing input device 170a, for example "mouse", by moving the device 170a so as to generate a signal indicating the direction and extent of the movement.
  • This signal is sensed by the device input/output controller 140, which provides a corresponding signal to a cursor position controller 112 (in practice, provided by the CPU 110 operating under stored program control) which maintains stored current cursor position data in x,y co-ordinates and updates the stored cursor position in accordance with the signal from the device input/output controller 140.
  • the cursor position controller 112 accesses the image store 130 and amends the image data corresponding to the stored cursor position to cause the display of a cursor position symbol D on the display shown on the monitor 160.
  • the user may thus, by moving the input device 170a, move the position of the displayed cursor position symbol D.
  • the display line generator 111 is arranged in the editing mode not only to write data corresponding to the line A into the image store 130, but also to generate a display of the control point data. Accordingly, for each control point A1, A2, the display generator 111 writes data representing a control point symbol (for example, a dark blob) into the image store 130 at address locations corresponding to the control point co-ordinates x,y.
  • the display generator 111 preferably, for each control point, correspondingly generates a second control point symbol E1 (or two such symbols) located relative to the point A1 along a line defined by the control point tangent data, at a position xe1, ye1 and/or xf1, yf1; preferably, a line between the two points A1 and E1 is likewise generated to show the tangent itself.
  • the user signals an intention so to do (for example by typing a command on the keyboard 170b, or by positioning the cursor symbol at a designated area of a displayed control menu), positions the cursor symbol D at a desired point on the display 160 by manipulating the position sensitive input device 170a, and generates a control signal to indicate that the desired point has been reached.
  • the cursor position controller 112 supplies the current cursor position data to the frame table 122 as control point position co-ordinates, and the display generator 111 correspondingly writes data representing a control point symbol into the image store 130 at address locations corresponding to the control point co-ordinates.
  • the user then inputs tangent extent point information, for example via the keyboard 170b, or in the manner described below.
  • the supervisory image generator 111 will correspondingly generate the line segment therebetween on the supervisory display by writing the intervening image points into the image store 130.
  • a user manipulates the input device 170a to move the cursor position symbol D to coincide with one of the control point symbols A1 or E1 on the display 160.
  • the user then generates a control signal (for example, by "clicking" a mouse input device 170a).
  • the device input/ output controller 140 responds by supplying a control signal to the cursor position controller 112.
  • the cursor position controller 112 supplies the cursor position data to a supervisory display editor 113.
  • the display editor 113 is thereafter arranged to receive the updated cursor position from the cursor controller 112 and to amend the point data corresponding to the point (for example, the point A1 on the curve) with which the cursor symbol coincides, so as to move that point to track subsequent motion of the cursor.
  • the display generator 111 regenerates the line segment affected by the control point in question within the image store 130 so as to change the representation of the line on the monitor 160.
  • Once a line has been amended to a desired position, the user generates a further control signal (e.g. by "clicking" the mouse input device 170a), and the supervisory display editor 113 thereafter ceases to amend the contents of the memory 120.
  • the cursor controller 112 continues to update the stored cursor position.
  • the processes performed by the apparatus of the preferred embodiment of the invention to enable a user to define an animated sequence are: 1. Defining Objects to be Animated - for example, characters. As will be disclosed in greater detail below, the apparatus of this embodiment permits the definition of a topological representation of a character or object to be animated.
  • Replaying - the sequence of frames is successively displayed at a display rate corresponding to a video image (24, 25 or 30 frames per second), to enable the user to view a representation of the animated sequence.
  • the sequence may be replayed with an associated sound track, to assess the correctness of timings and synchronisation.
  • Rendering - frames or sequences are coloured and/or shaded, and/or mixed with a desired background, to produce a finished video sequence.
  • One typical sequence of operations of this embodiment is shown in Fig. 11.
  • the user will wish to create a character or object to animate.
  • the shape of the object will be changeable, but its underlying topology is maintained constant and the user will therefore create an initial "template" or set of data storing this underlying topology.
  • the template is a view of the character or object which includes all the lines (and, therefore, is defined by all the control points) which it is desired to show in later pictures of the character or object.
  • the template picture or frame is created on the monitor 160, preferably using the position sensitive input device 170a (for example "mouse") as described in greater detail below.
  • the template frame may comprise a first subset of curves stored with data indicating that they are primary, "blue pencil" or unattached curves, and a second subset stored with data indicating that they are secondary, "black pencil" or attached curves.
  • the template data (the curve control points, together with identification data labelling the template) are stored permanently on the mass storage device 180.
  • the user may summon a stored template from mass storage 180.
  • the next stage may be to create a number of key frames.
  • key frames are frames spaced apart in time which include some change of shape or position of the character or object to be animated. Each key frame therefore has corresponding data identifying the point in the animated sequence at which the key frame occurs.
  • Key frames may be produced directly from the template to which they correspond, by copying the control point data making up the template and then editing the copied control point data to cause the key frame to diverge from the template.
  • the editing is preferably performed interactively, using as above the position sensitive input device 170a, and viewing the effects of the editing on the monitor 160.
  • the edited control point data then comprises the key frame data.
  • a key frame may likewise be produced by copying an existing frame; in this case it will be an indirect copy of the template frame.
  • the keyframe creation may be performed by first manipulating the primary curves and then, if necessary, re-editing the secondary curve positions generated automatically by the system.
  • during primary curve editing, only the primary curves are displayed and edited on the monitor 160.
  • the key frame control point data comprise offset data defining the difference between a given key frame's data and the corresponding data in the template.
  • the key frames need not be individually amended.
  • the key frames thus generated, or key frames recalled from the mass storage device 180, may then be processed to derive the intervening frames (interpolated frames).
  • Each interpolated frame comprises, as above, a set of control points defining the curves or lines making up the image frame.
  • Each control point of each interpolated frame is derived to lie between the corresponding control points of the pair of key frames between which the frame lies.
  • the number of interpolated frames depends upon the separation in time of the two key frames between which the interpolation is performed.
  • the user may next view the interpolated sequence.
  • key frames are separated by less than one second, or less than 30 interpolants (although greater separations are of course possible) and it is therefore possible to provide a display including several key frames and the interpolants lying therebetween simultaneously on the screen of the monitor 160.
  • the user may store the sequence of interpolated frames in mass storage 180, or may wish to amend the sequence in some manner.
  • a first type of amendment comprises changing the time occurrence of the key frame; in this case, the key frame itself is not redrawn but the number of interpolants will change and consequently the interpolation must be repeated.
  • the user may wish to edit a key frame. Finally, he may (as discussed below) decide that a sequence cannot be directly interpolated and that therefore a new key frame needs to be inserted between two existing key frames; this may be achieved by converting an interpolated frame into a key frame (as discussed below in greater detail).
  • the next stage may typically be to animate the sequence, to test whether the timing and appearance is correct.
  • the apparatus therefore displays each key frame and interpolated frame of the sequence in turn, at short intervals in time. If the sequence is to be displayed at "normal" running speed, the interval is 1/24, 1/25 or 1/30 second between frames. Preferably, however, the user can vary the frame repetition rate so as to view the sequence in slow motion. Preferably, the user can also designate a short sub-sequence to be animated, and can move repeatedly forwards or backwards through the short sub-sequence. If the sequence is not correct, then as before the user will edit either the appearance or position in time of the key frame, or add or delete a key frame. The control point data making up the frames of the sequence are then typically saved to mass storage device 180, for later use.
  • the frames may be coloured and/or filled and/or added to an existing background ("rendered"), to generate a corresponding series of raster image frames which may be displayed on a colour monitor, saved on a video tape recorder, or compression coded and stored on the mass storage device 180.
  • a finished animation will involve different characters, and will be produced in segments or sequences.
  • a table 1000 of data is defined which includes data establishing the identity of the animated sequence (a title), data relating to the soundtrack, and a table of sequences 1100, 1200, 1300, 1400 of successive frames.
  • the sequences will occur in succession.
  • the sequences are stored as a linked list in the working memory 121; in other words, the complete animation table stores data identifying the location in the memory 121 of the first (and preferably the last) of the sequence tables 1100 ...., and each sequence table 1100 ... includes data identifying the address in memory 121 of the next sequence table (and preferably, the previous sequence table).
  • the invention therefore enables separate movements to be defined for different parts of the same template. This is achieved by creating separate key frames and interpolated frames therebetween for different parts of the template, editing the separate sets of key frames and interpolants to achieve the desired motion, and then subsequently merging together the separate sets as will be discussed below.
  • Each set of key frames and interpolants does form a sequence over time, but for consistency the term "sequence" will be reserved in the following for the merged sequence of frames, and the term "timeline" will be used to describe the sequential set of frames (key frames and interpolated frames) corresponding to separate parts of the template, separately animated, which are merged to form the sequence.
  • where only a single timeline is defined, the single timeline also comprises the finished sequence.
  • each sequence table 1100 comprises data defining the template frame which the sequence animates; data (e.g. a pointer) indicating to which animation or epoch 1000 the sequence 1100 corresponds; a set of frame tables or curve sets comprising the composite or merged sequence (conveniently stored as a linked list of frames); a set of timeline tables 1110, 1120, 1130; data defining a currently displayed timeline; and, conveniently, a set of frames or curve-sets which comprises the merged sum of all timelines except that currently displayed. This enables the currently displayed timeline to be easily edited, then merged with this "base" sequence of frames to replace the existing composited sequence.
  • the length, and the first and last frame addresses of the composite sequence are also stored.
  • each timeline table 1110, 1120 ... likewise defines a series of frame data tables, and for convenience these are stored as a linked list of key frames 1111, 1112, 1113 ... .
  • each key frame data table 1111, 1112, 1113 includes a pointer to a frame table 122, but also includes further data.
  • a pointer to a list of interpolant frame tables comprising those defining the interpolant frames lying after that key frame and prior to the next is included.
  • Frame tables 122 are associated with a stored frame type indicator, which in this case indicates that the frame table 122 is a key frame. Additionally, data defining the key frame number (i.e. its order amongst the key frames in the timeline 1110) is stored.
  • the interpolant frame data tables 1111A, 1111B, 1111C ... for each key frame 1111 each comprise a pointer to a frame curve set data table 122.
  • Each also includes an interpolation factor (typically 0-1) defining the extent to which the frame depends upon the following key frame 1112; thus, for successive interpolated frames 1111A, 1111B, 1111C ..., the interpolation factor gradually rises from close to 0 to close to 1.
  • the interpolated frame 1111A and the key frame 1111 each store a frame number, which defines their position in the timeline 1110 and sequence 1100. Frame numbers correspond to points in time succeeding one another by 1/24, 1/25 or 1/30 of a second (or whatever the frame repetition period is desired to be).
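In outline, the linked structures of Figs. 12-15 might look as follows; the field names and the use of Python lists for the sub-lists are assumptions, but the frame numbers, key frame numbers and rising interpolation factors mirror the description above:

```python
from dataclasses import dataclass, field
from typing import Optional

@dataclass
class InterpolatedFrame:
    frame_number: int
    factor: float                               # weight of the following key frame, 0..1
    curves: list = field(default_factory=list)  # contents of a frame table 122

@dataclass
class KeyFrame:
    frame_number: int
    key_frame_number: int                       # order amongst key frames in the timeline
    curves: list = field(default_factory=list)
    interpolants: list = field(default_factory=list)  # frames up to the next key frame
    next_key: Optional["KeyFrame"] = None       # linked-list pointer, per Fig. 12
```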
  • Figs. 14 and 15 show the arrangement of the frame table 122 of Fig. 6a in greater detail.
  • Each frame table 122 includes a list of lines or curves making up a set which represent the object or character which the frame depicts (and corresponds topologically to the template).
  • the template, key frames and interpolated frames may thus all be represented by similar frame tables 122.
  • the lines or curves are conveniently provided as a linked list of curve tables 2100, 2200, 2300, 2400, each curve table comprising a list of curve control points (again conveniently stored as a linked list) 2110, 2120, 2130.
  • Each control point 2110 comprises position data defining the control point coordinates, and position data defining the control point tangent end coordinates.
  • the curve segment to the next control point may include attribute control points (which will be discussed in greater detail below) for controlling the values of attributes such as colour and transparency during the rendering process, or for enabling compatibility during interpolation as discussed below, and in this case it is desirable for the positions of these attribute control points to be interpolated between key frames at which they are defined, for use in the subsequent rendering operation.
  • control points 2110 include a list or table of attribute control points over the curve segment to the next curve control point.
  • Each attribute control point table entry may comprise data defining the value of the attribute controlled by the point (for example, the line colour or transparency), and comprises data defining the position along the line segment of the attribute control point; conveniently, this is the value of the parameter t at the point. Further details of attribute control points will be found in our above referenced other application.
  • the display generator 111 reads the corresponding frame table 122 and generates corresponding image data in an image buffer 130, which is then displayed.
  • a plurality of image buffers 130a, 130b, 130c ... are provided within the image store 130.
  • One buffer 130a comprises the display buffer, which represents (is mapped to) the display produced on the monitor 160.
  • the other buffers 130b, 130c ... provide "windows", as is known generally in the computing art, and each of which contains image data corresponding to a raster image optionally forming a portion of the monitor 160 display.
  • the images held in the frame buffers 130b, 130c ... are combined into the buffer 130a, and the size, position and order (i.e. which window overlies which) of the windows may be varied.
  • the display generated on the monitor 160 may therefore include display areas 160b-160e corresponding to some or all of the buffers 130b-130e, although preferably means (the input means 170 and CPU 110) are provided for enabling a user to select only one or some such display areas.
  • the buffer 130b as discussed above comprises image data which corresponds to a single selected frame of a sequence.
  • the buffer 130c is likewise dimensioned to contain a single frame image, which is however the image corresponding to the stored template.
  • the buffer 130d is arranged to store an image which comprises a montage of a succession of frames (key frames and interpolated frames) having successive frame numbers, and defining part of a timeline or sequence, in a manner described in greater detail below.
  • the buffer 130e stores a bar chart image comprising a plurality of bars each corresponding to one frame of the image in the buffer 130d, and each displaying the value of the interpolant factor for a corresponding frame as a length along the bar.
  • a given frame may be represented in three different manners simultaneously; firstly, as an individual display on the display area 160b which corresponds to the contents of the image store 130b; secondly, as part of sequence displayed in the sequence display 160d corresponding to the image store 130d; and thirdly, as a bar of the bar chart representing the timeline in which the image is included, displayed in the display area 160e corresponding to the image buffer 130e.
  • the image held in the sequence buffer 130d may be presented in differing formats.
  • in a first format, shown in Figs. 18a and 18b, the sequence image is produced by writing into the buffer 130d raster image data corresponding to each of the frames making up the sequence, so as to generate a display 160d in which the frame images are progressively displaced one from the other, but with some overlap so that each frame image partially overwrites its predecessor in the buffer 130d.
  • the frame images could also be provided with no progressive displacement (i.e. superimposed).
  • Fig. 19c shows an alternative embodiment in which each frame image is written into the buffer 130d into a spatially separate portion thereof, without overlap. This embodiment is also illustrated in Fig. 18.
  • the representation of Fig. 19c is of assistance in viewing motion, since corresponding parts of the object in successive frames are close together.
  • the representation of Fig. 19a enables each frame to be more clearly examined.
  • preferred embodiments of the invention provide means (e.g. the keyboard 170b) for selecting between these modes, and may also permit the displacement between successive frame images in the mode shown in Fig. 19c to be varied. Referring to Fig. 20, the presentation of a frame in the frame display area 160b when it is desired to edit the frame is shown.
  • the display generator 111 is arranged not only to generate the frame image data in the frame buffer 130b, but also to generate symbols (e.g. dots) at curvature control points.
  • the tangent end points and, more preferably, the tangent extent lines are also drawn.
  • a cursor symbol shown as a "+" is displayed, to enable a user to edit the frame image as discussed above using the position sensitive input device 170a.
  • the display area 160e displays the bar chart data display held in the timeline buffer 130e.
  • Each bar relates to a frame (key frame or interpolated frame) within a single timeline.
  • the length of the bar shows the interpolation factor associated with interpolated frames, and since key frames are (by definition) not interpolated, they have either maximum or minimum bar lengths.
  • the usefulness of the timeline display 160e and corresponding buffer 130e is, firstly, in providing the user with a synopsis of the information shown in the sequence image area 160d and, secondly, in providing a particularly simple way of editing the timeline, and seeing the effects on the timeline as a whole, by using the position sensitive input device 170a to position the cursor symbol at selected bars of the display 160e and signalling an appropriate control signal.
  • One type of such amendment is to alter the interpolation factor of a given interpolated frame.
  • the height of the bar for that frame is varied to follow the cursor position symbol manipulated by the user, and the interpolation value stored in the corresponding frame table 1111A is amended accordingly.
  • the values of the interpolation factor in successive frames follow a progressive sequence which is typically a linear sequence, but could equally follow any predetermined curve (usually monotonic) between the neighbouring key frames.
  • each bar is preferably displayed in two colours, with the height of the bar comprising the interface between the two; the colours of key frame bars (determined using the key frame numbers thereof) should alternate, and the height of the interpolant bars should rise with interpolation factor after one key frame, then fall with interpolation factor after the next, so as to present a rising and falling bar height rather than a sawtooth pattern. This is found easier for a user to interpret.
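A small sketch of that rising-and-falling height rule; the even/odd test on the key frame number is an assumed way of realising the described alternation:

```python
def bar_height(factor, preceding_key_number, max_height=100.0):
    """Displayed bar height for an interpolated frame: rises after one key
    frame, falls after the next, so the chart avoids a sawtooth pattern."""
    if preceding_key_number % 2 == 0:
        return factor * max_height          # rising towards the next key frame
    return (1.0 - factor) * max_height      # falling away from this key frame
```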
  • Such a progression gives the user an immediately visible sequence, which is considerably easier to use than having to specify each interpolant value individually, and it is found that in most cases, the same progression (for example, linear interpolation) can be employed.
  • it is extremely useful to be able to amend the sequence merely by amending the interpolation value of a given frame (rather than redrawing or editing the frame), and this is particularly advantageously achieved by onscreen manipulation of a bar chart display using the position sensitive input device 170a.
  • the chart display provides a readily visualised means for achieving this; using a position sensitive input device 170a, the user may designate one or a number of frames, and then move the frames along the timeline using the cursor symbol to a desired new position.
  • the apparatus is arranged to alter the frame numbers of the frames selected by the user, and to generate new intervening frames (or delete old frames) as required. More details will be given below.
  • the user signals a desire to create a new template by generating an appropriate signal using the keyboard 170b or position sensitive input device 170a, typically by selecting an option from a menu displayed (possibly permanently) on the monitor 160.
  • the CPU 110 then creates within the working memory 121 a template table, which comprises a frame table 122 and a datum indicating that the frame is a template frame. Because the template frame is not itself employed in a sequence, no sequence numbers are necessary.
  • the user will typically signal, via the keyboard 170b, a name to be associated with the template, which is stored therewith in the working memory 121.
  • the template display area 160c is generated on the monitor 160.
  • the cursor symbol is displayed within the display area 160c, and the user can proceed to build up a template. To do so, the user selects from the following options:
  • Creating a new curve - the current cursor position provides the x,y coordinates of the first control point.
  • the length of the tangent at this point is set to 0.
  • These values are written into the frame table 122.
  • the cursor position is continually monitored, and provides the second control point position coordinates, the values in the table 122 being continuously updated with movements of the cursor until a control signal is generated by the user to fix the coordinates of the second control point (when the desired location is reached).
  • a line between the first and second control points is continually generated by the line generator 111 within the template buffer 130c, displayed on the template display area 160c, to enable the user to determine the correct position for the second control point.
  • the second control point tangent length is likewise initially set to 0.
  • Adding a new control point - a further control point can be inserted within a curve, to increase the complexity of the curve by dividing a segment of the curve into two.
  • the user positions the cursor symbol at a desired point along a curve displayed on the template display area 160c, and initiates a signal via the keyboard 170b, or the position sensitive input device 170a (for example, by "clicking" a mouse device).
  • the current cursor position coordinates are read, and the identities of the two control points which lie to either side of the current position along the line segment are determined.
  • the cubic equation is solved using the current cursor coordinates, to derive the value of the parameter t at the current cursor position.
  • a new control point record 2110 is created within the frame table 122 for the template, including pointers to the records of the two neighbouring control points on the curve.
  • the "next control point”and “previous control point” pointer field in the surrounding control point data records are amended to point to the new control point.
  • the slope and magnitudes of the tangents at the new control point are calculated, and stored in the new control point record.
  • the new control point may then be edited, to change the shape of the curve running through it.
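One standard way to compute the new point and the tangents of both resulting half-segments is de Casteljau subdivision; the patent does not name its method, so this is an illustrative sketch consistent with the description:

```python
def lerp(a, b, t):
    """Linear interpolation between two points a and b at parameter t."""
    return tuple((1 - t) * u + t * v for u, v in zip(a, b))

def split_bezier(p0, p1, p2, p3, t):
    """Split one cubic Bezier segment into two at parameter t; the middle
    control points of each half give the tangents at the new point s."""
    q0, q1, q2 = lerp(p0, p1, t), lerp(p1, p2, t), lerp(p2, p3, t)
    r0, r1 = lerp(q0, q1, t), lerp(q1, q2, t)
    s = lerp(r0, r1, t)                 # the new control point, on the curve
    return (p0, q0, r0, s), (s, r1, q2, p3)
```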
  • Deleting a control point - the control point at which the cursor is located is looked up in the table 122 using the current cursor position coordinates, and the corresponding control point record is deleted.
  • the "next control point” and “previous control point” fields of the neighbouring control points on the curve segment are amended to point to each other and omit reference to deleted control point.
  • the desired line drawing is built up on the template display area 160c, and a corresponding set of curve data is stored in a frame table 122 labelled as corresponding to a template.
  • the user can also add attribute control points to control attributes of the finally rendered image, by positioning the cursor symbol at a desired point along the curve segment represented on the display device 160 and generating an appropriate signal (e.g. by pressing an appropriate key on the keyboard 170b).
  • the current cursor position is used to find the preceding curvature control point along the curve, into whose attribute control point list a new attribute control point record is inserted, the pointers of surrounding attribute control points being altered accordingly.
  • the value of the parameter t is derived and stored in the attribute control point record, and the user may input data concerning the value of the attribute at that point for later use (as described in our earlier UK Application 9110945.4, the above-referenced PCT Application PCT/GB91/02124, and our PCT application filed on the same day as the present application).
  • the contents of the template table 122 are stored to the mass storage device 180 so as to be recallable using the name or identification data for the template (e.g. the file name).
  • the set of curves comprising the key frame may be edited and stored in a key frame table, corresponding images being displayed in the frame display area 160b derived from the frame buffer 130b, in the same manner as described above with reference to a template.
  • the point in time of occurrence of the key frame in the sequence is also of significance; it is therefore necessary to store data defining the sequence and timeline to which the key frame belongs; the position of the key frame in the sequence relative to other key frames; and the absolute position in the sequence or timeline of the key frame (the frame number).
  • the key frame may be allocated the corresponding frame number.
  • the apparatus is preferably arranged to allocate the key frame a frame number equal to the current largest frame number plus one, and a key frame number equal to the current largest key frame number plus one, so that the frame is added to the end of the existing timeline.
  • a new key frame table 122 is then created within the memory 120, and the CPU 110 copies the contents of the template frame table into the new key frame table so that the new key frame is identical to the template.
  • the address within the memory 120 of the new key frame is then inserted into the "next key frame" pointer of the neighbouring key frame or key frames in the timeline, and any other necessary pointers within the memory are set to reflect the addition of the new key frame.
  • the timeline image buffer 130e is amended to cause the generation of a new bar at the key frame position in the display area 160e, and then the interpolated frames of the preceding key frame, if any, are recalculated (as discussed in greater detail below).
  • a set of interpolated frames to the succeeding key frame are also calculated, and corresponding interpolated frame data tables are set up within the memory 120, as a list pointing to the new key frame.
  • the sequence display buffer is then updated to include the newly interpolated frames, and the display on the monitor 160 in the display area 160d is correspondingly altered, as is the timeline bar chart display area 160e.
  • the CPU 110 first checks (box 2201) whether there is an immediately previous key frame in the timeline. If there is, in other words if the present key frame is not the first frame of the timeline, the separation in frame numbers of the two key frames is found (box 2202) and the number of interpolants is set equal to this. The interpolation routine shown in Fig. 23 is then executed to interpolate from the preceding keyframe (box 2203).
  • the length of the list of interpolated frames of the earlier key frame is examined (box 2301); if the new number of interpolants required differs from the current number in the list, or if there is no current list, the current list is deleted and a new list of the required length is created in memory (box 2302).
  • One non-linear function which may be used is sigmoidal; that is, one which tends to the horizontal at either end and is monotonically rising in between, so as to slow the interpolation rate towards either keyframe and smooth the transition through the key frame; other functions smoothing the transition are equally possible.
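One concrete sigmoidal choice with exactly these properties is the cubic smoothstep; the patent does not give a formula, so this is an assumed example:

```python
def sigmoidal_factor(i, n):
    """Interpolation factor for the i-th of n interpolated frames (i = 1..n),
    eased so the rate slows towards either key frame."""
    t = i / (n + 1)                 # the linear factor it replaces
    return 3 * t**2 - 2 * t**3      # cubic "smoothstep", monotonic on [0, 1]
```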
  • the curve data for each interpolated frame is derived and stored in the associated interpolated frame table.
  • Each key frame is derived from the same template, and hence will have the same number of curve control points.
  • the CPU 110 therefore takes the first curve control point of the earlier key frame and that of the later key frame, and stores for the first curve control point of the interpolated frame a value intermediate between the two; for an interpolation factor L, the x coordinate is given by x = (1 - L)xA + LxB, where xA and xB are the corresponding x coordinates in the earlier and later key frames.
  • the y coordinate is likewise given by y = (1 - L)yA + LyB.
  • a preferred embodiment allows values of L greater than unity; this permits a character to "overshoot", which gives a desirable visual effect in cartoon animation.
  • the (1-L) term may be set to zero.
  • the CPU 110 then proceeds to the next control point in the lists of the two key frames, and continues until all control points of all curves of the two key frames have been interpolated to produce corresponding control points in the interpolated frame.
  • the CPU 110 then selects the next interpolated frame, with a correspondingly higher interpolation factor, and repeats the process.
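Putting the interpolation step together, a minimal sketch, assuming each frame is held as a list of curves and each curve as a list of coordinate tuples, per the frame table sketch above:

```python
def interpolate_frame(earlier, later, L):
    """Blend the control points of two key frames with factor L; since both
    derive from one template, the point lists pair off one-to-one.
    L may exceed 1 to give the "overshoot" effect mentioned above."""
    return [[tuple((1 - L) * a + L * b for a, b in zip(pa, pb))
             for pa, pb in zip(curve_a, curve_b)]
            for curve_a, curve_b in zip(earlier, later)]
```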
  • the CPU 110 next (box 2204) determines whether there is a following key frame occurring in the timeline (e.g. by referring to the pointers maintained in the timeline table 1110) and, if so (in other words, if the key frame is not the last frame of the timeline), the process of box 2202 is repeated (box 2205) and the process shown in Fig. 23 is again executed (box 2206) to interpolate frames between the key frame and the following key frame.
  • the CPU 110 amends the data held in the timeline image buffer 130e to reflect the new interpolation factors, and updates the display area 160e (box 2207).
  • a new image is generated in the sequence image store 130d corresponding to the new interpolant values and the sequence display area 160d is updated.
  • one convenient way of improving the sequences is to convert one of the interpolated frames into a key frame, and then edit the key frame as desired.
  • the user may for example position the cursor symbol at the bar on the timeline display area 160e and issue an appropriate control signal (for example, by "clicking" a mouse device 170a twice).
  • the CPU 110 then identifies the cursor position and derives the frame number of the corresponding interpolated frame.
  • the CPU 110 reads the frame table for that interpolated frame and locates the key frame in relation to which the interpolated frame from the key frame list is stored (box 2401).
  • the CPU 110 creates (box 2402) a new key frame data table, and allocates the key frame the next key frame number after that of the key frame to which the interpolated frame belonged. The frame number is retained.
  • the curve data of the interpolated frame table 122 is then copied into the new key frame table, and the interpolated frame table is deleted.
  • Reference to the new key frame is inserted into the list of key frames maintained in the time line table after the parent keyframe of the interpolant (box 2403).
  • the new key frame is then selected for display in the frame display area 160b (box 2404), and corresponding image data is generated in the frame image buffer 130d and displayed on monitor 160 (box 2405).
  • the frame is preferably displayed as shown in Fig. 20, with the curve control points and tangents indicated for editing.
• the interpolation process of Fig. 22 is then executed (box 2407), and the sequence display frame store 130d is correspondingly modified to generate an updated sequence display in the display area 160d (box 2408).
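The splice performed in boxes 2401-2403 amounts to a standard doubly linked list insertion. The sketch below is illustrative only: the `Frame` class and its field names are invented for the example; the patent's actual tables hold pointers within the memory 120 rather than Python references.

```python
class Frame:
    """Simplified stand-in for a frame data table with its
    "previous" and "next" key frame pointers."""
    def __init__(self, frame_no, curve_data, is_key=False):
        self.frame_no = frame_no      # position in the timeline
        self.curve_data = curve_data  # control point coordinates
        self.is_key = is_key
        self.prev_key = None
        self.next_key = None

def promote_to_key(parent_key, interpolant):
    """Convert an interpolated frame to a key frame: copy its curve
    data into a new key frame record and splice that record into the
    key frame list just after its parent key frame."""
    new_key = Frame(interpolant.frame_no,
                    list(interpolant.curve_data), is_key=True)
    new_key.prev_key = parent_key
    new_key.next_key = parent_key.next_key
    if parent_key.next_key is not None:
        parent_key.next_key.prev_key = new_key
    parent_key.next_key = new_key
    return new_key
```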
• the appearance of the other interpolated frames may not change until the new key frame has been edited, but the interpolation factors for each will have changed.
Deleting a Key Frame
• a user signals a desire to delete a key frame (for example, by positioning the cursor symbol at the corresponding bar of the bar chart display area 160e and issuing an appropriate control signal using the position sensitive input device 170a or keyboard 170b).
  • the CPU 110 reads the key frame number of the key frame concerned and accesses the timeline data table.
• the key frame numbers of succeeding key frames in the list maintained by the timeline table are accordingly decremented by one (box 2501), and then the current key frame table is deleted from the memory 121 (box 2502). All interpolated frame tables listed within the key frame table are also deleted.
  • the CPU 110 tests (box 2503) whether there is an earlier key frame in the key frame list. If the key frame is the first key frame of the timeline, the only further action taken is to regenerate the image data in the sequence display buffer 130d and update (box 2505) the sequence display 160d and likewise amend the timeline buffer 130e and display area 160e (box 2504), to remove the references to the deleted keyframe and its interpolated frame. The succeeding frames may also be shifted back in time.
  • the CPU 110 performs the interpolation process shown in Fig. 23 (box 2506) from the key frame which preceded the deleted frame to its new successor in the timeline. Since the frame numbers of the following key frames have not been changed, the key frame will be replaced for display by an interpolated frame (box 2507).
  • the sequence image in the sequence image store 130d and the bar chart image in the bar chart image buffer 130e are updated by the CPU 110 (box 2505), and correspondingly redisplayed on the monitor 160.
  • the preferred embodiment enables the user to indicate a particular key frame and change its time of occurrence in the sequence (e.g. frame number).
  • the user indicates an intention to move the key frame by positioning the cursor symbol at a desired key frame bar on the bar chart display area 160e and inputting an appropriate control signal, via the keyboard 170b or position sensitive input device 170a, and then moving the cursor symbol to the desired new key frame location.
• the CPU 110 determines from the cursor symbol position the frame number corresponding to the new location and tests (box 2601) whether it has moved beyond its neighbouring keys. If the frame has not been moved past either of its neighbouring key frames, the frame number of the key frame is changed to that of the new location (box 2602) and the interpolation routine of Fig. 22 is then executed (box 2603). If the key frame is moved on to the frame number of one of its neighbouring key frames, the pre-existing key frame is deleted (box 2604) and the key frame list is amended to avoid reference to it. The key frame numbers of all following key frames in the key frame list are then decremented (box 2605). After this, the CPU 110 continues, as above, by allocating the key frame a new frame number and interpolating using the process of Fig. 22 (boxes 2602, 2603).
• if the key frame has been moved past one of its neighbouring key frames, the CPU 110 first (box 2606) removes the key frame from the key frame list and links the pointers of the neighbouring key frames, and then (box 2607) executes the interpolation routine of Fig. 23 to regenerate the interpolated frames for the key frame preceding the removed key frame.
• the CPU 110 then locates the key frame in the key frame list at, or immediately preceding, the new frame number to which the selected key frame is to be moved. If there is already a key frame at the position to which the selected key frame is to be moved, the CPU 110 deletes the record of that key frame (box 2609). The selected key frame is then inserted in the key frame list maintained in the timeline table, just after the previous key frame position, by amending the "previous" and "next" pointers in the key frame tables concerned (box 2610).
  • the key frame numbers of key frames between the old position and the new position are then decremented (box 2611) to reflect the new key order. Furthermore, if the key frame has replaced an existing key frame at its new position, subsequent key frames are also decremented. Thereafter, the CPU 110 proceeds as above to update the key frame frame number (box 2602), generate new interpolated frames between the key frame and its neighbour on either side (box 2603), and regenerate the sequence image buffer 130d and display 160d, and correspondingly the timeline buffer 130e and display area 160e (box 2612).
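The three cases of Fig. 26 can be summarised in the sketch below. This is illustrative only: the helper methods on `timeline` (`delete_key_at`, `unlink`, `insert_by_frame_no`, `reinterpolate_around`) are assumed to exist for the example and correspond only loosely to the boxes cited above.

```python
def move_key(key, new_frame_no, timeline):
    """Move a key frame to a new frame number, handling the three
    cases: within its neighbours, onto a neighbour, past a neighbour."""
    prev_no = key.prev_key.frame_no if key.prev_key else None
    next_no = key.next_key.frame_no if key.next_key else None

    if ((prev_no is None or new_frame_no > prev_no) and
            (next_no is None or new_frame_no < next_no)):
        key.frame_no = new_frame_no                     # box 2602
    elif new_frame_no in (prev_no, next_no):
        timeline.delete_key_at(new_frame_no)            # box 2604
        key.frame_no = new_frame_no
    else:
        timeline.unlink(key)                            # box 2606
        timeline.insert_by_frame_no(key, new_frame_no)  # boxes 2608-2610
        key.frame_no = new_frame_no
    timeline.reinterpolate_around(key)                  # Fig. 22
```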
  • the CPU 110 is arranged to be capable of accepting an instruction to move a block of successive frames in time; the above process is in this embodiment essentially repeated for each such frame.
• Referring to Figs. 27a-d, the results of the sequence operations described above will now be illustrated.
  • the user positions the position sensitive input device so as to move the cursor symbol to the next vacant point in the bar chart display area 160e on the monitor 160, and initiates a control signal indicating a desire to create a new key frame thereat.
  • the CPU 110 copies the template (or an existing key frame) to create a new key frame table in the memory 121 as discussed above.
  • the sequence display buffer 130d is regenerated, and the display area 160 consequently displays the new key frame at the end of the sequence.
  • the bar chart display area 160e likewise displays a new key frame bar.
  • the apparatus is arranged also to generate a new key frame which is a copy of an existing key frame; in this case, the user may designate the existing key frame he wishes to copy using the position sensitive input device 170a to position the cursor symbol appropriately, and upon generating an appropriate control signal via an input device the CPU 110 will, rather than copying the template table, copy the designated key frame table curve data to produce the new key frame table curve data.
  • the user then generates an input signal indicating an intention to move the just created key frame four frames later in time.
  • the CPU 110 performs the routine of the centre path of Fig. 26, and four interpolated frames are added to the interpolated frame list of the preceding key frame.
  • the sequence display and timeline displays 160d, 160e are then updated as above.
• the user signals a desire to delete the preceding key frame and the CPU 110 executes the routine of Fig. 25. Since the last two key frames are now substantially identical, it will be seen that the frames interpolated therebetween are likewise identical.
  • the user next signals an intention to convert one of the intervening interpolated frames into a key frame, to allow for subsequent editing.
  • the CPU 110 follows the routine of Fig. 24, and updates the displays 160d and 160e.
  • each key frame (and consequently each interpolated frame also) includes only those curves defined by curve control points which exist in the template frame.
  • the method of adding control points and new curves to the template has already been discussed above.
• as created, each key frame comprises the same curve data as the template, to which it is consequently identical.
  • the user will often wish to delete some parts of the template for a given key frame; for instance, when an object is turned, many lines become invisible as they are obscured by other parts of the object.
  • the key frame corresponding to the turned object would therefore not be required to include those lines.
• the user can delete some control points (and/or curves) from a key frame, and the pointers in the frame table 122 will be correspondingly reset to omit references to the deleted points and curves.
• in doing so, the CPU 110 does not affect any other key frame table or the template frame table.
  • the repositioning of the pointers within the frame table 122 does not affect the correspondence between the remaining control points and curves and their counterparts in the template set.
  • Each is still uniquely identifiable to the CPU as corresponding to a particular point in the template set. It is thus possible for different frames to correspond to different subsets of the template set. It may also occur that, whilst preparing a particular key frame, a user wishes to add a further control point or a curve comprising a number of control points.
  • the apparatus is arranged to allow a user to add further control points to a key frame, exactly in the manner described for the template frame, but upon his doing so CPU 110 is arranged to add a corresponding point to the template frame table.
• the template frame table therefore always comprises a superset of the points held in each key frame. Interpolation between frames, and adding of frames of different timelines to produce composite frames, is still possible even if one frame includes extra curve control points or curves.
  • a first step is to make the two frames compatible by equalising the number of points to be interpolated between or to be added together. To illustrate the manner in which this is achieved, reference is made to Figs. 28 and 29.
• a curve is shown as it would appear in the frame display area 160b.
  • the shape of the curve is defined by two control points A1, A2, at which the corresponding curve tangents are indicated.
  • Three attribute control points B1, B2, B3 are shown on the curve segment between the two curve control points A1, A2.
  • Fig. 29 shows the corresponding curve table 2100 stored within the working memory 121.
• the table includes two curve control point records, the first corresponding to A1 and pointing to the next record, corresponding to A2.
  • the curve control point record corresponding to A1 also points to the list of attribute control point records, the first of which corresponds to B1, which in turn points to that corresponding to B2, which likewise points to that corresponding to B3.
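The linked records of Fig. 29 can be pictured with the following sketch, in which the class and field names are invented for illustration; the patent stores equivalent records and pointers in the working memory 121 rather than as Python objects.

```python
from dataclasses import dataclass, field
from typing import Optional

@dataclass
class AttributeControlPoint:
    t: float                     # parametric position along the segment
    data: dict = field(default_factory=dict)  # e.g. width, opacity
    next: Optional["AttributeControlPoint"] = None

@dataclass
class CurveControlPoint:
    x: float
    y: float
    tangent_ends: tuple          # extent points defining the tangent
    attributes: Optional[AttributeControlPoint] = None  # list head
    next: Optional["CurveControlPoint"] = None

# The structure of Figs. 28a/29a: A1 -> A2, with B1 -> B2 -> B3
# hanging off the record for A1.
b3 = AttributeControlPoint(t=0.75)
b2 = AttributeControlPoint(t=0.50, next=b3)
b1 = AttributeControlPoint(t=0.25, next=b2)
a2 = CurveControlPoint(100.0, 40.0, ((90.0, 35.0), (110.0, 45.0)))
a1 = CurveControlPoint(0.0, 0.0, ((-10.0, -5.0), (10.0, 5.0)),
                       attributes=b1, next=a2)
```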
• upon the user generating a control signal indicating that a selected attribute control point B2 is to be converted into a curvature control point located at the same position on the curve, the CPU 110 creates a new curve control point record A3 within the curve table 2100.
  • the record corresponding to the point A1 is altered to point to the new record, which in turn points to A2.
  • the attribute control point record corresponding to B2 is deleted from the attribute control point list.
• the control point data stored for the new control point A3 corresponds to the position on the curve previously occupied by the attribute control point, with tangent extents such that the tangent slope at the new control point is the same as it had been at the attribute control point B2.
• the lengths of the tangents at the three curvature control points A1, A2, A3 are calculated so as to keep the shape of the curve unchanged; it will be observed from Fig. 28b that the lengths, but not the angles, of the tangents at control points A1 and A2 have altered. Accordingly, new extent point data is written to the records corresponding to A1 and A2 by the CPU 110.
• the attribute control point B3 is deleted from the list of the curvature control point record for A1, and added to that for the new record A3.
• the position data defining the positions along the curve of the attribute control points B1, B3 are recalculated within the curve segments from A1-A3 and A3-A2, and the new data are stored with the attribute control point records B1, B3.
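Recomputing the tangent lengths so that the curve shape is unchanged is exactly a subdivision of the cubic curve segment. The patent does not name an algorithm; for Bezier segments, de Casteljau's construction, sketched below, yields the new control point, its tangent extents, and the shortened tangents at A1 and A2 in one pass.

```python
def lerp(p, q, t):
    """Linear interpolation between two 2-D points."""
    return (p[0] + t * (q[0] - p[0]), p[1] + t * (q[1] - p[1]))

def split_cubic(p0, p1, p2, p3, t):
    """Split a cubic Bezier segment (end points p0, p3; tangent end
    points p1, p2) at parameter t. Returns two sub-segments whose
    union traces the identical curve. Their shared end point plays
    the role of A3, and p01/p23 are the shortened tangent ends at
    A1 and A2."""
    p01, p12, p23 = lerp(p0, p1, t), lerp(p1, p2, t), lerp(p2, p3, t)
    p012, p123 = lerp(p01, p12, t), lerp(p12, p23, t)
    p_new = lerp(p012, p123, t)
    return (p0, p01, p012, p_new), (p_new, p123, p23, p3)
```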
• the user may employ the apparatus according to this embodiment to amend the curve shape by altering the control point data as described above; in particular, as shown in Fig. 28c, points of inflection may be introduced by setting the tangent extent points to define different tangent angles.
  • the apparatus of this embodiment is arranged also to convert a control point into an attribute control point if desired; in this case, the control point record A3 is deleted and the pointer stored with the record for A1 is amended to point to the record for A2.
  • a new attribute control point record for new attribute control point B2 is created.
  • the attribute point records for B2 and B3 are added to the list held for the curvature control point record for A1.
  • the curve is recalculated by the CPU 110, and the position data for the three attribute control points are amended.
  • Key frames in this embodiment of the invention are permitted to include more curvature control points than does the template frame from which they are derived, where a corresponding attribute control point exists in the template frame.
  • the CPU 110 is therefore arranged, when a curvature control point not having a corresponding curvature control point in another frame is located, to locate the corresponding attribute control point in the other frame and convert that point into a curvature control point as discussed above with reference to Figs. 28a and 28b and Figs. 29a and 29b.
  • the two frames will then be in correspondence, and may be added or interpolated between as discussed above.
• one advantage of the preferred embodiments is that different parts of an object may be animated separately and the separate sub-sequences (timelines) can be amalgamated together. This is possible because all frames of the different timelines have the same topology, or are all subsets of a common template table.
• the operation of adding frames is similar to that of interpolation, as discussed below, except that, whereas in interpolation predetermined proportions of a pair of frames are combined, in addition it is generally (although not necessarily) the case that equal proportions of each frame are added.
• the CPU 110 locates a pair (or, in general, a plurality) of frames of different timelines occurring at the same point in time, and derives a composite frame by taking, for each curve control point of the composite frame, the corresponding curve control points in each of the existing frames. From the coordinates of each of these, the coordinates of the corresponding point of the template frame are subtracted so as to generate difference coordinates, defining the difference between the control point coordinates of the key frames and those of the corresponding points of the template frame.
  • difference coordinates for a corresponding control point in each frame are then added together to form summed difference coordinates for that control point of the composite frame, to which the absolute coordinates of the corresponding control point in the template frame table are added to derive the composite control point coordinates.
  • each composite control point corresponds to the sum of the corresponding template control point coordinates, and the vector sum of the differences between the corresponding control points of time aligned frames of different timelines and the template.
  • the arithmetic can of course be rearranged so that the coordinates of the frames of the timeline are added together first and then predetermined multiples of the template coordinates are subtracted from the sum.
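A minimal sketch of the addition just described follows. The dictionary representation (template point identifier mapped to coordinates) is an assumption made for brevity; the patent instead matches points through their common template ordering. The `weights` parameter covers the "predetermined proportions" case, defaulting to equal, unit proportions.

```python
def add_frames(template, frames, weights=None):
    """Composite time-aligned frames from different timelines:
    composite = template + sum of weighted (frame - template)
    differences. A point absent from a frame simply contributes
    no difference for that frame."""
    if weights is None:
        weights = [1.0] * len(frames)
    composite = {}
    for pid, (tx, ty) in template.items():
        dx = sum(w * (f[pid][0] - tx)
                 for f, w in zip(frames, weights) if pid in f)
        dy = sum(w * (f[pid][1] - ty)
                 for f, w in zip(frames, weights) if pid in f)
        composite[pid] = (tx + dx, ty + dy)
    return composite
```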
  • one way of adding a plurality of frames is as follows.
• the CPU 110 creates a new frame table, hereafter termed a difference table, temporarily within the memory 121 for each frame which is to be added.
• the coordinates of the corresponding point stored in the template table are subtracted from those of each curve control point of each frame, and the coordinate differences are stored in the difference frame table corresponding to that frame.
• the CPU 110 then creates a result frame table in the memory 121. It then reads the template table and, for each curve record, checks whether that curve record is present in any of the difference frames. If the corresponding curve exists in no difference frame, the CPU 110 proceeds to the next curve in the template table. If the corresponding curve exists in all difference frames, for each curve control point in the sequence, the sum of the difference coordinates for the corresponding control points in the difference tables is taken and the result is added to the coordinates of the corresponding point in the template table and stored in the result table. The next curve in the template table is then processed. If a curve is not in all the frames to be added, the CPU 110 tests whether any of the frames to be added are key frames and, if so, whether the curve in question is in a key frame.
• if so, the sum of the difference coordinates for the frames in which the curve is present is taken and added to the template coordinates as before. If not, in other words if the curve is present only in an interpolated frame or frames, the curve is omitted from the result frame table.
  • the result frame table will include all the curve control points necessary.
  • the CPU 110 derives the positions of any attribute points, as shown in Fig. 32, by taking in turn each curve in the results frame table and considering each attribute point in turn. If an attribute point occurs on all the curves to be added, the CPU 110 derives averaged or interpolated values for the attribute point position parameter and, optionally, for any attribute data (e.g. line width, or opacity profile) which may be stored in the attribute point records. The interpolated values (e.g. the average values) are then stored in the results table.
• if an attribute point is absent from one or more of the frames to be added, the CPU 110 allocates a value equal to the position value in the template for each frame in which the attribute point is absent and interpolates a new position between all frame values as above.
• the interpolation of the attribute point position is not simply an interpolation between the two parametric position data values in the corresponding pair of frames interpolated between; instead, the length of the corresponding curve segment in the interpolated frame is derived, the actual curve segment length is divided in the required interpolation ratio, the corresponding position on the curve is found, and the corresponding value of the parameter t at that position is derived and stored as the interpolated attribute point position (see the sketch below).
• the key frame attribute point position data is stored in the results table, as this is relatively more significant than a position derived from an interpolated frame.
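The core operation here is locating the parameter t at which a curve segment reaches a given fraction of its arc length. The following sketch does this by accumulating chord lengths over a dense sampling; the cubic Bezier form of the segment and the sampling density are assumptions of the example, not requirements of the patent.

```python
import math

def point_at(curve, t):
    """Evaluate a cubic Bezier given four (x, y) control points."""
    (x0, y0), (x1, y1), (x2, y2), (x3, y3) = curve
    u = 1.0 - t
    return (u**3 * x0 + 3 * u * u * t * x1 + 3 * u * t * t * x2 + t**3 * x3,
            u**3 * y0 + 3 * u * u * t * y1 + 3 * u * t * t * y2 + t**3 * y3)

def t_at_length_fraction(curve, fraction, samples=256):
    """Return the parameter t at which the accumulated arc length of
    the segment first reaches the given fraction of its total."""
    pts = [point_at(curve, i / samples) for i in range(samples + 1)]
    lengths = [0.0]
    for (xa, ya), (xb, yb) in zip(pts, pts[1:]):
        lengths.append(lengths[-1] + math.hypot(xb - xa, yb - ya))
    target = fraction * lengths[-1]
    for i, s in enumerate(lengths):
        if s >= target:
            return i / samples
    return 1.0
```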
• a current composite sequence comprising a set of frame tables is maintained together with a corresponding base sequence comprising a further set of frame tables, the base sequence comprising the composite sum, as discussed above, of all timelines other than that presently being displayed for editing.
• after the current timeline has been edited, it is thus merely added to the current base composite sequence to generate a new composite sequence, reducing the amount of computation necessary.
  • interpolation and addition will be seen to be closely similar; although in the above described embodiments, for clarity, interpolation between frames and addition of frame differences from the template are described, it is possible on the one hand to interpolate using frame differences (adding the result to the template frame coordinates) and on the other hand to add frames (subtracting the template coordinates or a multiple thereof afterwards); in practice, for convenience, the memory 121 may contain either frame tables stored as absolute point coordinates or frame tables stored as coordinates defining the difference from the corresponding coordinates in the template table.
  • the processes described in Figs. 30-32 are equally applicable, and are preferably applied, to interpolation, mutatis mutandis.
  • the replayer 103 in one embodiment of the invention is provided by the CPU 110 operating under suitable stored program control.
  • the replayer is arranged to display the frames at a rate corresponding to the frame repetition rate (24, 25 or 30 Hz) at which the sequence is to be displayed, so that the operator can view the sequence at a realistic rate.
  • the replayer 103 is arranged also to accept input commands from the keyboard or other input device specifying the speed of replay. This is particularly useful in enabling an operator to view crucial parts of the sequence in slow motion, or to move quickly through a sequence for cursory inspection.
  • the replayer 103 is arranged to accept input signals (from the keyboard 170b or more preferably, the position sensitive input device 170a in cooperation with the timeline display) to specify an initial and/or a final frame in the sequence between which the sequence is to be replayed. An operator can thereby designate a particular part of the sequence to be replayed, and the replayer 103 will display in turn each frame between the initial and end frames. It is particularly convenient if the replayer 103 is arranged to constantly cycle between the start and finish frames; this may either be by displaying the sequence repeatedly from the first frame to the last frame, or by displaying the sequence forwardly (from start to finish) and then backwardly (from last to first) repeatedly. This is found particularly useful in enabling the operator to localise a particular frame or series of frames which are incorrect, for subsequent editing.
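The cyclic replay order is easily expressed as a generator. This sketch covers only the ordering; frame-rate pacing and display are omitted, and the function name is invented for the example.

```python
def replay_order(first, last, mode="pingpong"):
    """Yield frame numbers endlessly between two frames: either
    looping forwards, or forwards then backwards (without repeating
    the end frames)."""
    forward = list(range(first, last + 1))
    while True:
        yield from forward
        if mode == "pingpong":
            yield from reversed(forward[1:-1])

# Frames 3..6 in ping-pong mode: 3 4 5 6 5 4 3 4 5 6 ...
```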
• if the CPU 110 operates sufficiently fast, it would be possible for the replayer 103 to be arranged to access the memory 120, and to cause the display generator 111 to access in turn each frame table 122 corresponding to each frame of a sequence between the first and last frames specified.
  • the replayer 103 is arranged instead to perform an initial operation of creating, for each frame to be displayed, a raster image by causing the display generator 111 to access in turn each frame table 122 and generate an image in the image store 130, and after each image is created the replayer 103 is arranged to cause the image to be stored on the mass storage device (e.g. hard disk) 180.
  • a computer which includes image compression means for compression encoding the image for storage on hard disk is preferred, since otherwise the volume of image data stored corresponding to the frames of even a relatively short sequence is extremely large.
  • the replayer 103 is arranged to display the sequence by accessing the image data corresponding to each frame in turn to refresh the image store 130 at the desired frame repetition rate. Once the operator signals a desire to cease replaying, the image data files corresponding to the frames in the replayed sequence may be deleted from the mass storage device 180, to reduce the memory used.
  • the replayer 103 is also arranged during the initial phase of preparing the sequence of images to cause the renderer 105 to render each frame as discussed below, so that the replayed sequence can be seen in colour and/or against the background.
• the renderer 105 may again comprise the CPU 110 operating under stored program control, or may be provided by a different computer 100. In either case, the operation of the renderer, as is conventional, is to colour the image and/or to mix the picture with a background picture.
  • the renderer 105 therefore reads the data stored in a table 122 corresponding to a frame to be rendered, and processes the frame in accordance with predetermined stored colour and/or background information.
  • the attribute control points stored, as described above may include colour and other attribute information (for example transparency), the manner of rendering which is described in our British Application No.
  • attribute control points are employed for several purposes; firstly, to set the values of attributes which will subsequently be used during rendering, so that a considerable amount of rendering information need be specified only for individual key frames and is automatically inserted into frames interpolated there between at correct positions; secondly, as a means for providing the possibility of extra curve control points, to increase the complexity where necessary without doing so otherwise, whilst maintaining topological similarity between the frames. Other applications are likewise not precluded.
  • attribute values need not be set at separate points to those used to define curvature, but may be provided at curvature control points; although it is very much preferred to provide the flexibility to define attributes at points along the curve segment as well.
  • apparatus allowing the definition of lines in terms of a limited number of control points controlling the path of the lines and also allowing the specification of attribute or feature properties at predetermined points along the lines is used, as described in our International Application GB91/02124, with such points marking the points at which further curves or structures of curves are to be connected to a line.
  • a flag may be stored indicating that one line is linked to another line or forms an assembly therewith and the two may be displayed connected on the supervisory display 160.
• upon moving, scaling or otherwise varying (for example by another affine transformation) one object, parameters (e.g. the position) of the other are automatically correspondingly amended by the supervisory display editor 113.
  • the lines which are linked together are displayed on the supervisory display 160 with paths joined.
  • a control point symbol J is displayed. This symbol represents a path control point for the joining path, but has no effect upon the joined path. It is constrained to be moved along the joined path, to change the point of contact of the objects. Data relating to this point within the display is therefore stored within two separate line tables, as shown in Fig.
• in the table 122B relating to the joining or subsidiary line, the stored data represents, as usual, position coordinates and tangent end point data, whereas in the table 122A relating to the line to which it is connected, the data stored is a position value (relative to the line, and preferably a value of the interpolated parameter at that position along the line) and the address within the memory 120 of the line table 122B corresponding to the joining or subsidiary line.
• the supervisory display editor 113 is operable firstly to change the parametric position of the joining point within the joined line table 122A, and secondly, to access the connected line table 122B using the stored base address, calculate the actual positional coordinates of the joining point, and amend the x,y coordinates of the curve control points in the attached line by a corresponding amount.
• when the joined line itself is edited, the editor 113 is likewise operable in this embodiment to recalculate the actual positions of the joining points of that line, to access the line tables 122B of the attached lines joined at those points and correspondingly amend the curve control position data therein, as shown in Fig. 36, and the line generator updates the display on monitor 160.
  • the attached curve B is attached at only one point, shown as J, on the curve A.
  • the editor 113 first calculates the positional coordinates of the attachment point J on the curve A from the line table 122A.
  • This coordinate is the new coordinate for the corresponding point on the curve B.
  • this point is a curve control point for the curve B.
  • the entry in the table 122A may point to the control point entry in the table 122B.
  • the previous value of the position of this point is read from the Table 122B, and the difference in coordinates between the old position and the new position is calculated by the editor 113.
  • the editor 113 then reads each of the remaining curve control point and tangent end point coordinate data in turn from the table 122B, adds the coordinate difference, and stores the resulting shifted coordinate data back in the table 122B.
  • the shifted line may then be displayed.
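The whole of this singly attached update reduces to a single translation, as in the sketch below (reusing the `point_at` evaluator from the earlier sketch; all names are illustrative only).

```python
def shift_attached(joined_curve, attach_t, attached_pts, old_anchor):
    """Recompute a singly attached line after its parent is edited:
    find the new position of the attachment point on the parent
    curve, then translate every control and tangent end point of
    the attached line by the same offset."""
    new_anchor = point_at(joined_curve, attach_t)
    dx = new_anchor[0] - old_anchor[0]
    dy = new_anchor[1] - old_anchor[1]
    shifted = [(x + dx, y + dy) for (x, y) in attached_pts]
    return new_anchor, shifted
```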
  • Each curve table 122 is examined and an attachment state flag 901 is read (shown in Figure 35).
  • the attachment state flag 901 indicates whether a curve is attached to another (for example the curve B in Fig. 34), in which case its position is dictated by that of the curve to which it is attached, or whether the curve is unattached.
  • the attachment point positions along the line to which the second line is attached may be edited in the manner of other attribute points as described above and in our earlier filed international application GB91/02124; after any such variation of the attachment point, the editor 113 recalculates the positions of the attached line control points as described above.
• the attachment between the two lines is created or broken by adding or deleting an attribute point to the unattached line in the same manner as described above, and its position along the line is likewise editable by the user.
• when an attachment is broken, the state of the flag 901 in the table 122B is changed and the line is subsequently treated as an unattached line.
  • the data in table 122B is recalculated as described above.
  • the attribute point stored on the unattached line stores also data which defines the position of the second line relative to the attribute point (for example positional offset data specifying the coordinate difference between a control point of the attached curve from the attachment point of the unattached curve), and the editor 113 calculates the position of the attached curve taking into account this position data.
  • the editor 113 is arranged to vary the inclination of the attached curve with variations in the inclination of the curve to which it is attached.
  • Figure 37 (which corresponds to Figure 34) illustrates this.
  • the editor 113 is arranged not only to calculate the position of the attachment point, but also the tangent at that position.
• the previous tangent value is retained, and the editor 113 calculates the angle through which the tangent has rotated (in other words, the angular difference between the old tangent and the present tangent) and derives the control point positions of the attached curve by firstly translating the point positions, as described above, by the positional shift undergone by the attachment point and then rotating the shifted positions about the attachment point by the same amount as the rotation undergone by the tangent at the attachment point, as shown in Fig. 40.
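The translate-then-rotate step of Fig. 40 corresponds to the following sketch, assuming tangent directions are expressed as angles in radians (the patent stores tangent end points, from which such angles are readily derived); the function name is invented for the example.

```python
import math

def shift_and_rotate(attached_pts, old_anchor, new_anchor,
                     old_tangent_angle, new_tangent_angle):
    """Translate an attached curve by the displacement of its
    attachment point, then rotate it about the new attachment point
    by the angle through which the parent's tangent has turned."""
    dx = new_anchor[0] - old_anchor[0]
    dy = new_anchor[1] - old_anchor[1]
    a = new_tangent_angle - old_tangent_angle
    ca, sa = math.cos(a), math.sin(a)
    out = []
    for x, y in attached_pts:
        x, y = x + dx, y + dy                        # translate
        rx, ry = x - new_anchor[0], y - new_anchor[1]
        out.append((new_anchor[0] + ca * rx - sa * ry,
                    new_anchor[1] + sa * rx + ca * ry))  # rotate
    return out
```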
  • an attached curve may be attached at two points, either to two different curves or to two portions of the same curve.
  • Figure 39 illustrates a case where a curve is attached at two points (in this case, at its curve control points at either end) to two other curves.
  • the two attachment points have been edited, and changed position. If the attached curve is to remain attached at both points, the new attached curve position data needs to be calculated taking account of changes to both of the curves to which it is attached.
  • the editor 113 first determines from the line tables 122 which curves, if any, within the image are doubly attached as shown in Fig.
• the first step is to determine the positions, on each of the curves to which it is attached, of the attachment attribute points as in the above embodiments. These two attachment point positions define between them a connecting straight line, shown in Fig. 39.
• the total length of the line between the two end points is calculated as the root of the sum of the squares of the x and y coordinate differences between the two attachment points.
  • the editor 113 calculates the minimum distance between each of the control point positions and tangent end point positions along the attached curve and the connecting line between the two points of attachment (in other words, the distance off the line of each point). This is readily calculated, as the minimum distance is the distance along a normal to the line which runs through the control point concerned. For each control point, the minimum distance value is temporarily stored.
• the editor 113 also calculates the distance along the line of the point where the normal through each control point on the attached curve meets the line (in other words, the distance along the line of each control or tangent end point), and divides this by the total line length to provide a fractional length measurement for each control point along the line.
  • the editor 113 calculates the coefficients m and c of the new line running through the new attachment points derived after editing or interpolation of the unattached lines, as shown in Fig. 39B. The total length of this new line is also calculated. Each control point position of the attached curve is then calculated by calculating the corresponding distance along the new line using the stored fractional distance for each control point multiplied by the new line length, and then locating the new control point position at the stored off-line distance along a normal from that point. It would alternatively be possible to store the distances off the line as fractions of the total line length, and for the editor 113 to multiply the fractions by the new line length, so as to preserve the attached curve shape without distortion (other than scaling).
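The along-the-line and off-the-line bookkeeping is captured by the sketch below, which stores the off-line distance in absolute units as in the main description (the fractional alternative mentioned above would divide it, and later re-multiply it, by the line length). The function names are invented for the example.

```python
import math

def line_relative(points, a, b):
    """Express each point as (fraction along, distance off) the
    straight line from attachment point a to attachment point b."""
    vx, vy = b[0] - a[0], b[1] - a[1]
    length = math.hypot(vx, vy)
    ux, uy = vx / length, vy / length   # unit vector along the line
    nx, ny = -uy, ux                    # unit normal to the line
    return [(((x - a[0]) * ux + (y - a[1]) * uy) / length,
             (x - a[0]) * nx + (y - a[1]) * ny) for x, y in points]

def reproject(rel_points, a, b):
    """Rebuild absolute positions against the new connecting line."""
    vx, vy = b[0] - a[0], b[1] - a[1]
    length = math.hypot(vx, vy)
    ux, uy = vx / length, vy / length
    nx, ny = -uy, ux
    return [(a[0] + f * length * ux + d * nx,
             a[1] + f * length * uy + d * ny) for f, d in rel_points]
```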
  • an initial straight line between the two attachment points is calculated as described above, but after editing (or interpolation), rather than calculating a new straight line, a curved line is calculated having the curvature at its ends dictated by the change in tangent angle at the new attachment point at either end.
  • the control point positions for the attached curve are then derived using the distances from and along this curved line.
  • the editor 113 derives initially the positions of the attachment points on each unattached curve and the tangent to the curve of each attachment point. The angle between the tangent to the curve at the attachment point and the connecting straight line is then calculated at each point. As before, the editor 113 then finds the new position for the attachment points, and also calculates the new tangent value to the unattached curve at each of the attachment points.
  • a new curve is then calculated to run between the two attachment points, having at each attachment point a tangent angle which is rotated from the tangent to the unattached curve by the same amount as was the straight line prior to editing, as shown in Fig. 42B.
  • the curved line may be calculated as a Bezier curve with tangent lengths of one third the length of the initial straight line.
  • any rotation of the tangent to the unattached curve causes an equal consequent rotation in the tangent at the end of the line running between the two attachment points.
  • the spacings of the control points of the attached curve along the new curved line connecting the two attachment points are used, as in the previous embodiment to locate the new control point positions.
  • the editor 113 is arranged to apply a spatial transformation to find the new control point positions as follows:
• the editor 113 finds a patch of space which includes all the control points for the attached curve prior to editing, shown as P1 in Fig. 43A.
• this is achieved by defining a rectangle having two sides parallel to the straight line connecting the two attachment points, and two ends normal to the two sides running through the attachment points, the two sides parallel to the connecting line being defined by the greatest extents of the curve away from the connecting line (in other words, the "bounding box" of the curve).
• the position of each control point within this area P1 is then found, by using the sides as coordinate axes, as a pair of coordinates (u,v), scaled as a fraction of the whole length of the respective side; thus, a point in the lowest corner has coordinates (0,0) and one in the highest corner has coordinates (1,1).
  • the rotation of the tangent to the unattached curve at each attachment point after amendment is also calculated.
• a new surface area P2 is then derived, in which the two end edges run through the attachment points and are normal to the new straight line running between the two attachment points.
  • the other two edges are parallel Bezier curves, running parallel to the curved connecting line, calculated (as described above with reference to Fig. 42) by rotating the ends of the connecting straight line to match rotations in tangent angle of the unattached curves of the attachment points.
• the editor 113 has therefore derived the coefficients which define two straight lines and two Bezier curves, which together define the area P2 after editing.
• the initial patch P1 is the same as in the previous embodiment but the newly calculated P2 is derived to have its two end edges straight, and each end edge is rotated by an amount corresponding to the rotation of the tangent to the unattached curve at the attachment point.
• the other two sides correspond to the curve running between the two attachment points calculated by rotating the tangent ends as described with reference to Figure 42. Since the two end sides of the patch P2 are now in general no longer parallel, the two sides are rotated and scaled as necessary to run between the ends.
  • the editor 113 prevents the breakage of parallel tangents at each control point (for example as before by transforming the tangent end points and interpolating the path control points in between the two).
  • alternative methods of transforming the control point positions to take account of changes in shape of the region including the control point may be employed.
  • a curve may be attached at more than two points; in this case, the attachments resolve to a series of either doubly attached curve segments or singly attached curved segments, and the editor 113 is preferably arranged to process each segment according to one of the above described embodiments in turn. This could lead to discontinuities at the attachment points, where tangents may be caused to become non-parallel.
  • This problem is relatively straightforward for a human operator to overcome by appropriate editing, or alternatively the editor 113 could be arranged to subsequently make tangents at connection points parallel one to another (for example at an averaged inclination).
  • the editor 113 may permit the user to define, at a control point, a flag indicating that that control point is not to be moved with the attached curve, but is to remain fixed in space.
  • a fixed point acts in effect like a further attachment point.
  • the fixed point position and the tangent at the fixed point are employed instead of the attachment point and attachment tangent in the above described embodiments; optionally the fixed point position may be used and the tangents ignored.
  • a further attached curve may be connected to an attached curve, and in this case since the position of the second curve depends upon that of the first it is necessary to first calculate the new curve control point positions of the first attached curve, and then from this those of the second curve are calculated by the editor 113 as described in the above embodiments.
  • the order in which attached curves are to be drawn therefore defines a hierarchy.
  • a curve should not be attached to itself since this would result in the position of the curve depending on prior knowledge of its position. Cycles in the hierarchy are therefore not permitted.
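The drawing-order constraint amounts to a topological ordering of the attachment hierarchy. The sketch below assumes, for simplicity, a single parent per curve (the eyebrow example below shows that a curve may in fact attach to more than one parent) and uses invented names throughout.

```python
def evaluation_order(curves, parent_of):
    """Order curves so that each curve is recomputed only after the
    curve it is attached to; raises on attachment cycles, which the
    hierarchy forbids."""
    order, state = [], {}        # state: 1 = visiting, 2 = done
    def visit(c):
        if state.get(c) == 2:
            return
        if state.get(c) == 1:
            raise ValueError("attachment cycle at " + repr(c))
        state[c] = 1
        if parent_of.get(c) is not None:
            visit(parent_of[c])
        state[c] = 2
        order.append(c)
    for c in curves:
        visit(c)
    return order

parents = {"body": None, "face": "body", "hair": "face",
           "jaw": "face", "eyes": "face", "pupils": "eyes"}
print(evaluation_order(parents, parents))
# ['body', 'face', 'hair', 'jaw', 'eyes', 'pupils']
```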
  • An example of an application of such a hierarchical structure is in the building of the face of a cartoon character.
  • the outline of the face is connected to the body.
  • the hair is connected to the top of the face outline.
  • the lower jaw outline is connected to the bottom of the face outline.
• the lines on the chin and on the bottom lip are connected to the lower jaw outline.
  • the nose and eyes are connected to the sides of the face outline, by spaced connections defining a distance between the outline and the eyes, at a single connection point, so that the eyes do not deform but merely move with changes in the face.
  • the pupils are connected to the eyes.
  • the eyebrows are connected to the forehead and to the eyes.
  • the other features may similarly be built up hierarchically. Thus, the movement of a line higher up in the hierarchy (for example on the body) causes the editor 113 to move all the lines lower down in the hierarchy (in the face) greatly reducing the effort required of the human animator.
  • an operator can specify a first subset of lines of a line image as being for direct manipulation and a second subset as being attached to the first subset, to be automatically manipulated by the editor 113 on manipulation of the first subset for example using the above embodiments.
  • the first and second subsets are processed differently by the line image generator 111.
• the line image generator 111 is operable in two modes according to a user's preference (selected by operating the keyboard 170b to enter a mode change command, or by positioning the input device 170a at an appropriate portion of the display); a first mode in which only the first subset of lines (blue pencil lines) is generated on the display 160 and a second mode in which the second subset (and, optionally, the first) is displayed.
  • the line image generator 111 checks the line table 122 of each curve and in the event that the flag 901 is set to indicate that the curve is attached, the line image of that line is not generated in the first ("blue pencil") mode.
• even in the first mode, the display editor 113 updates the curve tables for the undisplayed attached curves.
• after a frame has been edited in the first mode (for example, in creating a key frame), it may be displayed in the second mode (by the animator, or an assistant animator or clean up artist) in which the attached curves are visible as in Fig. 47B, and any minor amendments to the attached curve positions may be made as described above using the display editor 113.
• three levels of curves may thus be provided: a framework level, in which a skeletal first subset of curves is manipulated; an outline level, in which a character outline (the curves of which are attached to the skeletal curves) may be edited; and a detailed level, in which detailed curves attached to the outline curves or the skeletal curves may be edited.
  • Each of these three subsets of curves may be separately displayed in a corresponding display, or there may be provided three modes in which only the first set, the first and second sets, and all three sets of curves are displayed.
• an animator may edit a frame to provide a simple large-scale motion, such as movement of a leg, by merely manipulating one framework curve; the editor 113 consequently changes the shape of the outline and the detail attached to the outline without human intervention.
• this aspect of the invention lies in providing separate display modes, in a first of which one set of lines is manipulable so as to affect the appearance of a second set of lines displayed in a second mode.
• although this aspect has been described with reference to the method of curve attachment described above, it could also be implemented using prior art techniques of curve grouping, for example, although this is not preferred. Likewise, image portions other than lines could be displayed and manipulated.

Abstract

Apparatus for generating an output image which comprises means for displaying said images as a plurality of lines; store means for storing line data related to said lines; line generating means for generating said displayed lines from said stored data; and editing means for editing said stored data so as to change the appearance of said display; characterised in that said lines comprise a first subset of lines and a second subset of lines, and in that the generator means is arranged to generate a display of the first subset of lines which is different to the display of the second subset of lines, and in that the editing means, after editing stored data of said first subset, is arranged to make consequential amendments of stored data of said second subset prior to displaying said second subset.

Description

ANIMATION
FIELD OF THE INVENTION
This invention relates to apparatus for, and a method of, producing a sequence of images defining an animated sequence, such as a cartoon.
DESCRIPTION OF THE BACKGROUND ART
Traditionally, cartoons are manually drawn as a sequence of frames which, when played in succession at relatively high speed, form a moving picture (typical frame rates are 24, 25 or 30 frames per second, although sometimes frames are repeated twice). Even for a short sequence, many thousands of frames thus need to be drawn by hand and production of the hand drawn frames requires large teams of skilled animators and assistance. Almost all cartoon animation today is still produced in this way.
Some essential steps in production of cartoon animations are:
1. "key frame" production, in which a senior animator draws each character at significant points throughout the sequence;
" in betweening", in which more junior (less skilled) animators create the missing intermediate frames by a process of interpolating by eye between adjacent key frames, and then
3. "line testing", in which the sequence of key frames and in between frames are recorded on film or video tape and then replayed to check for other errors. If necessary, frames are redrawn at this point; otherwise, the pencil drawings are then transferred to clear cel, inked in, painted in the required colours, placed on the required background, and recorded on film or video.
In view of the sheer volume of drawings required, and of the time and expense involved in producing cartoons by this method, some attempts have been made to automate parts of the process. Inking and colouring has successfully been automated, resulting in some savings in manpower and time.
It has also previously been proposed to automate the interpolation or "in betweening" stage. In such proposals, the key frames produced by the senior animator are scanned in some manner by input means into an image processor such as a programmed computer, where an internal representation of each is stored. Corresponding points or lines, or other parts of two key frames, are identified manually and in between images are generated by producing a sequence of frames in each of which a similar set of points, lines or parts are generated by interpolation between those of two adjacent stored key frames. The remainder of the frame between the identified points or parts is then generated.
Such proposals have been uniformly unsuccessful, however, because the problem of identifying corresponding parts in two key frames derived from the original picture material is extremely difficult. Two key frames drawn by the same artist may appear similar to the human eye, but every point of the two line drawings may be different, and the image processing apparatus is unable to distinguish between differences which correspond to motion or intentional change, and are hence to be interpolated, and those which are merely accidental. Our earlier international application GB91/02122 and corresponding U.S. application serial no. , claiming priority from UK applications 9110945.4, 9117409.4 and others (all these applications being incorporated herein in their entirety by reference) describes an animation system which alleviates many of the problems of the prior art.
In the above described steps in conventional production of cartoon animation, the senior animator often draws only a rough guide to the shape of the character for each key frame. The rough shape includes only essential outlines, and may be drawn in blue pencil. By comparing a succession of such blue pencil drawings, the animator can construct the overall motion of the character.
Once the senior animator has done so, an assistant adds the detail features (for example facial features) to each frame (key frames and in between frames) and produces a black line image including all the detail required. The image including the blue pencil lines and the fully dark lines is then copied using a copier which does not reproduce the blue lines.
Since 1500 pictures are required for every minute of animation, of which perhaps 300 will be keyframes, a vast number of re-posed versions of the same character need to be drawn in total.
In our above referenced international application, a host of lines for each character may be shown in each key frame, on the editing display used by the animator to create the key frames, and this large number of lines needs to be edited each time a new key frame is produced. This can be inconvenient, and wasteful of the valuable time of the animator. Accordingly, in one aspect, we provide an animation system in which the frames to be edited are displayed as line images for the animator to edit, in which each line image comprises a first subset of lines which are displayed in a first manner and are manipulable for editing, and a second subset of lines which are displayed in a different manner or are not displayed, in which the animation system changes the positions of the second subset of lines in accordance with changes to the positions of the first subset.
For example, the second subset of lines may be selectively not displayed, to reduce the complexity of the image for editing.
Preferably, each line of the second subset is linked to at least one line of the first subset so that movements of that line of the first subset lead to specified movements of the corresponding linked line of the second subset.
The first subset of lines can be viewed as corresponding to the "blue pencil image" produced by animators, and the second subset as corresponding to the detail added by the assistant or "clean-up artist". However, the invention reduces the work involved in animation by moving the detail lines with the "blue pencil" lines, so that the amount of editing necessary to add the detail lines to each keyframe is greatly reduced.
In such an animation system, the animator can therefore edit one keyframe to create a new keyframe, by merely changing the curves of the first subset. In a preferred embodiment, all keyframes are derived from a common template frame, which includes all lines which will be used in the keyframe, and the animator can copy the template and edit the lines of the first subset to create a new keyframe.
It is known to provide computer graphics apparatus in which a number of lines of the line image are moved together. One type of such apparatus treats all such lines lying within a given rectangular area or field together and applies a transformation to all the lines. For example, the area may be defined by separately defined bounding edges and all the lines within the area may be affected by distorting the edges, as if they lay on the surface of a rubber sheet. However, lines cannot be directly manipulated, as the edges are used to control the distortion. Also, lines within the area cannot be separately treated. In a simpler system, a number of curves are designated as a single group, and are all moved or subjected to some other simple transform together. In this simpler system, changes of line shape (important for animation) are not possible.
In another aspect, the present invention provides apparatus for generating an animated sequence of pictures, which comprises means for storing data defining a plurality of pictures as, for each picture, data defining a plurality of lines, the apparatus storing first line data defining the positions of a first subset of said lines within the picture, second line data for defining the position of a second subset of lines in the picture and link data indicating a correspondence between a second line and at least one first line, the apparatus further comprising means for amending the first line data, and for generating the second line data in dependence upon the amendments to the first line data and on the linking data. Other preferred aspects and embodiments of the invention are as described or claimed hereafter, with advantages that will be apparent from the following: singly or in combination, these embodiments enable the provision of an automatic animation system for character animation which is fast and straightforward to use.
Equally, it will be appreciated that many aspects of the invention may be employed in fields other than animation (such as computer aided design), or even in the production of single still images.
The invention will now be described, by way of example only, with reference to the accompanying drawings, in which:
BRIEF DESCRIPTION OF DRAWINGS
Figs. 1a-e illustrate curve approximations;
Figs. 2a and 2b illustrate the effect of varying the control variables used in parametric cubic curves;
Fig. 3 is a block diagram of apparatus according to one embodiment of the invention;
Fig. 4 is a block diagram of apparatus according to a further embodiment of the invention; Fig. 5 is a block diagram of apparatus according to yet further embodiments of the invention;
Fig. 6a illustrates the information stored in a memory of the apparatus of these embodiments to represent the curve shown in Fig. 6b; Fig. 7 is a block diagram schematically illustrating the operation of these embodiments in generating a display; Fig. 8 is a flow diagram schematically illustrating the operation of the apparatus of Fig. 7;
Fig. 9 is a block diagram schematically illustrating the operation of editing the data shown in Fig. 6a;
Fig. 10 is a flow diagram schematically showing the process of operation of the apparatus of Fig. 9;
Fig. 11 is a flow diagram showing schematically the sequence of operations undertaken by a user of the apparatus of the above embodiments;
Fig. 12 is a block diagram indicating schematically the manner in which data is stored within the memory as a linked list;
Figs. 13a-13e provide greater details of the data stored within the elements of Fig. 12; Fig. 14 shows in greater detail the arrangement of data stored in the memory corresponding to a displayed picture;
Fig. 15 shows schematically the arrangement of the information of Fig. 14 within the memory as a linked list;
Fig. 16 is a block diagram illustrating schematically the contents of an image store in the above embodiments; Fig. 17 shows schematically the arrangement of display areas of a display device in the above embodiments;
Fig. 18 shows the appearance of related displays in the display areas;
Figs. 19a-19c show alternative display formats;
Fig. 20 illustrates the appearance of the display during editing;
Fig. 21 shows schematically the appearance of a further display area in the above embodiments; Fig. 22 shows schematically a method of interpolating animated frames in one embodiment of the invention;
Fig. 23 shows schematically a part of the method of Fig. 22;
Fig. 24 shows schematically a method of converting a key frame to an interpolated frame in that embodiment; Fig. 25 shows schematically a method of deleting a key frame in that embodiment;
Fig. 26 shows schematically a method of moving a key frame in that embodiment;
Figs. 27a-d show schematically the results displayed on the monitor corresponding to the operations of Figs. 22-26; Figs. 28a-d show schematically the effects on the monitor 160 of converting between a point on a curve and a curve control point; Figs. 29a-d show corresponding amendments to the contents of the memory 121;
Fig. 30 shows schematically a method of preparing frames for interpolation or addition;
Fig. 31 shows schematically a method of adding frames together to produce a composite frame; Fig. 32 shows schematically a method of deriving curve point positions in interpolated or added frames; and
Fig. 33 shows illustratively the functional elements of the invention, provided in this embodiment by a single processor;
Fig. 34 shows schematically the appearance of connected lines on the display in an embodiment of the invention;
Fig. 35 shows schematically the arrangement of data within the memory 120 corresponding to the connected lines of Fig. 34; Fig. 36 shows schematically a method of generating the display of Fig. 34;
Fig. 37 corresponds to Fig. 34 and shows schematically the effect of a first method according to Fig. 36;
Fig. 38 shows schematically the effect of a second method according to Fig. 36;
Fig. 39 shows schematically the appearance of a line connected to two others on a display in a further embodiment, before and after the process of Fig. 36; Fig. 40 shows in greater detail a part of the process of Fig. 36 producing the effect shown in Fig. 37;
Fig. 41 shows in greater detail a part of the process of Fig. 36 generating the display shown in Fig. 39;
Fig. 42 shows the display produced by an alternative method to that of Fig. 41, according to a further embodiment of the invention;
Fig. 43 shows the effect of a yet further method, according to an alternative embodiment of the invention; and Fig. 44 shows the display produced by a yet further embodiment of the invention employing a method alternative to that of Fig. 41;
Fig. 45 illustrates a display comprising a line connected to multiple other lines;
Fig. 46 illustrates a display provided by a yet further embodiment of the invention; Fig. 47A illustrates the display produced in a first mode according to a further aspect of the invention; and
Fig. 47B illustrates the display produced by a second mode according to that aspect;
Fig. 48 is a flow diagram showing schematically the sequence of operations performed by the aspect of the invention illustrated in Fig. 47.
PARAMETRIC CURVES
Before discussing the invention in detail, a brief description of parametric curves will be given; such curves form part of the common general knowledge of the skilled worker, and are referred to in, for example, "Interactive Computer Graphics", P Burger and D Gillies, 1989, Addison-Wesley, ISBN 0-201-17439-1, or "An Introduction to Splines for Use in Computer Graphics and Geometric Modelling", by R H Bartels, J C Beatty and B A Barsky, published by Morgan Kaufmann, ISBN 0-934613-27-3 (both incorporated herein by reference).
Referring to Fig. 1A, a fairly smooth freehand curve is shown. Referring to Fig. 1B, one way of representing the curve would be to draw a series of straight line segments, meeting at points. However, the number of straight line segments has to be large, as illustrated in Fig. 1C, before the simulation is at all convincing.
Alternatively, the curve may be represented as a series of curve segments running between points. If, as in Fig. 1D, adjacent curve segments have the same slope at the point at which they join, the curve can be made smooth.
One well known type of curve approximating technique employs a cubic curve in which the coordinate variables x and y are each represented as a third order or cubic polynomial of some parameter t. Commonly, the value of the parameter is constrained to lie between 0 and 1. Thus, each curve segment is described as:

x = ax t^3 + bx t^2 + cx t + dx (1)

y = ay t^3 + by t^2 + cy t + dy (2)
Each segment has two end points, at which t = 0 and t = 1. The coordinates of the t = 0 end point are therefore x0 = dx, y0 = dy, and those of the t = 1 point are given by:

x1 = ax + bx + cx + dx (3)

y1 = ay + by + cy + dy (4)

At the end points, the slope of the curved segment is also fixed or predetermined so that each segment can be matched to its neighbours to provide a continuous curve if desired. The shape of the curve between the end points is partially dictated by the slopes at the end points, but also by a further item of information at each point which is conveniently visualised as the length of a tangent vector at each point. The curve between the two points may be thought of as being clamped at the end points, at fixed slopes thereat, whilst the tangent vector exercises a pull on the direction of the curve which is proportional to its length, so that if the tangent vector is long the curve tends to follow the tangent over much of its length. The tangent vector may be derived from the above equations (1)-(4) and vice versa; for example, where the end of the Bezier tangent vector at the t = 0 point has coordinates x2, y2, and that at the t = 1 point has coordinates x3, y3, the coefficients a, b, c, d are given by:

dx = x0 (and likewise dy = y0) (5)

bx = 3(x0 - 2x2 + x3) (and likewise by) (6)

cx = 3(x2 - x0) (and likewise cy) (7)

ax = 3x2 - x0 - 3x3 + x1 (and likewise ay) (8)
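The relationships (5)-(8) translate directly into code. The following is a minimal sketch (the function and variable names are ours, not the application's), using the same naming as the equations above:

```python
def bezier_to_cubic(x0, x2, x3, x1):
    """Convert Bezier control values to the cubic coefficients (a, b, c, d).

    x0 and x1 are the t = 0 and t = 1 end point values; x2 and x3 are the
    corresponding Bezier tangent end values (equations (5)-(8)). The same
    function serves for the y (or z) coordinate.
    """
    d = x0                          # equation (5)
    b = 3 * (x0 - 2 * x2 + x3)      # equation (6)
    c = 3 * (x2 - x0)               # equation (7)
    a = 3 * x2 - x0 - 3 * x3 + x1   # equation (8)
    return a, b, c, d


def cubic(a, b, c, d, t):
    """Evaluate a t^3 + b t^2 + c t + d for 0 <= t <= 1 (equations (1), (2))."""
    return ((a * t + b) * t + c) * t + d
```

For example, bezier_to_cubic(0, 0, 1, 1) yields coefficients for which cubic(..., 0) returns 0 and cubic(..., 1) returns 1, consistent with equations (3) and (4).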
The differential of the curve equation with respect to the variable t is:

dx/dt = cx + 2bx t + 3ax t^2 (9)

and likewise for y.
The differential values at the t = 0 and t = 1 points are, respectively:

at t = 0: dx/dt = 3(x2 - x0) = cx and dy/dt = 3(y2 - y0) = cy;

at t = 1: dx/dt = 3(x1 - x3) = cx + 2bx + 3ax and dy/dt = 3(y1 - y3) = cy + 2by + 3ay.
From these equations, by inspection, it will be seen that the length of the tangent to the Bezier control points (x2, y2), (x3, y3) is 1/3 that of the actual tangent vector. Although the actual tangent vector could be employed, it is mathematically more convenient to employ the Bezier tangent vector (which has the same direction but 1/3rd the magnitude).
In the so called Hermite form of a cubic equation, the data used to define a curve segment are the coordinates of the end points, the slope of the tangent vector at each end point, and the length of each tangent vector. In the Bezier format, the data used to define a curve segment are the coordinates of the end points, and the coordinates of the end of each tangent vector. Conversion between the Hermite and Bezier formats is merely a matter of polar to rectangular conversion, and vice versa.
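That polar/rectangular conversion can be sketched as follows (a hedged illustration; angles are in radians, and the factor of three reflects the 1/3rd relationship noted above):

```python
import math

def hermite_to_bezier(px, py, angle, length):
    """From an end point, tangent slope (angle) and Hermite tangent length,
    compute the Bezier tangent end point coordinates."""
    # The Bezier tangent end lies 1/3 of the Hermite tangent length away.
    return (px + (length / 3.0) * math.cos(angle),
            py + (length / 3.0) * math.sin(angle))


def bezier_to_hermite(px, py, ex, ey):
    """From an end point and its Bezier tangent end point, recover the
    tangent angle and the Hermite tangent length."""
    dx, dy = ex - px, ey - py
    return math.atan2(dy, dx), 3.0 * math.hypot(dx, dy)
```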
Fig. 2A shows the effect of varying the magnitude or lengths of the tangent vectors, whilst keeping their angle constant. It will be seen that the effect is to "pull" the curve towards the tangent vector, more or less strongly depending on the length of the tangent vector. Fig. 2B shows the effect of varying the angle of the tangent vector whilst keeping its magnitude fixed.
Other types of cubic curve are also known; for example, the B-spline, which is defined by two end points and a plurality of intervening control points through which the curve does not pass. However, the Bezier curve description is used in many applications because it is relatively easy to manipulate; for instance, in matching an approximated curve to an existing curve, the coordinates and tangent angles at points along the curve can directly be measured and employed. The PostScript command language used to control many laser printers employs this curve description, accepting values defining the coordinates of curve segment end points and the coordinates of corresponding tangent end points.
In general, a smooth curve is defined by a number of such end points, and two adjacent such segments will share a common end point. If the curve is to be smooth, the tangent angles defined at the end point in relation to each curve segment will be equal, although the tangent vector lengths will in general not.
However, as shown in Fig. 1e, it is possible to represent a line with a curvature discontinuity by providing that the tangent angle at an end point is different for each of the two segments it defines.
For present purposes, the main usefulness of this form of curve representation is that a smooth, bold curve can be defined using only a small number of coefficients or control points, and parts of it can be amended without extensive recalculation of the whole line.
A 3 dimensional cubic spline is specified if each point has 3 dimensional coordinates; corresponding equations in z to those above are derivable by inspection.
Apparatus for performing the invention will now be described.
GENERAL DESCRIPTION OF APPARATUS
Referring to Fig. 3, apparatus according to an embodiment of the invention comprises a computer 100 comprising a central processing unit 110, a memory device 120 for storing the program sequence for the CPU 110 and providing working read/write memory, a frame store 130 comprising a series of memory locations each associated with, or mapped to, a point in an image to be generated or processed, and an input/output controller 140 providing input and output ports for reading from and writing to external devices, all intercoupled through common parallel data and address buses 150.
A monitor 160 is connected to the computer 100 and its display updated from the frame store 130 under control of the CPU 110. At least one user input device 170a, 170b is provided; typically a keyboard 170b for inputting commands or control signals for controlling peripheral operations such as starting, finishing and storing the results of an image generation or image processing operation, and a position sensitive input device 170a such as, in combination, a stylus and digitising tablet, or a "mouse", or a touch sensitive screen on the monitor 160, or a "trackerball" device or a joystick. A cursor symbol is generated by the computer 100 for display on the monitor 160 in dependence upon the signal from the position sensitive input device 170a to allow a user to inspect an image on the monitor 160 and select or designate a point or region of the image during image generation or processing. A mass storage device 180 such as, for instance, a hard disk device is preferably provided as a long term image store, since the amount of data associated with a single image stored as a frame at an acceptable resolution is high. Preferably, the mass storage device 180 also or alternatively comprises a removable medium storage device such as a floppy disk drive or a high capacity tape drive, to allow data to be transferred into and out from the computer 100.
Also preferably provided, connected to the input/output device 140, is a printer 190 for producing a permanent visual output record of the image generated. The output may be provided on a transparency or on a sheet of paper. A film recorder 196 and/or a video recorder 197, and means for generating a suitably formatted moving picture output comprising a succession of frames, are also preferably provided.
A picture input device 195 such as a scanner for scanning an image on, for example, a slide, and inputting a corresponding video signal to the computer 100 may also be provided.
Referring to Fig. 4, an animation system in one embodiment of the invention comprises the computer 100 of Fig. 3 providing an animator's workstation, and arranged to execute three different stored sequences so as to comprise an interpolator 101, a replayer 103 and a renderer 105. The interpolator, with which the present invention is chiefly concerned, is arranged to generate sequences of image frames. The replayer 103 is arranged to recall a stored image sequence previously created, and generate a display of the sequence as a moving image on the animator's workstation monitor 160. The renderer 105 is arranged to colour the image, and may also affect the way in which the lines are represented (for example, their thickness). The renderer 105 preferably operates as disclosed in our earlier application 9110945.4, our earlier PCT Application PCT/GB91/02124, and our PCT application PCT/GB92..... filed on the same day as the present application and claiming the same priorities (Agents Reference 5130799).
Referring to Fig. 5, in an embodiment of the invention for larger scale animation production, there may be a plurality of workstations 110a-110c allowing different users to develop different parts of a given animated sequence, sharing a common mass storage (file server) unit 180 such as a disk drive unit with a controlling processor, and connected thereto via a local area network (LAN) such as Ethernet.
Since the rendering process usually requires many more pixel calculations per frame than the interpolation process according to the invention, it may be advantageous to provide separate processors (typically, a smaller number) 110d-110f for performing the rendering operation, interconnected to the workstations 110a-110c via the local area network. This enables use to be made of simpler computers for the animator workstations 110a-110c; they may, for example, lack maths coprocessor devices and/or sophisticated graphics engines. Alternatively, the same processors may act either as rendering processors or workstations, depending on demand. In this case, control means may be provided for determining the current processing load on each processor and for allocating rendering or workstation tasks to processors connected to the network so as to manage (e.g. balance) the processing load.
One example of a suitable computer 100 for implementing the above embodiments of Figs. 3 and 4 is the NeXTCUBE computer including the NeXTdimension colour board, available from NeXT Computer, Inc., USA. This arrangement provides direct formatted outputs for connection to a videocassette recorder or other video storage device, and accepts video input signals. Further, it includes means for compressing images for storage on a disk store 180, and for decompressing such stored images for display.
In this embodiment of the invention, display frames, consisting of line drawings of objects, are created and/or edited with reference to stored control point data (preferably data stored in the Bezier format referred to above). In other words, a stored representation of a display frame comprises a plurality of control points which define line segments which make up a line representation. Referring to Fig. 6a, the memory 120 includes a working memory area 121 to which data may be written (for example, a random access memory area). Referring to Fig. 6b, an image displayed on the monitor 160 includes at least one line A, which is drawn as a cubic curve defined by three control points A1, A2, A3. Corresponding image frame data representing the line image is stored within a frame table 122 within the working memory 121, as a series of curves (curve 1, curve 2 etc) each of which is defined by a series of control points (point 1, point 2, point 3). Each control point is represented by data comprising positional data (xi, yi) representing the position within the area of the display of that control point, and tangent data (xei, yei, xfi, yfi) defining two tangent end points associated with the curved segments on either side of the control point. The tangent extent point data (xei, yei, xfi, yfi) are stored as position data x, y defining the position of the tangent end point. It would also be possible to store instead the x, y offsets from the control point position.
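A data structure along the following lines could hold the frame table just described. This is a sketch only, under our own naming assumptions (the application itself prescribes no particular programming representation):

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class ControlPoint:
    x: float            # positional data (xi, yi)
    y: float
    xe: float           # tangent end point for the segment on one side
    ye: float
    xf: float           # tangent end point for the segment on the other side
    yf: float

@dataclass
class Curve:
    # curve control points (point 1, point 2, point 3, ...)
    points: List[ControlPoint] = field(default_factory=list)

@dataclass
class FrameTable:
    # the series of curves (curve 1, curve 2, ...) making up frame table 122
    curves: List[Curve] = field(default_factory=list)
```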
The monitor 160 is usually of the raster scanned type and consequently expects a raster scanned image, which is supplied from a memory mapped image store 130 as discussed above. Accordingly, it is necessary to provide a line display generating means 111 arranged to read the stored data representing the curve segments making up the frame, and generate corresponding raster image data comprising a plurality of pixels for storage in the image store 130. Each pixel need only comprise a single data bit or a small number of bits, if the display is monochrome black/white.
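One straightforward way of generating such raster data is to step the parameter t along each curve segment and mark the nearest pixel. A simplified sketch follows (the fixed step count is our assumption; a production version would choose it from the on-screen length of the segment, and bounds checking is omitted):

```python
def draw_segment(image, ax, bx, cx, dx, ay, by, cy, dy, steps=100):
    """Set to "dark" (1) each pixel of a monochrome image store lying on
    one cubic segment (equations (1) and (2)); all other pixels remain
    "bright" (0). `image` is a 2-D array of single-bit pixel values."""
    for i in range(steps + 1):
        t = i / steps
        px = round(((ax * t + bx) * t + cx) * t + dx)
        py = round(((ay * t + by) * t + cy) * t + dy)
        image[py][px] = 1
```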
The line display generator 111 shown in Fig. 7 is accordingly arranged to access the memory 122 to read the stored data, and the image store 130 to write pixel data. As shown in Fig. 8, it calculates intervening point positions, and sets those memory locations within the image store 130 which correspond to pixels lying on the curve to "dark" and all those which do not to "bright". The contents of the image store 130 are then displayed on the monitor 160. In practice, the line display generating means 111 comprises the CPU 110 operating under control of the programme stored in a programme store area in the memory 120. Where the computer 100 comprises the above mentioned NeXT Computer, the line display generating means 111 may comprise the CPU 110 operating under the "PostScript" display command language provided within the operating system. The manner in which some basic operations are performed by the above apparatus will now be discussed.
EDITING A FRAME
As will be discussed in greater detail below, the preferred embodiments of the invention provide means for enabling a user to edit a frame. Editing a frame may involve either modifying the trajectory of existing lines or (more rarely) adding new lines. It is therefore necessary both to amend the data held in the frame table 122, and desirably to amend the image data in the image store 130 so as to enable the user to view the effects of the change. It is found that the best way of providing the user with means for amending the frame data stored in the table 122 is to allow him to employ a position sensitive input device 170a, so as to appear to directly amend the displayed representation of the frame on the screen monitor 160.
DEFINING AND EDITING A CURVE
In this embodiment, referring to Fig. 9, a user manipulates the position sensing input device 170a, for example a "mouse", by moving the device 170a so as to generate a signal indicating the direction and extent of the movement. This signal is sensed by the device input/output controller 140, which provides a corresponding signal to a cursor position controller 112 (in practice, provided by the CPU 110 operating under stored program control) which maintains stored current cursor position data in x,y co-ordinates and updates the stored cursor position in accordance with the signal from the device input/output controller 140. The cursor position controller 112 accesses the image store 130 and amends the image data corresponding to the stored cursor position to cause the display of a cursor position symbol D on the display shown on the monitor 160. The user may thus, by moving the input device 170a, move the position of the displayed cursor position symbol D. In a preferred embodiment, the display line generator 111 is arranged in the editing mode not only to write data corresponding to the line A into the image store 130, but also to generate a display of the control point data. Accordingly, for each control point A1, A2, the display generator 111 writes data representing a control point symbol (for example, a dark blob) into the image store 130 at address locations corresponding to the control point co-ordinates x,y.
Further, the display generator 111 preferably, for each control point, correspondingly generates a second control point symbol E1 (or two such symbols) located relative to the point A1 along a line defined by the control point tangent data, at a position xe1, ye1 and/or xf1, yf1; preferably, a line between the two points A1 and E1 is likewise generated to show the tangent itself.
To enter a new curve A, the user signals an intention so to do (for example by typing a command on the keyboard 170b, or by positioning the cursor symbol at a designated area of a displayed control menu), positions the cursor symbol D at a desired point on the display 160 by manipulating the position sensitive input device 170a, and generates a control signal to indicate that the desired point has been reached. The cursor position controller 112 supplies the current cursor position data to the frame table 122 as control point position co-ordinates, and the display generator 111 correspondingly writes data representing a control point symbol into the image store 130 at address locations corresponding to the control point co-ordinates. The user then inputs tangent extent point information, for example via the keyboard 170b, or in the manner described below. When a second path control point has been thus defined and stored in the table 122, the supervisory image generator 111 will correspondingly generate the line segment therebetween on the supervisory display by writing the intervening image points into the image store 130.
Referring to Fig. 10, to amend the shape or path of the line A displayed on the supervisory display, a user manipulates the input device 170a to move the cursor position symbol D to coincide with one of the control point symbols A1 or E1 on the display 160. To indicate that the cursor is at the desired position, the user then generates a control signal (for example, by "clicking" a mouse input device 170a). The device input/output controller 140 responds by supplying a control signal to the cursor position controller 112.
The cursor position controller 112 supplies the cursor position data to a supervisory display editor 113 (comprising in practice the CPU 110 operating under stored program control), which compares the stored cursor position with, for each point, the point A position (X, Y) and the point E position (xe, ye).

When the cursor position is determined to coincide with any point position A or tangent end position E, the display editor 113 is thereafter arranged to receive the updated cursor position from the cursor controller 112 and to amend the point data corresponding to the point A1 with which the cursor symbol coincides, so as to move that point to track subsequent motion of the cursor. If the cursor is located at the point A1 on the curve A, manipulation by a user of the input device 170a amends the position data (X1, Y1) in the line table 122, but leaves the tangent data (xe1, ye1) unaffected. If, on the other hand, the cursor is located at a tangent end point E1, manipulation by a user of the input device 170a alters the tangent end point data in the frame table 122 within the memory 120, leaving the control point position data (X, Y) unaffected.
In either case, after each such amendment to the contents of the line table 122, the display generator 111 regenerates the line segment affected by the control point in question within the image store 130 so as to change the representation of the line on the monitor 160.
Once a line has been amended to a desired position, the user generates a further control signal (e.g. by "clicking" the mouse input device 170a), and the supervisory display editor 113 thereafter ceases to amend the contents of the memory 120. The cursor controller 112 continues to update the stored cursor position.
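The coincidence test performed by the display editor 113 amounts to a proximity search over the stored points. A sketch, reusing the FrameTable structure from the earlier sketch (the pick radius is our assumption, not a value taken from the application):

```python
def pick(frame, cx, cy, radius=4.0):
    """Return (point, part) for the first control point whose position or
    tangent end lies within `radius` pixels of the cursor at (cx, cy),
    or None if the cursor coincides with nothing."""
    r2 = radius * radius
    for curve in frame.curves:
        for p in curve.points:
            if (p.x - cx) ** 2 + (p.y - cy) ** 2 <= r2:
                return p, "position"   # dragging amends (x, y) only
            if (p.xe - cx) ** 2 + (p.ye - cy) ** 2 <= r2:
                return p, "tangent"    # dragging amends tangent data only
    return None
```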
This method of amending the line representation is found to be particularly simple and quick to use.
GENERAL DESCRIPTION OF PROCESSES FOR 2-D ANIMATION
The processes performed by the apparatus of the preferred embodiments of the invention to enable a user to define an animated sequence are:
1. Defining Objects to be Animated - for example, characters. As will be disclosed in greater detail below, the apparatus of this embodiment permits the definition of a topological representation of a character or object to be animated.
2. Defining Key Frames - image frames in which the character previously defined is represented in a particular shape, orientation or position are defined, corresponding to spaced apart frames of an animated sequence.
3. Creating Interpolated Frames - from the key frames created above, a plurality of intervening frames is created in which the object is progressively manipulated.
4. Displaying/Editing - the sequence of key frames and interpolated frames, or a representation thereof, is displayed and may be edited.
5. Replaying - the sequence of frames is successively displayed at a display rate corresponding to a video image (24, 25 or 30 frames per second), to enable the user to view a representation of the animated sequence. The sequence may be replayed with an associated sound track, to assess the correctness of timings and synchronisation.
6. Rendering - frames or sequences are coloured and/or shaded, and/or mixed with a desired background, to produce a finished video sequence.
GENERAL OVERVIEW OF 2-D SYSTEM OPERATION
One typical sequence of operations of this embodiment is shown in Fig. 11. Initially, the user will wish to create a character or object to animate. The shape of the object will be changeable, but its underlying topology is maintained constant and the user will therefore create an initial "template" or set of data storing this underlying topology. The template is a view of the character or object which includes all the lines (and, therefore, is defined by all the control points) which it is desired to show in later pictures of the character or object. The template picture or frame is created on the monitor 160, preferably using the position sensitive input device 170a (for example "mouse") as described in greater detail below. As discussed in greater detail below, the template frame may comprise a first subset of curves stored with data indicating that they are primary, "blue pencil" or unattached curves and a second subset stored with data indicating that they are secondary "black pencil" or attached curves.
At this stage, it may be desirable to store the template data (the curve control points, together with identification data labelling the template) permanently, on the mass storage device 180. Equally, rather than creating a template anew, the user may summon a stored template from mass storage 180.
The next stage may be to create a number of key frames. As is the case with hand produced animations, key frames are frames spaced apart in time which include some change of shape or position of the character or object to be animated. Each key frame therefore has corresponding data identifying the point in the animated sequence at which the key frame occurs. Key frames may be produced directly from the template to which they correspond, by copying the control point data making up the template and then editing the copied control point data to cause the key frame to diverge from the template. The editing is preferably performed interactively, using as above the position sensitive input device 170a, and viewing the effects of the editing on the monitor 160. The edited control point data then comprises the key frame data. A key frame may likewise be produced by copying an existing frame; in this case it will be an indirect copy of the template frame.
As discussed in greater detail below, the keyframe creation may be performed by first manipulating the primary curves and then, if necessary, re-editing the secondary curve position generated automatically by the system. In the primary curve editing, only the primary curves are displayed and edited on the monitor 160.
At this point it may be convenient to save the key frame data to the mass storage device 180. Preferably, the key frame control point data comprise offset data defining the difference between the data of a given key frame and the corresponding data in the template. Thus, when the template is amended, the key frames need not be individually amended. Other advantages of this representation are discussed below.
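The effect of the offset representation can be seen in a two-line sketch (our illustration; the names are hypothetical):

```python
def key_frame_point(template_xy, offset_xy):
    """Reconstruct a key frame control point position as the template
    position plus the stored offset; since only the difference is stored,
    amending the template automatically shifts every derived key frame."""
    return (template_xy[0] + offset_xy[0], template_xy[1] + offset_xy[1])

# e.g. a template point at (100, 40) with a stored offset of (12, -3)
# yields the key frame point (112, 37); moving the template point moves
# the key frame point with it, the offset being unchanged.
```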
The key frames thus generated, or key frames recalled from the mass storage device 180, may then be processed to derive the intervening frames (interpolated frames). Each interpolated frame comprises, as above, a set of control points defining the curves or lines making up the image frame. Each control point of each interpolated frame is derived so as to lie between the corresponding control points of the pair of key frames on either side of it. The number of interpolated frames depends upon the separation in time of the two key frames between which the interpolation is performed.
The user may next view the interpolated sequence. Typically, key frames are separated by less than one second, or less than 30 interpolants (although greater separations are of course possible) and it is therefore possible to provide a display including several key frames and the interpolants lying therebetween simultaneously on the screen of the monitor 160. At this point, the user may store the sequence of interpolated frames in mass storage 180, or may wish to amend the sequence in some manner.
A first type of amendment comprises changing the time occurrence of the key frame; in this case, the key frame itself is not redrawn but the number of interpolants will change and consequently the interpolation must be repeated. Alternatively, the user may wish to edit a key frame. Finally, he may (as discussed below) decide that a sequence cannot be directly interpolated and that therefore a new key frame needs to be inserted between two existing key frames; this may be achieved by converting an interpolated frame into a key frame (as discussed below in greater detail).
The next stage may typically be to animate the sequence, to test whether the timing and appearance is correct. The apparatus therefore displays each key frame and interpolated frame of the sequence in turn, at short intervals in time. If the sequence is to be displayed at "normal" running speed, the interval is 1/24, 1/25 or 1/30 second between frames. Preferably, however, the user can vary the frame repetition rate so as to view the sequence in slow motion. Preferably, the user can also designate a short sub-sequence to be animated, and can move repeatedly forwards or backwards through the short sub-sequence. If the sequence is not correct, then as before the user will edit either the appearance or position in time of the key frame, or add or delete a key frame. The control point data making up the frames of the sequence are then typically saved to mass storage device 180, for later use.
Additionally, or alternatively, the frames may be coloured and/or filled and/or added to an existing background ("rendered"), to generate a corresponding series of raster image frames which may be displayed on a colour monitor, saved on a video tape recorder, or compression coded and stored on the mass storage device 180.
It will be clear from Fig. 11 and the following description that the above described sequence is by no means exhaustive of the options open at each stage to the user.
STORAGE OF DATA IN MEMORY 120
From the foregoing, it will be apparent that a number of different types of data must be evaluated and stored by the apparatus to enable a completed animated sequence to be produced. One exemplary arrangement of data will now be discussed.
A finished animation will involve different characters, and will be produced in segments or sequences. Referring to Fig. 13a, for each completed animation (labelled "epoch" in Figs. 12 and 13) a table 1000 of data is defined which includes data establishing the identity of the animated sequence (a title), data relating to the soundtrack, and a table of sequences 1100, 1200, 1300, 1400 of successive frames. The sequences will occur in succession. Conveniently, the sequences are stored as a linked list in the working memory 121; in other words, the complete animation table stores data identifying the location in the memory 121 of the first (and preferably the last) of the sequence tables 1100 ...., and each sequence table 1100 ... includes data identifying the address in memory 121 of the next sequence table (and preferably, the previous sequence table).
In animation of cartoons, for example, it is very common for several parts of a character or object to move simultaneously and substantially independently. For example, a character may walk and talk at the same time. In a preferred embodiment, the invention therefore enables separate movements to be defined for different parts of the same template. This is achieved by creating separate key frames and interpolated frames therebetween for different parts of the template, and editing the separate sets of key frames and interpolants to achieve the desired motion, and then subsequently merging together the separate sets as will be discussed below. Each set of key frames and interpolants does form a sequence over time, but for consistency the term "sequence" will be reserved in the following for the merged sequence of frames, and the term "timeline" will be used to describe the sequential set of frames (key frames and interpolated frames) corresponding to separate parts of the template, separately animated, which are merged to form the sequence. Of course, where the whole object is animated simultaneously, the single timeline also comprises the finished sequence.
Thus, in general, referring to Fig. 13b, each sequence table 1100 comprises data defining the template frame which the sequence animates, data (e.g. a pointer) indicating to which animation or epoch 1000 the sequence 1100 corresponds, a set of frame tables
(curve sets) comprising the composite or merged sequence (conveniently stored as a linked list of frames), a set of timeline tables 1110, 1120, 1130
(discussed below), data defining a currently displayed timeline, and, conveniently, a set of frames or curve-sets which comprises the merged sum of all timelines except that currently displayed. This enables the currently displayed timeline to be easily edited, then merged with this "base" sequence of frames to replace the existing composited sequence. The length, and the first and last frame addresses of the composite sequence are also stored.
Referring to Fig. 13c, each timeline table 1110, 1120 ... likewise defines a series of frame data tables, and for convenience these are stored as a linked list of key frames 1111, 1112, 1113 ... .
Referring to Fig. 13d, each key frame data table 1111, 1112, 1113 includes a pointer to a frame table 122, but also includes further data. A pointer to a list of interpolant frame tables comprising those defining the interpolant frames lying after that key frame and prior to the next is included. Frame tables 122 are associated with a stored frame type indicator, which in this case indicates that the frame table 122 is a key frame. Additionally, data defining the key frame number (i.e. its order amongst the key frames in the timeline 1110) is stored. Referring to Fig. 13e, the interpolant frame data tables 1111A, 1111B, 1111C ... for each key frame 1111 each comprise a pointer to a frame curve set data table 122. Each also includes an interpolation factor (typically 0-1) defining the extent to which the frame depends upon the following key frame 1112; thus, for successive interpolated frames 1111A, 1111B, 1111C ..., the interpolation factor gradually rises from close to 0 to close to 1. The interpolated frame 1111A and the key frame 1111 each store a frame number, which defines their position in the timeline 1110 and sequence 1100. Frame numbers correspond to points in time succeeding one another by 1/24, 1/25 or 1/30 of a second (or whatever the frame repetition period is desired to be).
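The key frame and interpolated frame tables of Figs. 13d and 13e might be modelled as follows (a sketch only, reusing the FrameTable structure from the earlier sketch; the field names are ours):

```python
from dataclasses import dataclass, field
from typing import List, Optional

@dataclass
class InterpolatedFrame:
    curve_set: FrameTable        # pointer to a frame curve-set table 122
    frame_number: int            # position in time within the timeline
    factor: float                # interpolation factor, typically 0 to 1

@dataclass
class KeyFrame:
    curve_set: FrameTable        # pointer to a frame table 122
    frame_number: int            # position in the timeline and sequence
    key_frame_number: int        # order amongst key frames in the timeline
    # interpolants lying after this key frame and before the next
    interpolants: List[InterpolatedFrame] = field(default_factory=list)
    # linked list of key frames within the timeline
    next_key: Optional["KeyFrame"] = None
    prev_key: Optional["KeyFrame"] = None
```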
Figs. 14 and 15 show the arrangement of the frame table 122 of Fig. 6a in greater detail. Each frame table 122 includes a list of lines or curves making up a set which represent the object or character which the frame depicts (and corresponds topologically to the template). The template, key frames and interpolated frames may thus all be represented by similar frame tables 122. The lines or curves are conveniently provided as a linked list of curve tables 2100, 2200, 2300, 2400, each curve table comprising a list of curve control points (again conveniently stored as a linked list) 2110, 2120, 2130. Each control point 2110 comprises position data defining the control point coordinates, and position data defining the control point tangent end coordinates. The curve segment to the next control point may include attribute control points (which will be discussed in greater detail below) for controlling the values of attributes such as colour and transparency during the rendering process, or for enabling compatibility during interpolation as discussed below, and in this case it is desirable for the positions of these attribute control points to be interpolated between key frames at which they are defined, for use in the subsequent rendering operation.
Accordingly, the control points 2110 ... include a list or table of attribute control points over the curve segment to the next curve control point. Each attribute control point table entry may comprise data defining the value of the attribute controlled by the point (for example, the line colour or transparency), and comprises data defining the position along the line segment of the attribute control point; conveniently, this is the value of the parameter t at the point. Further details of attribute control points will be found in our above referenced other application.
The use of a linked list arrangement discussed above for storing frame data, timeline data and sequence data is not essential, but is particularly preferred since this enables individual frames, sub-sequences or sequences to be moved in time by breaking and replacing links at either side of the frame, sequence or sub-sequence.
During operation of the apparatus, all of the above data tables will normally be resident in the working memory 121.
DISPLAYS ON MONITOR 160
From the foregoing, it will be apparent that a considerable amount of data is held by the apparatus during use, and that the data is held in a form in which it is not immediately comprehensible to the user. The manner in which data is presented to the user through the monitor 160, and in which it may be amended by the user, is therefore of considerable importance, and has major functional effects on the performance of the apparatus according to the invention.
Firstly, it is advantageous to provide a single frame display, and means for designating a stored frame (an interpolated or a key frame) for display. To display a single frame, the display generator 111 reads the corresponding frame table 122 and generates corresponding image data in an image buffer 130, which is then displayed.
As discussed below, two different types of frame display may be provided; a first in which only primary curves are displayed and a second in which all curves are displayed.
Referring to Fig. 16, accordingly, a plurality of image buffers 130a, 130b, 130c ... are preferably provided within the image store 130. One buffer 130a comprises the display buffer, which represents (is mapped to) the display produced on the monitor 160. The other buffers 130b, 130c ... provide "windows", as is known generally in the computing art, each of which contains image data corresponding to a raster image optionally forming a portion of the monitor 160 display. The images held in the frame buffers 130b, 130c ... are combined into the buffer 130a, and the size, position and order (i.e. which window overwrites which) may be determined by the user manipulating the keyboard 170b or mouse 170a, as is provided in the operating systems of many commercially available personal computers in a manner which is well known and forms no part of the present invention. Referring to Fig. 17, the display generated on the monitor 160 may therefore include display areas 160b-160e corresponding to some or all of the buffers 130b-130e, although preferably means (the input means 170 and CPU 110) are provided for enabling a user to select only one or some such display areas.
The buffer 130b as discussed above comprises image data which corresponds to a single selected frame of a sequence.
The buffer 130c is likewise dimensioned to contain a single frame image, which is however the image corresponding to the stored template.
The buffer 130d is arranged to store an image which comprises a montage of a succession of frames (key frames and interpolated frames) having successive frame numbers, and defining part of a timeline or sequence, in a manner described in greater detail below.
The buffer 130e stores a bar chart image comprising a plurality of bars each corresponding to one frame of the image in the buffer 130d, and each displaying the value of the interpolant factor for a corresponding frame as a length along the bar.
The appearance of an exemplary corresponding display is as shown in Fig. 18, in which the dark images correspond to key frames and the grey images correspond to interpolated frames. It will be seen that in this embodiment, a given frame may be represented in three different manners simultaneously; firstly, as an individual display on the display area 160b which corresponds to the contents of the image store 130b; secondly, as part of a sequence displayed in the sequence display 160d corresponding to the image store 130d; and thirdly, as a bar of the bar chart representing the timeline in which the image is included, displayed in the display area 160e corresponding to the image buffer 130e.
Referring to Figs. 19a-c, the image held in the sequence buffer 130d may be presented in differing formats. In a first format, shown in Figs. 19a and 19b, the sequence image is produced by writing into the buffer 130d raster image data corresponding to each of the frames making up the sequence, so as to generate a display 160d in which the frame images are progressively displaced one from the other, but with some overlap so that each frame image partially overwrites its predecessor in the buffer 130d. The frame images could also be provided with no progressive displacement (i.e. superimposed). Fig. 19c shows an alternative embodiment in which each frame image is written into a spatially separate portion of the buffer 130d, without overlap. This embodiment is also illustrated in Fig. 18. The display format of Fig. 19c is of assistance in viewing motion, since corresponding parts of the object in successive frames are close together. The representation of Fig. 19a, however, enables each frame to be more clearly examined. Advantageously, preferred embodiments of the invention provide means (e.g. the keyboard 170b) for selecting between these modes. It may also permit the displacement between successive frame images in the mode shown in Fig. 19c to be varied. Referring to Fig. 20, the presentation of a frame in the frame display area 160b when it is desired to edit a frame is shown. When the user indicates a desire to edit a frame by selecting that frame (by manipulating the keyboard 170b or position sensitive input device 170a), the display generator 111 is arranged not only to generate the frame image data in the frame buffer 130b, but also to generate symbols (e.g. dots) at curvature control points. Preferably, the tangent end points and, more preferably, the tangent extent lines are also drawn. Finally, a cursor symbol (shown as a "+") is displayed, to enable a user to edit the frame image as discussed above using the position sensitive input device 170a.
Referring to Fig. 21, the display area 160e displays the bar chart data display held in the timeline buffer 130e. Each bar relates to a frame (key frame or interpolated frame) within a single timeline. The length of the bar shows the interpolation factor associated with interpolated frames, and since key frames are (by definition) not interpolated, they have either maximum or minimum bar lengths. The usefulness of the timeline display 160e and corresponding buffer 130e is, firstly, in providing the user with a synopsis of the information shown in the sequence image area 160d and, secondly, in providing a particularly simple way of editing the timeline, and seeing the effects on the timeline as a whole, by using the position sensitive input device 170a to position the cursor symbol at selected bars of the display 160e and signalling an appropriate control signal.
One type of such amendment is to alter the interpolation factor of a given interpolated frame. In this case, the height of the bar for that frame is varied to follow the cursor position symbol manipulated by the user, and the interpolation value stored in the corresponding interpolated frame table is amended accordingly. Initially, the values of the interpolation factor in successive frames follow a progressive sequence which is typically a linear sequence, but could equally follow any predetermined curve (usually monotonic) between the neighbouring key frames. It is advantageous to provide that each bar is displayed in two colours, the height of the bar comprising the interface between the two; that the colours of key frame bars (determined using the key frame numbers thereof) should alternate; and that the height of the interpolant bars should rise with interpolation factor after one key frame, then fall with interpolation factor after the next, so as to present a rising and falling bar height rather than a sawtooth pattern. This is found easier for a user to interpret.
Such a progression gives the user an immediately visible sequence, which is considerably easier to use than having to specify each interpolant value individually, and it is found that in most cases, the same progression (for example, linear interpolation) can be employed. However, it is extremely useful to be able to amend the sequence merely by amending the interpolation value of a given frame (rather than redrawing or editing the frame), and this is particularly advantageously achieved by onscreen manipulation of a bar chart display using the position sensitive input device 170a.
Another type of amendment involves moving a frame or a series of frames in time. The chart display provides a readily visualised means for achieving this; using a position sensitive input device 170a, the user may designate one or a number of frames, and then move the frames along the timeline using the cursor symbol to a desired new position. In this case, the apparatus is arranged to alter the frame numbers of the frames selected by the user, and to generate new intervening frames (or delete old frames) as required. More details will be given below.
DESCRIPTION OF PARTICULAR OPERATIONS
One exemplary method of performing particular operations will now be described.
Creating a Template Frame
The user signals a desire to create a new template by generating an appropriate signal using the keyboard 170b or position sensitive input device 170a, typically by selecting an option from a menu displayed (possibly permanently) on the monitor 160.
The CPU 110 then creates within the working memory 121 a template table, which comprises a frame table 122 and a datum indicating that the frame is a template frame. Because the template frame is not itself employed in a sequence, no sequence numbers are necessary. The user will typically signal, via the keyboard 170b, a name to be associated with the template, which is stored therewith in the working memory 121. The template display area 160c is generated on the monitor 160. The cursor symbol is displayed within the display area 160c, and the user can proceed to build up a template. To do so, the user selects from the following options:
Creating a new curve - the current cursor position provides the x, y coordinates of the first control point. The length of the tangent at this point is set to 0. These values are written into the frame table 122. The cursor position is continually monitored, and provides the second control point position coordinates, the values in the table 122 being continuously updated with movements of the cursor until a control signal is generated by the user to fix the coordinates of the second control point (when the desired location is reached). A line between the first and second control points is continually generated by the line generator 111 within the template buffer 130c, and displayed on the template display area 160c, to enable the user to determine the correct position for the second control point. The second control point tangent length is likewise initially set to 0.
Amending a control point - as described above with reference to Fig. 10.
Adding a new control point - a further control point can be inserted within a curve, to increase the complexity of the curve by dividing a segment of the curve into two. Accordingly, the user positions the cursor symbol at a desired point along a curve displayed on the template display area 160c, and initiates a signal via the keyboard 170b or the position sensitive input device 170a (for example, by "clicking" a mouse device). The current cursor position coordinates are read, and the identity of the two control points which lie to either side of the current position along the line segment is determined. The cubic equation is solved using the current cursor coordinates, to derive the value of the parameter t at the current cursor position. A new control point record 2110 is created within the frame table 122 for the template, including pointers to the records of the two neighbouring control points on the curve. The "next control point" and "previous control point" pointer fields in the surrounding control point data records are amended to point to the new control point. The slopes and magnitudes of the tangents at the new control point are calculated, and stored in the new control point record. The new control point may then be edited, to change the shape of the curve running through it.
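The tangents at the inserted point can be obtained by subdividing the Bezier segment at the derived value of t, for which de Casteljau's construction is the standard device. The following sketch (our illustration, operating on one coordinate at a time) would be applied to x and, identically, to y:

```python
def subdivide(p0, p1, p2, p3, t):
    """Split one Bezier segment at parameter t (de Casteljau's
    construction) for a single coordinate; apply to x and y alike.

    p0 and p3 are the segment end point values; p1 and p2 the tangent end
    point values. Returns the control values of the two resulting
    segments, which share the new on-curve point q; the inserted control
    point's position and tangent extents are read directly from these."""
    a = p0 + t * (p1 - p0)
    b = p1 + t * (p2 - p1)
    c = p2 + t * (p3 - p2)
    d = a + t * (b - a)
    e = b + t * (c - b)
    q = d + t * (e - d)              # the new on-curve control point
    return (p0, a, d, q), (q, e, c, p3)
```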
Deleting a control point - an appropriate signal being generated by the user, the control point at which the cursor is located is looked up in the table 122 using the current cursor position coordinates, and the corresponding control point record is deleted. The "next control point" and "previous control point" fields of the neighbouring control points on the curve segment are amended to point to each other and omit reference to the deleted control point.
By adding and editing line segments, the desired line drawing is built up on the template display area 160c, and a corresponding set of curve data is stored in a frame table 122 labelled as corresponding to a template.
In addition to the above described operations, which directly affect the shape of the template, the user can also add attribute control points to control attributes of the finally rendered image, by positioning the cursor symbol at a desired point along the curve segment represented on the display device 160 and generating an appropriate signal (e.g. by pressing an appropriate key on the keyboard 170b). On doing so, the current cursor position is used to find the preceding curvature control point along the curve, into the attribute control point list of which a new attribute control point record is inserted, the pointers of surrounding attribute control points being altered accordingly. The value of the parameter t is derived and stored in the attribute control point record, and the user may input data concerning the value of the attribute at that point for later use (as described in our earlier UK Application 9110945.4, above referenced PCT Application PCT/GB91/02124 and
PCT application PCT/GB92/ ; agents ref: 5130799).
Having built up a desired template, typically the contents of the template table 122 are stored to the mass storage device 180 so as to be recallable using the name or identification data for the template (e.g. the file name).
Creating a Key Frame
The set of curves comprising the key frame may be edited and stored in a key frame table, corresponding images being displayed in the frame display area 160b derived from the frame buffer 130b, in the same manner as described above with reference to a template. The point in time of occurrence of the key frame in the sequence is also of significance; it is therefore necessary to store data defining the sequence and timeline to which the key frame belongs; the position of the key frame in the sequence relative to other key frames; and the absolute position in the sequence or timeline of the key frame (the frame number).
The user may input these via the keyboard 170b. Alternatively, if the cursor tracker 112 identifies the cursor position as corresponding to that of one of the bars of the bar chart display shown in display area 160e, the key frame may be allocated the corresponding frame number. In the absence of either, the apparatus is preferably arranged to allocate the key frame a frame number equal to the current largest frame number plus one, and a key frame number equal to the current largest key frame number plus one, so that the frame is added to the end of the existing timeline.
A new key frame table 122 is then created within the memory 120, and the CPU 110 copies the contents of the template frame table into the new key frame table so that the new key frame is identical to the template. The address within the memory 120 of the new key frame is then inserted into the "next key frame" pointer of the neighbouring key frame or key frames in the timeline, and any other necessary pointers within the memory are set to reflect the addition of the new key frame. The timeline image buffer 130e is amended to cause the generation of a new bar at the key frame position in the display area 160e, and then the interpolated frames of the preceding key frame, if any, are recalculated (as discussed in greater detail below). If there is a succeeding key frame, a set of interpolated frames to the succeeding key frame are also calculated, and corresponding interpolated frame data tables are set up within the memory 120, as a list pointing to the new key frame. The sequence display buffer is then updated to include the newly interpolated frames, and the display on the monitor 160 in the display area 160d is correspondingly altered, as is the timeline bar chart display area 160e.
Interpolation
When, as above, a key frame is created or amended or deleted or moved in time, it is necessary to recalculate the interpolated frames on either side of the change. Referring to Fig. 22, the CPU 110 first checks (box 2201) whether there is an immediately previous key frame in the timeline. If there is, in other words if the present key frame is not the first frame of the timeline, the separation in frame numbers of the two key frames is found (box 2202) and the number of interpolants is set equal to this. The interpolation routine shown in Fig. 23 is then executed to interpolate from the preceding keyframe (box 2203).
Referring to Fig. 23, the length of the list of interpolated frames of the earlier key frame is examined (box 2301); if the new number of interpolants required differs from the current number in the list, or if there is no current list, the current list is deleted and a new list of the required length is created in memory (box 2302). Each interpolant in the list is allocated an interpolation factor (box 2303); for the i'th list member (i.e. the i'th frame after the earlier key frame) in a list comprising n members, the interpolation factor is L = i/n for linear interpolation, or F(i/n) where F is a non-linear function. One non-linear function which may be used is sigmoidal; that is, one which tends to horizontal at either end and rises monotonically in between, so as to slow the interpolation rate towards either key frame and smooth the transition through the key frame; other functions smoothing the transition are equally possible.
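The allocation of box 2303 can be sketched as follows; the particular sigmoid shown (the cubic 3u^2 - 2u^3, flat at both ends and monotonic between) is our example of a suitable F, not one prescribed by the application:

```python
def interpolation_factors(n, F=None):
    """Allocate an interpolation factor to each of the n list members
    following a key frame (box 2303): the i'th member receives L = i/n
    for linear interpolation, or L = F(i/n) for a non-linear F."""
    ease = F if F is not None else (lambda u: u)
    return [ease(i / n) for i in range(1, n + 1)]

# An example sigmoidal F: slows the interpolation rate towards either
# key frame, smoothing the transition through the key frames.
smoothstep = lambda u: 3 * u * u - 2 * u ** 3
```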
Next (box 2304), the curve data for each interpolated frame is derived and stored in the associated interpolated frame table. Each key frame is derived from the same template, and hence will have the same number of curve control points. For a given interpolated frame having an interpolation factor L, the CPU 110 therefore takes the first curve control point of the earlier key frame and that of the later key frame, and stores for the first curve control point of the interpolated frame a value intermediate between the two. The x, y position of the interpolated point is derived as:

x = x1(1 - L) + x2 L,

where x1 is the value in the earlier key frame (to the list of which the interpolated frame belongs) and x2 is that in the later key frame. The y coordinate is likewise given by:

y = y1(1 - L) + y2 L.
The coordinates of the tangent extent point at the control point are derived in exactly the same manner.
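Applied to the ControlPoint structure of the earlier sketch, the derivation of box 2304 for one point is then simply (a sketch; the blend applies identically to the position and to both tangent extents):

```python
def interpolate_point(p1, p2, L):
    """Derive one interpolated control point between the matching points
    p1 (earlier key frame) and p2 (later key frame), using
    x = x1(1 - L) + x2 L for every stored coordinate."""
    lerp = lambda a, b: a * (1 - L) + b * L
    return ControlPoint(
        x=lerp(p1.x, p2.x),     y=lerp(p1.y, p2.y),
        xe=lerp(p1.xe, p2.xe),  ye=lerp(p1.ye, p2.ye),
        xf=lerp(p1.xf, p2.xf),  yf=lerp(p1.yf, p2.yf))
```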
A preferred embodiment allows values of L greater than unity; this permits a character to "overshoot", which gives a desirable visual effect in cartoon animation. In this case, the (1 - L) term (which would otherwise become negative) may be set to zero. For example, overshoot may be provided by interpolating from L = 0 to 1.2 over 8 frames, and from 1.2 to 1.0 over two following frames. The CPU 110 then proceeds to the next control point in the lists of the two key frames, and proceeds until all control points of all curves of the two key frames have been interpolated to produce corresponding control points in the interpolated frame. The CPU 110 then selects the next interpolated frame, with a correspondingly higher interpolation factor, and repeats the process. Returning to Fig. 22, the CPU 110 next (box 2204) determines whether there is a following key frame in the timeline (e.g. by referring to the pointers maintained in the timeline table 1110) and, if so (in other words, if the key frame is not the last frame of the timeline), the process of box 2202 is repeated (box 2205) and the process shown in Fig. 23 is again executed (box 2206) to interpolate frames between the key frame and the following key frame. Once the corresponding interpolated frame tables have been calculated, the CPU 110 amends the data held in the timeline image buffer 130e to reflect the new interpolation factors, and updates the display area 160e (box 2207).
Likewise, a new image is generated in the sequence image store 130d corresponding to the new interpolant values and the sequence display area 160d is updated.
Converting an Interpolated Frame into a Key Frame
Where an interpolated sequence is unsatisfactory (as sometimes occurs when the sequence is to show an object moving in three dimensions, since the interpolation only interpolates in two dimensions), one convenient way of improving the sequence is to convert one of the interpolated frames into a key frame, and then edit the key frame as desired. To signal his intention to do so, the user may for example position the cursor symbol at the bar on the timeline display area 160e and issue an appropriate control signal (for example, by "clicking" a mouse device 170a twice). The CPU 110 then identifies the cursor position and derives the frame number of the corresponding interpolated frame. Next, the CPU 110 reads the frame table for that interpolated frame and locates, from the key frame list, the key frame in relation to which the interpolated frame is stored (box 2401).
Next, referring to Fig. 24, the CPU 110 creates (box 2402) a new key frame data table, and allocates the key frame the next key frame number after that of the key frame to which the interpolated frame belonged. The frame number is retained. The curve data of the interpolated frame table 122 is then copied into the new key frame table, and the interpolated frame table is deleted.
Reference to the new key frame is inserted into the list of key frames maintained in the timeline table, after the parent key frame of the interpolant (box 2403). The new key frame is then selected for display in the frame display area 160b (box 2404), and corresponding image data is generated in the frame image buffer 130d and displayed on monitor 160 (box 2405). The frame is preferably displayed as shown in Fig. 20, with the curve control points and tangents indicated for editing.
Subsequent key frames in the list stored in the timeline table are renumbered, each incremented by one, to take account of the insertion of a new key frame (box 2406). Next, the interpolation process of
Fig. 22 is executed (box 2407), and the sequence display frame store 130d is correspondingly modified to generate an updated sequence display in the display area 160d (box 2408). With linear interpolation, the appearance of the other interpolated frames may not change until the new key frame has been edited, but the interpolation factors for each will have changed.
Deleting a Key Frame
Where possible, it is desirable to simplify the calculation of the sequence by minimising the number of key frames. Accordingly, it may on occasion be possible to delete a key frame, and correspondingly interpolate frames between the two surrounding key frames. Referring to Fig. 25, when a user signals a desire to delete a key frame (for example, by positioning the cursor symbol at the corresponding bar of the bar chart display area 160e and issuing an appropriate control signal using the position sensitive input device 170a or keyboard 170b), the CPU 110 reads the key frame number of the key frame concerned and accesses the timeline data table. The key frame numbers of succeeding key frames in the list maintained by the timeline table are accordingly decremented by one (box 2501), and then the current key frame table is deleted from the memory 121 (box 2502). All interpolated frame tables listed within the key frame table are also deleted.
The CPU 110 tests (box 2503) whether there is an earlier key frame in the key frame list. If the key frame is the first key frame of the timeline, the only further action taken is to regenerate the image data in the sequence display buffer 130d and update (box 2505) the sequence display 160d, and likewise amend the timeline buffer 130e and display area 160e (box 2504), to remove the references to the deleted key frame and its interpolated frames. The succeeding frames may also be shifted back in time.
On the other hand, if the deleted key frame occurs later in the sequence, the CPU 110 performs the interpolation process shown in Fig. 23 (box 2506) from the key frame which preceded the deleted frame to its new successor in the timeline. Since the frame numbers of the following key frames have not been changed, the key frame will be replaced for display by an interpolated frame (box 2507). The sequence image in the sequence image store 130d and the bar chart image in the bar chart image buffer 130e are updated by the CPU 110 (box 2505), and correspondingly redisplayed on the monitor 160.
Moving a Key Frame
To change the length of an interpolated sequence in time, or to rearrange the order of the sequence, the preferred embodiment enables the user to indicate a particular key frame and change its time of occurrence in the sequence (e.g. frame number).
Typically, the user indicates an intention to move the key frame by positioning the cursor symbol at a desired key frame bar on the bar chart display area 160e and inputting an appropriate control signal, via the keyboard 170b or position sensitive input device 170a, and then moving the cursor symbol to the desired new key frame location.
Referring to Fig. 26, the CPU 110 determines from the cursor symbol position the frame number corresponding to the new location and tests (box 2601) whether it has moved beyond its neighbouring keys. If the frame has not been moved past either of its neighbouring key frames, the frame number of the key frame is changed to that of the new location (box 2602) and the interpolation routine of Fig. 22 is then executed (box 2603). If the key frame is moved onto the frame number of one of its neighbouring key frames, the pre-existing key frame is deleted (box 2604) and the key frame list is amended to remove the reference to it. The key frame numbers of all following key frames in the key frame list are then decremented (box 2605). After this, the CPU 110 continues, as above, by allocating the key frame a new frame number and interpolating using the process of Fig. 22 (boxes 2602, 2603).
If the key frame has been moved past either of its neighbours, the CPU 110 first (box 2606) removes the key frame from the key frame list and links the pointers of the neighbouring key frames, and then (box 2607) executes the interpolation routine of Fig. 23 to regenerate the interpolated frames for the key frame preceding the deleted key frame.
Next, in box 2608, the CPU 110 locates the key frame at or immediately preceding in the key frame list the new frame to which the selected key frame is to be moved. If there is already a key frame at the position to which the selected key frame is to be moved, the CPU 110 deletes the record of that key frame (box 2609). The selected key frame is then inserted in the key frame list maintained in the timeline table, just after the previous key frame position, by amending the "previous" and "next" pointers in the key frame tables concerned (box 2610).
The key frame numbers of key frames between the old position and the new position are then decremented (box 2611) to reflect the new key order. Furthermore, if the key frame has replaced an existing key frame at its new position, the key frame numbers of subsequent key frames are also decremented. Thereafter, the CPU 110 proceeds as above to update the key frame's frame number (box 2602), generate new interpolated frames between the key frame and its neighbour on either side (box 2603), and regenerate the sequence image buffer 130d and display 160d, and correspondingly the timeline buffer 130e and display area 160e (box 2612).
In a preferred embodiment, the CPU 110 is arranged to be capable of accepting an instruction to move a block of successive frames in time; the above process is in this embodiment essentially repeated for each such frame.
Example Sequence of Operations
Referring to Figs. 27a-d, and to Fig. 18, the results of the sequence operations described above will be illustrated. Referring to Fig. 27a, the user positions the position sensitive input device so as to move the cursor symbol to the next vacant point in the bar chart display area 160e on the monitor 160, and initiates a control signal indicating a desire to create a new key frame thereat.
The CPU 110 copies the template (or an existing key frame) to create a new key frame table in the memory 121 as discussed above. The sequence display buffer 130d is regenerated, and the display area 160d consequently displays the new key frame at the end of the sequence. The bar chart display area 160e likewise displays a new key frame bar. Preferably, the apparatus is arranged also to generate a new key frame which is a copy of an existing key frame; in this case, the user may designate the existing key frame he wishes to copy using the position sensitive input device 170a to position the cursor symbol appropriately, and upon generating an appropriate control signal via an input device the CPU 110 will, rather than copying the template table, copy the designated key frame table curve data to produce the new key frame table curve data.
Referring to Fig. 27b, the user then generates an input signal indicating an intention to move the just created key frame four frames later in time. The CPU 110 performs the routine of the centre path of Fig. 26, and four interpolated frames are added to the interpolated frame list of the preceding key frame. The sequence display and timeline displays 160d, 160e are then updated as above.
Referring to Fig. 27c, the user signals a desire to delete the preceding key frame and the CPU 110 executes the routine of Fig. 25. Since the last two key frames are now substantially identical, it will be seen that the frames interpolated therebetween are likewise identical. Referring to Fig. 27d, the user next signals an intention to convert one of the intervening interpolated frames into a key frame, to allow for subsequent editing. The CPU 110 follows the routine of Fig. 24, and updates the displays 160d and 160e.
Adding Further Curves to a Frame
In the above described embodiments, each key frame (and consequently each interpolated frame also) includes only those curves defined by curve control points which exist in the template frame. The method of adding control points and new curves to the template has already been discussed above.
Initially, as discussed above, each key frame comprises the same curve data as the template to which it is consequently identical. However, the user will often wish to delete some parts of the template for a given key frame; for instance, when an object is turned, many lines become invisible as they are obscured by other parts of the object. The key frame corresponding to the turned object would therefore not be required to include those lines.
Accordingly, the user can delete some control points (and/or curves) from a key frame, and the pointers in the frame table 122 will be correspondingly reset to omit references to the deleted points and curves. In this case, the CPU 110 does not affect any other key frame or the template frame table. However, the repositioning of the pointers within the frame table 122 does not affect the correspondence between the remaining control points and curves and their counterparts in the template set. Each is still uniquely identifiable to the CPU as corresponding to a particular point in the template set. It is thus possible for different frames to correspond to different subsets of the template set. It may also occur that, whilst preparing a particular key frame, a user wishes to add a further control point or a curve comprising a number of control points. To do so directly would however introduce points which had no counterparts in the template set. It is nonetheless inconvenient to have to edit the template set directly to produce a result in a particular key frame. In preferred embodiments, therefore, the apparatus is arranged to allow a user to add further control points to a key frame, exactly in the manner described for the template frame, but upon his doing so the CPU 110 is arranged to add a corresponding point to the template frame table. The template frame table therefore always comprises a superset of the points held in each key frame. Interpolation between frames, and adding of frames of different timelines to produce composite frames, is still possible even if one frame includes extra curve control points or curves.
The operations of interpolating between the frames and adding to frames both require a one-to-one correspondence between curve control points. Thus, to perform either of these operations, a first step is to make the two frames compatible by equalising the number of points to be interpolated between or to be added together. To illustrate the manner in which this is achieved, reference is made to Figs. 28 and 29.
In Fig. 28a, a curve is shown as it would appear in the frame display area 160b. The shape of the curve is defined by two control points A1, A2, at which the corresponding curve tangents are indicated. Three attribute control points B1, B2, B3 are shown on the curve segment between the two curve control points A1, A2. Fig. 29 shows the corresponding curve table 2100 stored within the working memory 121. The table includes two curve control point records, the first corresponding to A1 and pointing to the next record, corresponding to A2. The curve control point record corresponding to A1 also points to the list of attribute control point records, the first of which corresponds to B1, which in turn points to that corresponding to B2, which likewise points to that corresponding to B3.
Referring to Fig. 28b, upon the user generating a control signal indicating that a selected attribute control point B2 is to be converted into a curvature control point located at the same position on the curve, the CPU 110 creates a new curve control point record A3 within the curve table 2100. The record corresponding to the point A1 is altered to point to the new record, which in turn points to A2. The attribute control point record corresponding to B2 is deleted from the attribute control point list. The control point data stored for the new control point A3 corresponds to the position on the curve previously occupied by the attribute control point, and tangent extents such that the tangent slope at the new control point is the same as it had been at the attribute control point B2. The lengths of the tangents at the three curvature control points A1, A2, A3 are calculated so as to keep the shape of the curve unchanged; it will be observed from Fig. 28b that the lengths, but not the angles, of the tangents at control points A1 and A2 have altered. Accordingly, new extent point data is written to the records corresponding to A1 and A2 by the CPU 110. The attribute control point B3 is deleted from the list of the curvature control point record for A1, and added to that for A2.
The position data defining the positions along the curve of the attribute control points B1, B3 are recalculated within the curve segments from A1-A3 and A3-A2, and the new data are stored with the attribute control point records B1, B3. Having created a new curvature control point A3, the user may employ the apparatus according to this embodiment to amend the curve shape by altering the control point data as described above; in particular, as shown in Fig. 28c, points of inflection may be introduced by setting the tangent extent points to define different tangent angles. Referring to Fig. 28d, the apparatus of this embodiment is arranged also to convert a control point into an attribute control point if desired; in this case, the control point record A3 is deleted and the pointer stored with the record for A1 is amended to point to the record for A2. A new attribute control point record for new attribute control point B2 is created. The attribute point records for B2 and B3 are added to the list held for the curvature control point record for A1. The curve is recalculated by the CPU 110, and the position data for the three attribute control points are amended.
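The specification does not set out the formula by which the new tangent lengths are derived, but for a cubic Bezier segment the standard de Casteljau subdivision yields exactly this shape-preserving split: tangent angles at A1 and A2 are unchanged, only their lengths alter. A sketch under that assumption:

```python
def split_cubic_bezier(p0, p1, p2, p3, t):
    """Split a cubic Bezier segment (p0, p1, p2, p3) at parameter t.

    p0 and p3 are the on-curve control points (A1, A2); p1 and p2 are
    their tangent extent points.  Returns the new on-curve point (the
    new curvature control point A3) together with the shortened
    tangent extents: the first sub-segment is (p0, q0, r0, a3) and the
    second is (a3, r1, q2, p3).  The curve's shape is unchanged.
    """
    def lerp(a, b):
        return (a[0] + (b[0] - a[0]) * t, a[1] + (b[1] - a[1]) * t)
    q0, q1, q2 = lerp(p0, p1), lerp(p1, p2), lerp(p2, p3)
    r0, r1 = lerp(q0, q1), lerp(q1, q2)
    a3 = lerp(r0, r1)          # the new control point, on the curve
    return a3, q0, r0, r1, q2
```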
Key frames in this embodiment of the invention are permitted to include more curvature control points than does the template frame from which they are derived, where a corresponding attribute control point exists in the template frame. Thus, when two frames are to be added or interpolated between, one may include curvature control points not present in the other, but the other will include a corresponding attribute control point, since it is derived from the same template. Referring to Fig. 30, the CPU 110 is therefore arranged, when a curvature control point not having a corresponding curvature control point in another frame is located, to locate the corresponding attribute control point in the other frame and convert that point into a curvature control point as discussed above with reference to Figs. 28a and 28b and Figs. 29a and 29b. The two frames will then be in correspondence, and may be added or interpolated between as discussed above.
ADDING FRAMES FROM PARALLEL TIMELINES
As stated above, one advantage of the preferred embodiments is that different parts of an object may be animated separately and the separate sub-sequences (timelines) can be amalgamated together. This is possible because all frames of the different timelines have the same topology, or are all subsets of a common template table. The operation of adding frames is similar to that of interpolation, as discussed below, except that whereas in interpolation predetermined proportions of a pair of frames are added, in addition it is generally (although not necessarily) the case that equal proportions of each frame are added.
Essentially, the CPU 110 locates a pair (or, in general, a plurality) of frames of different timelines occurring at the same point in time, and derives a composite frame by taking, for each curve control point of the composite frame, the corresponding curve control points in each of the existing frames. From the coordinates of these, the coordinates of the corresponding point of the template frame are subtracted so as to generate difference coordinates, defining the difference between the control point coordinates of the key frames and the coordinates of the corresponding points of the template frame to which they correspond.
The difference coordinates for a corresponding control point in each frame are then added together to form summed difference coordinates for that control point of the composite frame, to which the absolute coordinates of the corresponding control point in the template frame table are added to derive the composite control point coordinates.
Thus, each composite control point corresponds to the sum of the corresponding template control point coordinates, and the vector sum of the differences between the corresponding control points of time aligned frames of different timelines and the template.
More generally, it is possible to form the sum of the vector differences weighted by predetermined constants, so that the composite frame depends more upon the frame from one timeline than from another.
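A minimal sketch of the composite calculation for a single control point, with an optional weight per timeline (the names are illustrative):

```python
def composite_point(template_pt, frame_pts, weights=None):
    """Composite control point = template point + weighted vector sum
    of the differences between each timeline's point and the template.
    """
    if weights is None:
        weights = [1.0] * len(frame_pts)   # equal proportions
    tx, ty = template_pt
    dx = sum(w * (x - tx) for w, (x, _) in zip(weights, frame_pts))
    dy = sum(w * (y - ty) for w, (_, y) in zip(weights, frame_pts))
    return (tx + dx, ty + dy)

# Two timelines deviate from the template point (10, 10) separately;
# the composite combines both deviations.
print(composite_point((10, 10), [(12, 10), (10, 7)]))   # (12.0, 7.0)
```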
Equally, the arithmetic can of course be rearranged so that the coordinates of the frames of the timeline are added together first and then predetermined multiples of the template coordinates are subtracted from the sum.
In this way, a composite sequence of frames can be formed which corresponds to the sums of the deviations from the template of the different timelines of the sequence.
Referring to Fig. 31, one way of adding a plurality of frames is as follows. The CPU 110 creates a new frame table, hereafter termed a difference table, temporarily within the memory 121 for each frame which is to be added. The coordinates of the corresponding point stored in the template table are subtracted from those of each curve control point of each frame, and the differences in coordinates are stored in the difference frame table corresponding to that frame.
When difference tables have been set up for all frames to be added, the difference tables are made mutually compatible according to the process of Fig. 30.
The CPU 110 then creates a result frame table in the memory 121. It then reads the template table, and for each curve record, checks whether that curve record is present in any of the difference frames. If the corresponding curve exists in no difference frame, the CPU 110 proceeds to the next curve in the template table. If the corresponding curve exists in all difference frames, for each curve control point in the sequence, the sum of the difference coordinates for the corresponding control points in the difference tables is taken and the result is added to the coordinates of the corresponding point in the template table and stored in the result table. The next curve in the template table is then processed. If a curve is not in all the frames to be added, the CPU 110 tests whether any of the frames to be added are key frames and, if so, whether the curve in question is in a key frame. If so, the sum of the difference coordinates for the frames in which the curve is present is taken and added to the template coordinates as before. If not, in other words if the curve is present only in an interpolated frame or frames, the curve is omitted from the result frame table.
Once the CPU 110 has considered all the curves in the template table, the result frame table will include all the curve control points necessary. At this stage, the CPU 110 derives the positions of any attribute points, as shown in Fig. 32, by taking in turn each curve in the results frame table and considering each attribute point in turn. If an attribute point occurs on all the curves to be added, the CPU 110 derives averaged or interpolated values for the attribute point position parameter and, optionally, for any attribute data (e.g. line width, or opacity profile) which may be stored in the attribute point records. The interpolated values (e.g. the average values) are then stored in the results table. If an attribute point is not present in all the frames to be added, then unless one of the frames in which it occurs is a key frame, the CPU 110 allocates a value equal to the position value in the template for each frame in which the attribute point is absent and interpolates a new position between all frame values as above.
Preferably, the interpolation of the attribute point position is not simply an interpolation between the two parametric position data values in the corresponding pair of frames interpolated between, but is derived by computing the length of the corresponding curve segment in the interpolated frame; the actual curve segment length is divided in the required interpolation ratio, the corresponding position on the curve is found, and the value of the parameter t at that position is derived and stored as the interpolated attribute point position.
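Since the parameter t of a parametric curve segment is not in general proportional to arc length, the position must be found numerically; a sketch, assuming the segment can be evaluated at arbitrary t:

```python
import math

def t_at_fractional_length(curve_point, fraction, samples=256):
    """Find the parameter t lying at the given fraction of a curve
    segment's arc length, by accumulating sampled chord lengths.

    curve_point(t) must return the (x, y) point at t in [0, 1].
    """
    pts = [curve_point(i / samples) for i in range(samples + 1)]
    chords = [math.dist(pts[i], pts[i + 1]) for i in range(samples)]
    target, run = fraction * sum(chords), 0.0
    for i, c in enumerate(chords):
        if c > 0.0 and run + c >= target:
            # linear interpolation within this sample interval
            return (i + (target - run) / c) / samples
        run += c
    return 1.0
```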
If the attribute point is present in a key frame, the key frame attribute point position data is stored in the results table as this is relatively more significant than the position derived from an interpolated frame.
As mentioned above, preferably, for each sequence, a current composite sequence comprising a set of frame tables is maintained together with a corresponding base sequence comprising a further set of frame tables, the base sequence comprising the composite sum, as discussed above, of all timelines other than that presently being displayed for editing. After the current timeline has been edited, it is thus merely added to the current base composite sequence to generate a new composite sequence, thus reducing the amount of computation necessary.
The operations of interpolation and addition will be seen to be closely similar; although in the above described embodiments, for clarity, interpolation between frames and addition of frame differences from the template are described, it is possible on the one hand to interpolate using frame differences (adding the result to the template frame coordinates) and on the other hand to add frames (subtracting the template coordinates or a multiple thereof afterwards); in practice, for convenience, the memory 121 may contain either frame tables stored as absolute point coordinates or frame tables stored as coordinates defining the difference from the corresponding coordinates in the template table. The processes described in Figs. 30-32 are equally applicable, and are preferably applied, to interpolation, mutatis mutandis.
Replayer 103
The replayer 103 in one embodiment of the invention is provided by the CPU 110 operating under suitable stored program control.
In one embodiment, the replayer is arranged to display the frames at a rate corresponding to the frame repetition rate (24, 25 or 30 Hz) at which the sequence is to be displayed, so that the operator can view the sequence at a realistic rate. Preferably, however, the replayer 103 is arranged also to accept input commands from the keyboard or other input device specifying the speed of replay. This is particularly useful in enabling an operator to view crucial parts of the sequence in slow motion, or to move quickly through a sequence for cursory inspection. In another preferred embodiment, the replayer 103 is arranged to accept input signals (from the keyboard 170b or more preferably, the position sensitive input device 170a in cooperation with the timeline display) to specify an initial and/or a final frame in the sequence between which the sequence is to be replayed. An operator can thereby designate a particular part of the sequence to be replayed, and the replayer 103 will display in turn each frame between the initial and end frames. It is particularly convenient if the replayer 103 is arranged to constantly cycle between the start and finish frames; this may either be by displaying the sequence repeatedly from the first frame to the last frame, or by displaying the sequence forwardly (from start to finish) and then backwardly (from last to first) repeatedly. This is found particularly useful in enabling the operator to localise a particular frame or series of frames which are incorrect, for subsequent editing.
If the CPU 110 operates sufficiently fast, it would be possible for the replayer 103 to be arranged to access the memory 120, and to cause the display generator 111 to access in turn each frame table 122 corresponding to each frame of a sequence between the first and last frames specified. However, many CPUs available at present are incapable of generating entire frames of data in real time; thus, the replayer 103 is arranged instead to perform an initial operation of creating, for each frame to be displayed, a raster image by causing the display generator 111 to access in turn each frame table 122 and generate an image in the image store 130, and after each image is created the replayer 103 is arranged to cause the image to be stored on the mass storage device (e.g. hard disk) 180. In this context, a computer which includes image compression means for compression encoding the image for storage on hard disk is preferred, since otherwise the volume of image data stored corresponding to the frames of even a relatively short sequence is extremely large. Once image data for a plurality of frames has been stored on the mass storage device 180, the replayer 103 is arranged to display the sequence by accessing the image data corresponding to each frame in turn to refresh the image store 130 at the desired frame repetition rate. Once the operator signals a desire to cease replaying, the image data files corresponding to the frames in the replayed sequence may be deleted from the mass storage device 180, to reduce the memory used. In a preferred embodiment, the replayer 103 is also arranged during the initial phase of preparing the sequence of images to cause the renderer 105 to render each frame as discussed below, so that the replayed sequence can be seen in colour and/or against the background.
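A sketch of the cycling replay loop; load_cached_frame (reading a pre-rendered image from the mass storage device 180), show (refreshing the image store 130) and stop_requested are hypothetical helpers, named here for illustration only:

```python
import itertools
import time

def replay(first, last, rate_hz, load_cached_frame, show, stop_requested):
    """Replay the cached frames between two frame numbers at the given
    repetition rate, forwards then backwards ("ping-pong"), repeatedly.
    """
    forward = list(range(first, last + 1))
    cycle = forward + forward[-2:0:-1]     # backward leg, ends not repeated
    period = 1.0 / rate_hz
    for frame_no in itertools.cycle(cycle):
        if stop_requested():
            break
        show(load_cached_frame(frame_no))  # refresh the image store
        time.sleep(period)                 # crude frame pacing
```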
Having viewed the replayed sequence or part thereof, it will often be desired to edit the sequence and in this case, the operator instructs the CPU 110 to cease replaying and commence editing.
Renderer 105
The renderer 105 may again comprise the CPU 110 operating under stored program control, or may be provided by a different computer 100. In either case, the operation of the renderer, as is conventional, is to colour the image and/or to mix the picture with a background picture. The renderer 105 therefore reads the data stored in a table 122 corresponding to a frame to be rendered, and processes the frame in accordance with predetermined stored colour and/or background information. In particular, the attribute control points stored, as described above, may include colour and other attribute information (for example transparency), the manner of rendering which is described in our British Application No. 9110945.4, PCT Application PCT/GB91/02124 and our copending PCT application PCT/GB92/ filed on the same day as the present application (Agents References 5130799), incorporated herein by reference in its entirety.
Modifications and Other Embodiments
In the foregoing, it will be noted that attribute control points are employed for several purposes: firstly, to set the values of attributes which will subsequently be used during rendering, so that a considerable amount of rendering information need be specified only for individual key frames and is automatically inserted into frames interpolated therebetween at correct positions; secondly, as a means for providing the possibility of extra curve control points, to increase the complexity where necessary without doing so otherwise, whilst maintaining topological similarity between the frames. Other applications are likewise not precluded. Likewise, attribute values need not be set at separate points to those used to define curvature, but may be provided at curvature control points; although it is very much preferred to provide the flexibility to define attributes at points along the curve segment as well.
According to the present invention, apparatus allowing the definition of lines in terms of a limited number of control points controlling the path of the lines and also allowing the specification of attribute or feature properties at predetermined points along the lines is used, as described in our International Application GB91/02124, with such points marking the points at which further curves or structures of curves are to be connected to a line.
Referring to Fig. 3, a flag may be stored indicating that one line is linked to another line or forms an assembly therewith, and the two may be displayed connected on the supervisory display 160. In this case, upon moving, scaling or otherwise varying (for example by another affine transformation) one object, parameters (e.g. the position) of the other are automatically correspondingly amended by the supervisory display editor 113.
Preferably, as shown in Fig. 34, the lines which are linked together are displayed on the supervisory display 160 with paths joined. At the join, a control point symbol J is displayed. This symbol represents a path control point for the joining path, but has no effect upon the joined path. It is constrained to be moved along the joined path, to change the point of contact of the objects. Data relating to this point within the display is therefore stored within two separate line tables, as shown in Fig. 35; in the subsidiary line table 122B, the stored data represents, as usual, position coordinates and tangent end point data, whereas in the table 122A relating to the line to which it is connected, the data stored is a position value (relative to the line, and preferably a value of the curve parameter at that position along the line) and the address within the memory 120 of the line table 122B corresponding to the joining or subsidiary line. Accordingly, when the joining point position is changed, for example by the editor 113, the supervisory display editor 113 is operable firstly to change the parametric position of the joining point within the joined line table 122A, and secondly, to access the connected line table 122B using the stored base address, calculate the actual positional coordinates of the joining point, and amend the x,y coordinates of the curve control points in the attached line by a corresponding amount.
Whenever a line is moved or transformed in any way, by varying its line table data, the editor 113 is likewise operable in this embodiment to recalculate the actual positions of the joining points of that line, and access the line tables 122B of the attached lines joined at those points, and correspondingly amend the curve control position data therein, as shown in Figure 6B, and the line generator updates the display on monitor 160.
In a first embodiment, illustrated in Figure 34, the attached curve B is attached at only one point, shown as J, on the curve A. When the curve A is altered (moved or changed in shape) the editor 113 first calculates the positional coordinates of the attachment point J on the curve A from the line table 122A. This coordinate is the new coordinate for the corresponding point on the curve B. Preferably (although not necessarily) this point is a curve control point for the curve B. In this case, the entry in the table 122A may point to the control point entry in the table 122B. The previous value of the position of this point is read from the table 122B, and the difference in coordinates between the old position and the new position is calculated by the editor 113. The editor 113 then reads each of the remaining curve control point and tangent end point coordinate data in turn from the table 122B, adds the coordinate difference, and stores the resulting shifted coordinate data back in the table 122B. The shifted line may then be displayed.
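A sketch of this shift calculation; the representation of the attached line as a flat list of coordinate pairs, with the attachment point first, is an illustrative assumption:

```python
def shift_attached_curve(joined_point_at, t_attach, attached_points):
    """Re-position a singly attached curve after the joined curve has
    been edited.

    joined_point_at(t) gives the new (x, y) position at parametric
    position t on the joined curve; t_attach is the stored attachment
    position; attached_points lists every control point and tangent
    end point of the attached curve, attachment point first.
    """
    new_join = joined_point_at(t_attach)
    old_join = attached_points[0]
    dx, dy = new_join[0] - old_join[0], new_join[1] - old_join[1]
    # Shift the whole attached curve by the same coordinate difference,
    # preserving its shape and inclination.
    return [(x + dx, y + dy) for (x, y) in attached_points]
```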
Each curve table 122 is examined and an attachment state flag 901 is read (shown in Figure 35). The attachment state flag 901 indicates whether a curve is attached to another (for example the curve B in Fig. 34), in which case its position is dictated by that of the curve to which it is attached, or whether the curve is unattached.
It would also be possible to generate in between frames by interpolating only unattached curves, the editor 113 then generating new attached curve tables for each interpolant frame. However, it is preferred to interpolate all curves, attached or unattached, as described above and then edit individually any frames where interpolation has separated the attachments between curves, or add new keyframes to reduce the detachment.
The attachment point positions along the line to which the second line is attached may be edited in the manner of other attribute points as described above and in our earlier filed international application GB91/02124; after any such variation of the attachment point, the editor 113 recalculates the positions of the attached line control points as described above.
It will be seen that the above method of amending the position of the attached curve B effectively shifts the curve B in space without changing its shape or its inclination in space as illustrated in Fig. 38. This is desirable for many applications.
The attachment between the two lines is created or broken by adding or deleting an attribute point to or from the unattached line in the same manner as described above, and its position along the line is likewise editable by the user. After breaking the attachment by deleting the attachment attribute point, the state of the flag 901 in the table 122B is changed and the line is subsequently treated as an unattached line. After moving the attachment point, the data in table 122B is recalculated as described above. It will be understood that the attachment between the two lines does not require that the two lines actually meet in space; in an alternative embodiment, the attribute point stored on the unattached line stores also data which defines the position of the second line relative to the attribute point (for example positional offset data specifying the coordinate difference between a control point of the attached curve and the attachment point of the unattached curve), and the editor 113 calculates the position of the attached curve taking into account this position data.
In a further embodiment, the editor 113 is arranged to vary the inclination of the attached curve with variations in the inclination of the curve to which it is attached. Figure 37 (which corresponds to Figure 34) illustrates this. In this embodiment, the editor 113 is arranged not only to calculate the position of the attachment point, but also the tangent at that position. The previous tangent value is retained, and the editor 113 calculates the angle through which the tangent has rotated (in other words, the angular difference between the old tangent and the present tangent) and derives the control point positions of the attached curve by firstly translating the point positions, as described above, by the positional shift undergone by the attachment point and then rotating the shifted positions about the attachment point by the same amount as the rotation undergone by the tangent at the attachment point, as shown in Fig. 40.
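A sketch of the translate-then-rotate calculation; tangent directions are taken here as angles in radians, an illustrative simplification of the stored tangent extent points:

```python
import math

def shift_and_rotate(points, old_join, new_join, old_tangent, new_tangent):
    """Translate the attached curve's points with the attachment point,
    then rotate them about the new attachment position by the angle
    through which the joined curve's tangent has turned.
    """
    dx, dy = new_join[0] - old_join[0], new_join[1] - old_join[1]
    theta = new_tangent - old_tangent        # tangent rotation
    c, s = math.cos(theta), math.sin(theta)
    out = []
    for (x, y) in points:
        x, y = x + dx, y + dy                # translate with the join
        rx, ry = x - new_join[0], y - new_join[1]
        out.append((new_join[0] + rx * c - ry * s,
                    new_join[1] + rx * s + ry * c))
    return out
```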
In a further embodiment, an attached curve may be attached at two points, either to two different curves or to two portions of the same curve. Figure 39 illustrates a case where a curve is attached at two points (in this case, at its curve control points at either end) to two other curves. In Fig. 39B, the two attachment points have been edited, and have changed position. If the attached curve is to remain attached at both points, the new attached curve position data needs to be calculated taking account of changes to both of the curves to which it is attached. In this embodiment, referring to Fig. 41, when an attached curve is to be interpolated or when the line image is edited by a user, the editor 113 first determines from the line tables 122 which curves, if any, within the image are doubly attached as shown in Fig. 39. Unattached curves and singly attached curves are processed as described in the above embodiments. For each doubly attached curve, the first step is to determine the positions, on each of the curves to which it is attached, of the attachment attribute points described in the above embodiments. These two attachment point positions define between them a connecting straight line, shown in Fig. 39. The two attachment point positions before interpolation or editing, shown in Figure 39A, define a first line, and those after editing define a second, different line. Initially, the editor 113 calculates the constants m and c defining the line y = mx + c through the two attachment points, using the attachment point coordinates. Next, the total length of the line between the two end points is calculated, as the root of the sum of the squares of the x and y coordinate differences between the two attachment points. Next, the editor 113 calculates the minimum distance between each of the control point positions and tangent end point positions along the attached curve and the connecting line between the two points of attachment (in other words, the distance off the line of each point). This is readily calculated, as the minimum distance is the distance along a normal to the line which runs through the control point concerned. For each control point, the minimum distance value is temporarily stored. The editor 113 also calculates the distance along the line of the point where each normal through each control point on the attached curve meets the line (in other words, the distance along the line of each control or tangent end point), and divides this by the total line length to provide a fractional length measurement for each control point along the line.
Thus, for each control point of the attached curve, a distance away from the line between the attached points is temporarily stored together with a distance along the line.
Next, the editor 113 calculates the coefficients m and c of the new line running through the new attachment points derived after editing or interpolation of the unattached lines, as shown in Fig. 39B. The total length of this new line is also calculated. Each control point position of the attached curve is then calculated by calculating the corresponding distance along the new line using the stored fractional distance for each control point multiplied by the new line length, and then locating the new control point position at the stored off-line distance along a normal from that point. It would alternatively be possible to store the distances off the line as fractions of the total line length, and for the editor 113 to multiply the fractions by the new line length, so as to preserve the attached curve shape without distortion (other than scaling).
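A sketch of the two halves of this calculation: a control point is first expressed as a fractional distance along, and a signed distance off, the connecting line, and is then re-placed relative to the new line:

```python
import math

def along_off(p, a, b):
    """Express point p as (fraction along, distance off) the line a-b."""
    vx, vy = b[0] - a[0], b[1] - a[1]
    length = math.hypot(vx, vy)
    ux, uy = vx / length, vy / length               # unit vector along line
    along = ((p[0] - a[0]) * ux + (p[1] - a[1]) * uy) / length
    off = (p[0] - a[0]) * -uy + (p[1] - a[1]) * ux  # signed normal distance
    return along, off

def reproject(a_new, b_new, along, off):
    """Place a point at the stored fractional distance along, and
    absolute distance off, the new line between the attachment points."""
    vx, vy = b_new[0] - a_new[0], b_new[1] - a_new[1]
    length = math.hypot(vx, vy)
    ux, uy = vx / length, vy / length
    return (a_new[0] + along * length * ux - off * uy,
            a_new[1] + along * length * uy + off * ux)
```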
The above described embodiment is effective in maintaining an attached curve smooth, and in the correct positional relationship to the two curves to which it is attached. However, as can be seen from inspection of Fig. 39B, the angle at which the attached curve meets the curve to which it is attached has not been preserved. In many line images, this would not lead to the desired final result. It would, of course, be possible for the human operator to quickly edit the attached curves (for example, in each interpolated frame), and this is considerably more effective than completely redrawing the attached curves in each interpolated frame. However, in further embodiments of the invention, the editor 113 acts to amend the attached curve to maintain a relationship between the angular position of the attached curve and that of the curve to which it is attached, at the attachment region.
Referring to Figure 42, in one such embodiment, an initial straight line between the two attachment points is calculated as described above, but after editing (or interpolation), rather than calculating a new straight line, a curved line is calculated having the curvature at its ends dictated by the change in tangent angle at the new attachment point at either end. The control point positions for the attached curve are then derived using the distances from and along this curved line. In more detail, in this embodiment the editor 113 derives initially the positions of the attachment points on each unattached curve and the tangent to the curve of each attachment point. The angle between the tangent to the curve at the attachment point and the connecting straight line is then calculated at each point. As before, the editor 113 then finds the new position for the attachment points, and also calculates the new tangent value to the unattached curve at each of the attachment points.
A new curve is then calculated to run between the two attachment points, having at each attachment point a tangent angle which is rotated from the tangent to the unattached curve by the same amount as was the straight line prior to editing, as shown in Fig. 42B. For example, the curved line may be calculated as a Bezier curve with tangent lengths of one third the length of the initial straight line. Thus, any rotation of the tangent to the unattached curve causes an equal consequent rotation in the tangent at the end of the line running between the two attachment points. The spacings of the control points of the attached curve along the new curved line connecting the two attachment points are used, as in the previous embodiment, to locate the new control point positions. As shown in Figure 42B, if the effect of editing one of the unattached curves is to warp the connecting line between the two attachment points, the attached curve control points, and hence the attached curve, follow the warp. In this embodiment, it is possible for the tangent end point positions of the attached curve to be differently amended so that tangents which were previously colinear become disjointed, which, when the attached curve is reproduced on the supervisory screen, can result in the appearance of a discontinuity in the line. This is generally visually undesirable, and consequently in a preferred embodiment of the invention the editor 113 prevents the tangents at each control point from becoming differently angularly aligned. This is achieved, in one embodiment, by arranging the editor 113 to recalculate only the positions of the tangent end points, and then to interpolate the path control point position to lie on a straight line between the two tangent ends, preferably at a distance between the two which divides the line in the same ratio as the ratio of the two tangent lengths before amendment.
In a further embodiment, the editor 113 is arranged to apply a spatial transformation to find the new control point positions as follows:
Firstly, the editor 113 finds a patch of space which includes all the control points for the attached curve prior to editing, shown as P1 in Fig. 43A.
Conveniently, this is achieved by defining a rectangle having two sides parallel to the straight line connecting the two attachment points, and two ends normal to the two sides running through the attachment points, the two sides parallel to the connecting line being defined by the greatest extents of the curve away from the connecting line (in other words, the "bounding box" of the curve). The position of each control point within this area P1 is then found, by using the sides as coordinate axes, as a pair of coordinates (u,v), scaled as a fraction of the whole length of the respective side; thus, a point in the lowest corner has coordinates (0,0) and one in the highest corner has coordinates (1,1). Next, as in the previous embodiment, the rotation of the tangent to the unattached curve at each attachment point after amendment is also calculated. A new surface area P2 is then derived, in which the two end edges run through the attachment points and are normal to the new straight line running between the two attachment points. The other two edges are parallel Bezier curves, running parallel to the curved connecting line, calculated (as described above with reference to Fig. 42) by rotating the ends of the connecting straight line to match rotations in tangent angle of the unattached curves at the attachment points.
The editor 113 has therefore derived the coefficients which define two straight lines and two Bezier curves, which define the area P2 after editing.
These coefficients are then employed to apply a Coons patch transformation to the (u,v) coordinates for each control point and tangent end point, in the manner described, for example, in "Curves and Surfaces for Computer Aided Geometric Design: A Practical Guide" (second edition) by Gerald Farin, ISBN 0-12-249051-7, published by Academic Press, for example at paragraph 20.2 thereof. The transformed coordinates thus calculated give the position within the new area P2 of each control point of the attached curve.
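A sketch of the evaluation of a bilinearly blended Coons patch at (u,v), following Farin's construction; passing the four boundary curves as functions is an illustrative interface only:

```python
def coons_point(c0, c1, d0, d1, u, v):
    """Evaluate a bilinearly blended Coons patch at (u, v).

    c0, c1 map u in [0, 1] to the two "side" boundary curves; d0, d1
    map v in [0, 1] to the two end edges.  The corners must agree,
    e.g. c0(0) == d0(0) and c1(1) == d1(1).
    """
    def mix(*terms):                       # weighted sum of 2-D points
        return (sum(w * p[0] for w, p in terms),
                sum(w * p[1] for w, p in terms))
    ruled_u = mix((1 - v, c0(u)), (v, c1(u)))
    ruled_v = mix((1 - u, d0(v)), (u, d1(v)))
    corners = mix(((1 - u) * (1 - v), c0(0)), (u * (1 - v), c0(1)),
                  ((1 - u) * v, c1(0)), (u * v, c1(1)))
    # S(u, v) = ruled surface in u + ruled surface in v - bilinear corners
    return (ruled_u[0] + ruled_v[0] - corners[0],
            ruled_u[1] + ruled_v[1] - corners[1])
```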
Referring to Figure 44, in an alternative embodiment, the initial patch P1 is the same as in the previous embodiment but the newly calculated P2 is derived to have its two end edges straight, and each end edge is rotated by an amount corresponding to the rotation of the tangent to the unattached curve at the attachment point. The other two sides, as before, correspond to the curve running between the two attachment points calculated by rotating the tangent ends as described with reference to Figure 42. Since the two end sides of the patch P2 are now in general no longer parallel, the two sides are rotated and scaled as necessary to run between the ends.
In these embodiments, as in the preceding embodiments, preferably the editor 113 prevents the breakage of parallel tangents at each control point (for example as before by transforming the tangent end points and interpolating the path control points in between the two). Equally, alternative methods of transforming the control point positions to take account of changes in shape of the region including the control point may be employed. Referring to Figure 45, a curve may be attached at more than two points; in this case, the attachments resolve to a series of either doubly attached curve segments or singly attached curved segments, and the editor 113 is preferably arranged to process each segment according to one of the above described embodiments in turn. This could lead to discontinuities at the attachment points, where tangents may be caused to become non-parallel. This problem is relatively straightforward for a human operator to overcome by appropriate editing, or alternatively the editor 113 could be arranged to subsequently make tangents at connection points parallel one to another (for example at an averaged inclination).
Referring to Figure 46, the editor 113 may permit the user to define, at a control point, a flag indicating that that control point is not to be moved with the attached curve, but is to remain fixed in space. Such a fixed point acts in effect like a further attachment point. The fixed point position and the tangent at the fixed point (both of which remain constant) are employed instead of the attachment point and attachment tangent in the above described embodiments; optionally the fixed point position may be used and the tangents ignored.
A further attached curve may be connected to an attached curve, and in this case, since the position of the second curve depends upon that of the first, it is necessary to first calculate the new curve control point positions of the first attached curve, and then from these those of the second curve are calculated by the editor 113 as described in the above embodiments. The order in which attached curves are to be drawn therefore defines a hierarchy. Clearly, a curve should not be attached to itself, since this would result in the position of the curve depending on prior knowledge of its position. Cycles in the hierarchy are therefore not permitted. An example of an application of such a hierarchical structure is in the building of the face of a cartoon character. The outline of the face is connected to the body. The hair is connected to the top of the face outline. The lower jaw outline is connected to the bottom of the face outline. The lines on the chin and on the bottom lip are connected to the lower jaw outline. The nose and eyes are connected to the sides of the face outline, by spaced connections defining a distance between the outline and the eyes, at a single connection point, so that the eyes do not deform but merely move with changes in the face. The pupils are connected to the eyes. The eyebrows are connected to the forehead and to the eyes. The other features may similarly be built up hierarchically. Thus, the movement of a line higher up in the hierarchy (for example on the body) causes the editor 113 to move all the lines lower down in the hierarchy (in the face), greatly reducing the effort required of the human animator.
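The required processing order can be obtained by a depth-first traversal of the attachment hierarchy, which also rejects cycles; a sketch, with an illustrative mapping from each curve to the curve(s) it is attached to:

```python
def update_order(curves, attached_to):
    """Return the curves ordered so that each attached curve is
    processed only after every curve it is attached to.

    attached_to maps a curve to the curve(s) it is attached to;
    unattached curves map to an empty list.  A cycle (a curve
    ultimately attached to itself) raises an error.
    """
    order, state = [], {}          # state: 1 = visiting, 2 = done

    def visit(c):
        if state.get(c) == 2:
            return
        if state.get(c) == 1:
            raise ValueError("cycle: a curve may not be attached to itself")
        state[c] = 1
        for parent in attached_to.get(c, []):
            visit(parent)
        state[c] = 2
        order.append(c)

    for c in curves:
        visit(c)
    return order

# Example hierarchy: body -> face outline -> eyes -> pupils
deps = {"face": ["body"], "eyes": ["face"], "pupils": ["eyes"], "body": []}
print(update_order(["pupils", "eyes", "face", "body"], deps))
# ['body', 'face', 'eyes', 'pupils']
```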
It would equally be possible to provide such a hierarchical structure of curve attachment by specifying the coordinate transformations between each attached curve and the curve to which it is attached, as described in relation to 3 dimensional structures in our earlier International Application no. GB91/02122. This may reduce the time taken to manipulate and edit lines initially.
The above described embodiments for manipulating a second subset of curves (attached curves) in dependence upon changes to a first subset of curves (unattached curves) can be employed to provide a "blue pencil" mode of operation as follows. Referring to Figure 47, Figure 47A shows a key frame or template line image, displayed on the monitor 160, of a cartoon character consisting of a set of curves defining the essential outline of the character. In Figure 47B, the finished shape of the character is shown, with the curves of Figure 47A represented in dotted lines. It will be seen that the "blue pencil" outline of Figure 47A provides a structure around which the "black pencil" outline which will be reproduced in the finished image is arranged. Some of the "blue pencil" lines are required to be visible in the finished image, whereas others are not.
In this embodiment of the invention, an operator can specify a first subset of lines of a line image as being for direct manipulation, and a second subset as being attached to the first subset, to be automatically manipulated by the editor 113 on manipulation of the first subset, for example using the above embodiments. The first and second subsets are processed differently by the line image generator 111. In one embodiment, the line image generator 111 is operable in two modes according to a user's preference (selected by operating the keyboard 170b to enter a mode change command, or by positioning the input device 170a at an appropriate portion of the display); a first mode in which only the first subset of lines (blue pencil lines) are generated on the display 160, and a second mode in which the second subset (and, optionally, the first) are displayed. Thus, the line image generator 111 checks the line table 122 of each curve and, in the event that the flag 901 is set to indicate that the curve is attached, the line image of that line is not generated in the first ("blue pencil") mode. The user is therefore free to edit and manipulate the displayed first subset curves as described in the above embodiments, and the display editor 113 updates the curve tables for the undisplayed attached curves. This results in a clear display which is straightforward to operate, without cluttered lines. When the display (for example, in creating a key frame) has been edited to the desired appearance in the first mode by the animator, it may be displayed in the second mode (by the animator, or an assistant animator or clean-up artist) in which the attached curves are visible as in Fig. 47B, and any minor amendments to the attached curve positions may be made as described above using the display editor 113.
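A sketch of the mode test applied by the line image generator 111, assuming each line table exposes the attachment state flag 901 as a boolean field (an illustrative representation):

```python
def lines_to_display(line_tables, blue_pencil_mode):
    """Select the line tables to be rendered in the current mode.

    In the first ("blue pencil") mode only unattached lines are drawn;
    in the second mode the attached lines are drawn as well.
    """
    if blue_pencil_mode:
        return [t for t in line_tables if not t["attached"]]
    return line_tables
```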
It would be possible in this aspect to provide for several levels of detail; for instance, a framework level in which a skeletal first subset of curves is manipulated; an outline level, in which a character outline (the curves of which are attached to the skeletal curves) may be edited; and a detailed level in which detailed curves attached to the outline curves or the skeletal curves may be edited. Each of these three subsets of curves may be separately displayed in a corresponding display, or there may be provided three modes in which only the first set, the first and second sets, and all three sets of curves are displayed. In this embodiment, an animator may edit a frame to provide a simple large scale motion such as movement of a leg by merely manipulating one framework curve, and the editor 113 consequently changes the shape of the outline and the detail attached to the outline without human intervention.
It will be understood that the essence of this aspect of the invention lies in providing separate display modes, in a first of which one set of lines are manipulatable so as to affect the appearance of a second set of lines which are displayed in a second mode. Although this aspect has been described with reference to the method of curve attachment described above, it could also be implemented using prior art techniques of curve grouping, for example; although this is not preferred. Likewise, image portions other than lines could be displayed and manipulated.
Likewise, it will be understood that it may be of benefit in other applications to provide an editable image in which levels of detail are hidden but are automatically edited in dependence upon shown portions of the image. Such applications outside animation are likewise within the scope of the invention.

Claims

1. Apparatus for generating an output image which comprises:
means for displaying said images as a plurality of lines;
store means for storing line data related to said lines;
line generating means for generating said displayed lines from said stored data; and
editing means for editing said stored data so as to change the appearance of said display;
characterised in that said lines comprise a first subset of lines and a second subset of lines, and in that the generator means is arranged to generate a display of the first subset of lines which is different to the display of the second subset of lines, and in that the editing means, after editing stored data of said first subset, is arranged to make consequential amendments of stored data of said second subset prior to displaying said second subset.
2. Apparatus according to claim 1 in which the generator means is operable in two modes; a first mode in which the first subset of lines are displayed and the second are not, and a second mode in which the second subset of lines are displayed, and in which there are provided mode selection means for selecting between the said two modes.
3. Apparatus according to claim 1 in which the store means stores link data linking each of the second subset of lines to a specified line or lines of the first subset of lines.
4. Apparatus according to claim 1 in which said line images comprise key frame images of an animated sequence.
5. Apparatus according to claim 4 wherein the store means is arranged to store data defining a template image, and there are provided means for generating each said key frame image from said template image.
6. Apparatus according to claim 1 in which the line data comprises point coordinate data of spaced apart control points defining each line.
7. Apparatus according to claim 6 wherein the control points lie on the line and the data also includes data defining the tangents to the line at each control point.
8. Apparatus according to claim 3 wherein the link data includes attachment point data specifying, for each line of the second subset, a point or points along the line or lines of the first subset by reference to which the position of that line of the second subset is defined, as a relative distance along the or each line of the first subset.
9. Apparatus according to claim 1 which is arranged, after operation of the editing means to amend line data of a line of the first subset, to operate the editing means to amend line data of any line of the second subset the position of which is defined by reference to that line of the first.
10. Apparatus according to claim 1 further comprising a third subset of lines, the positions of which are defined in dependence upon those of lines of the second subset.
11. Apparatus for generating an output image which comprises:
display means for displaying said image as a plurality of lines;
store means for storing line data related to said lines;
line generating means for generating said displayed lines from said stored data for display on said display means; and
editing means for editing said stored data so as to change the appearance of said display;
characterised in that said lines comprise a first subset of lines and a second subset of lines, in that the store means stores link data linking each of the second subset of lines to a specified line or lines of the first subset of lines, and in that, after editing stored data of said first subset, the editing means is arranged to read said link data and to make consequential amendments to stored data of said second subset prior to displaying said second subset.
12. Apparatus according to claim 11 in which the line data comprises point coordinate data of spaced apart control points defining each line of the first subset.
13. Apparatus according to claim 12 in which the line data comprises also point coordinate data of spaced apart control points defining each line of the second subset, and the editing means is arranged to make consequential amendments of the point coordinate data of the second subset after each amendment of line data of each line of the first subset to which a line of the second subset is linked.
14. Apparatus according to claim 12 in which the control points lie on the line and the data also includes data defining the tangents to the line at each control point.
15. Apparatus according to claim 11 wherein the link data includes attachment point data specifying, for each line of the second subset, a point or points along the line or lines of the first subset by reference to which the position of that line of the second subset is defined.
16. Apparatus according to claim 15 in which the attachment point data is defined as a relative distance along the or each said line of the first subset.
17. Apparatus according to claim 15 in which the editing means is arranged, after editing a line of the first subset, to determine the spatial position in the image of said defining point or points, and to amend the stored data for the line of the second subset which is defined by reference to said defining point.
18. Apparatus according to claim 17 wherein the editing means is arranged to determine the change in position of the defining point, and to translate the coordinates of points defining the curve of the second subset in dependence upon said change.
19. Apparatus according to claim 17 in which the editing means is arranged to rotate the spatial coordinates of points of the curve of the second subset in dependence upon changes in direction of the curve of the first subset at the defining point.
20. Apparatus according to claim 17 in which, where a curve of the second subset is attached twice to curves of the first subset, the editing means is arranged to vary the stored data of the second subset curve in joint dependence upon the coordinates of the two defining points.
21. Apparatus according to claim 20 in which the editing means is arranged to amend the stored data for the curve of the second subset so as to preserve a constant predetermined angular relationship between the curve of the second subset and that of the first subset at the or each defining point.
22. Apparatus according to claim 20 in which the editing means is arranged to effect a spatial transform operation on coordinates of control point data defining the curve of the second subset within a spatial area lying between the defining points.
23. Apparatus according to claim 11, in which said subsets are not exclusive and lines of the second subset are linked to other lines within the second subset and the first, stored data for each line of the second subset being amended after amending all lines to which it is linked.
24. Image processing apparatus in which first portions of an image are displayed and edited, and second portions are linked to the first, and are not displayed, but are edited in dependence upon changes to the first.
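Claims 20 to 22 contemplate a second subset curve attached at two defining points. A minimal sketch under the same assumptions as above blends the two defining-point displacements across the attached curve, a simple stand-in for the claimed spatial transform between the defining points:

def amend_doubly_attached(old_a, new_a, old_b, new_b, attached_points):
    """Joint dependence on two defining points: each control point of the attached
    curve moves by a mixture of the two displacements, weighted by its normalised
    projection onto the axis joining the old defining points."""
    da = (new_a[0] - old_a[0], new_a[1] - old_a[1])
    db = (new_b[0] - old_b[0], new_b[1] - old_b[1])
    span2 = max(1e-9, (old_b[0] - old_a[0])**2 + (old_b[1] - old_a[1])**2)
    amended = []
    for x, y in attached_points:
        w = ((x - old_a[0]) * (old_b[0] - old_a[0]) +
             (y - old_a[1]) * (old_b[1] - old_a[1])) / span2
        w = min(1.0, max(0.0, w))   # clamp to the region between the defining points
        amended.append((x + (1 - w) * da[0] + w * db[0],
                        y + (1 - w) * da[1] + w * db[1]))
    return amended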
PCT/GB1992/000927 1990-11-30 1992-05-21 Animation WO1992021095A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
JP4510508A JPH06507742A (en) 1991-05-21 1992-05-21 Video creation device

Applications Claiming Priority (9)

Application Number Priority Date Filing Date Title
GB909026120A GB9026120D0 (en) 1990-11-30 1990-11-30 Computer animation and graphics systems
GB919100632A GB9100632D0 (en) 1991-01-11 1991-01-11 Animation system
GB919102125A GB9102125D0 (en) 1991-01-31 1991-01-31 Animation system
GB9110945A GB2256118A (en) 1991-05-21 1991-05-21 Image synthesis and processing
GB9110945.4 1991-05-21
GB9117409A GB2258790A (en) 1991-08-12 1991-08-12 Animation
GB9117409.4 1991-08-12
GBPCT/GB91/02124 1991-11-29
GBPCT/GB91/02122 1991-11-29

Publications (1)

Publication Number Publication Date
WO1992021095A1 true WO1992021095A1 (en) 1992-11-26

Family

ID=27517011

Family Applications (4)

Application Number Title Priority Date Filing Date
PCT/GB1991/002122 WO1992009965A1 (en) 1990-11-30 1991-11-29 Animation
PCT/GB1991/002124 WO1992009966A1 (en) 1990-11-30 1991-11-29 Image synthesis and processing
PCT/GB1992/000928 WO1992021096A1 (en) 1990-11-30 1992-05-21 Image synthesis and processing
PCT/GB1992/000927 WO1992021095A1 (en) 1990-11-30 1992-05-21 Animation

Family Applications Before (3)

Application Number Title Priority Date Filing Date
PCT/GB1991/002122 WO1992009965A1 (en) 1990-11-30 1991-11-29 Animation
PCT/GB1991/002124 WO1992009966A1 (en) 1990-11-30 1991-11-29 Image synthesis and processing
PCT/GB1992/000928 WO1992021096A1 (en) 1990-11-30 1992-05-21 Image synthesis and processing

Country Status (5)

Country Link
US (2) US5692117A (en)
EP (3) EP0559708A1 (en)
JP (2) JPH06503663A (en)
AU (2) AU9015891A (en)
WO (4) WO1992009965A1 (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
GB2284524A (en) * 1993-12-02 1995-06-07 Fujitsu Ltd Graphic editing apparatus and method
US5854634A (en) * 1995-12-26 1998-12-29 Imax Corporation Computer-assisted animation construction system using source poses within a pose transformation space

Families Citing this family (183)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
IL106410A (en) * 1992-08-06 1996-09-12 Hughes Training Inc Interactive computerized witness interrogation recording tool
EP0589658B1 (en) * 1992-09-21 2002-07-17 Matsushita Electric Industrial Co., Ltd. Superimposing of graphic data with graphic parameter store
JPH07146931A (en) * 1993-03-08 1995-06-06 Canon Inf Syst Res Australia Pty Ltd Picture generating method
GB2277856A (en) * 1993-04-05 1994-11-09 Cambridge Animation Syst Computer generating animated sequence of pictures
AU706423B2 (en) * 1994-09-13 1999-06-17 Canon Kabushiki Kaisha Edge to edge blends
EP0702332B1 (en) 1994-09-13 2002-01-23 Canon Kabushiki Kaisha Edge to edge blends
JPH08202850A (en) * 1995-01-26 1996-08-09 Sony Corp Paper fiber structure data generating method and device, paper fiber structure data and blotting plotting method and device
CA2167237A1 (en) * 1995-02-17 1996-08-18 Steven Charles Dzik Line smoothing techniques
WO1996036945A1 (en) 1995-05-19 1996-11-21 Sega Enterprises, Ltd. Picture processing device, picture processing method, game device using same, and memory medium
AUPN360295A0 (en) * 1995-06-16 1995-07-13 Canon Information Systems Research Australia Pty Ltd Blend control system
US6028593A (en) * 1995-12-01 2000-02-22 Immersion Corporation Method and apparatus for providing simulated physical interactions within computer generated environments
JP3785700B2 (en) * 1995-12-18 2006-06-14 ソニー株式会社 Approximation method and apparatus
FR2743241B1 (en) * 1995-12-28 1998-02-13 Sagem METHOD FOR MODIFYING THE RESOLUTION OF A DIGITAL IMAGE
US5764814A (en) * 1996-03-22 1998-06-09 Microsoft Corporation Representation and encoding of general arbitrary shapes
JPH09326990A (en) * 1996-06-07 1997-12-16 Matsushita Electric Ind Co Ltd Video editor
US5982389A (en) * 1996-06-17 1999-11-09 Microsoft Corporation Generating optimized motion transitions for computer animated objects
US5889532A (en) * 1996-08-02 1999-03-30 Avid Technology, Inc. Control solutions for the resolution plane of inverse kinematic chains
US6115051A (en) * 1996-08-07 2000-09-05 Adobe Systems Incorporated Arc-length reparameterization
JP3211679B2 (en) * 1996-09-25 2001-09-25 松下電器産業株式会社 Editing device and editing method
US5977319A (en) * 1996-10-21 1999-11-02 Cambridge Antibody Technology Limited Specific binding members for estradiol; materials and methods
US6252604B1 (en) * 1997-01-10 2001-06-26 Tom Snyder Productions, Inc. Method of animating an image by squiggling the edges of image features
US7616198B2 (en) * 1998-02-20 2009-11-10 Mental Images Gmbh System and computer-implemented method for modeling the three-dimensional shape of an object by shading of a two-dimensional image of the object
US6400368B1 (en) * 1997-03-20 2002-06-04 Avid Technology, Inc. System and method for constructing and using generalized skeletons for animation models
US5999195A (en) * 1997-03-28 1999-12-07 Silicon Graphics, Inc. Automatic generation of transitions between motion cycles in an animation
CA2233380A1 (en) * 1997-04-04 1998-10-04 Microsoft Corporation Parametric function curve editing
US6128001A (en) * 1997-04-04 2000-10-03 Avid Technology, Inc. Methods and apparatus for changing a color of an image
US6351264B1 (en) * 1997-05-20 2002-02-26 Adam S. Iga Method for computer image color shading painting or recreation
JPH1118071A (en) * 1997-06-25 1999-01-22 Nec Corp Slow reproduction system
US6072502A (en) * 1997-06-25 2000-06-06 Adobe Systems Incorporated Characterization of corners of curvilinear segment
US6271864B1 (en) * 1997-06-30 2001-08-07 Sun Microsystems, Inc. Representing a path as an object with transformation capability
JP3047007B2 (en) * 1997-09-26 2000-05-29 株式会社島精機製作所 Image processing device
US6070167A (en) * 1997-09-29 2000-05-30 Sharp Laboratories Of America, Inc. Hierarchical method and system for object-based audiovisual descriptive tagging of images for information retrieval, editing, and manipulation
US5977965A (en) * 1997-09-29 1999-11-02 Intergraph Corporation Automatic frame accumulator
US6307576B1 (en) * 1997-10-02 2001-10-23 Maury Rosenfeld Method for automatically animating lip synchronization and facial expression of animated characters
US6119123A (en) * 1997-12-02 2000-09-12 U.S. Philips Corporation Apparatus and method for optimizing keyframe and blob retrieval and storage
US6260044B1 (en) 1998-02-04 2001-07-10 Nugenesis Technologies Corporation Information storage and retrieval system for storing and retrieving the visual form of information from an application in a database
US6404435B1 (en) * 1998-04-03 2002-06-11 Avid Technology, Inc. Method and apparatus for three-dimensional alphanumeric character animation
WO1999052063A1 (en) * 1998-04-05 1999-10-14 Automedia Ltd. Feature motivated tracking and processing
US6240198B1 (en) * 1998-04-13 2001-05-29 Compaq Computer Corporation Method for figure tracking using 2-D registration
US6269172B1 (en) * 1998-04-13 2001-07-31 Compaq Computer Corporation Method for tracking the motion of a 3-D figure
US6256418B1 (en) 1998-04-13 2001-07-03 Compaq Computer Corporation Method and system for compressing a sequence of images including a moving figure
US6323879B1 (en) * 1998-05-14 2001-11-27 Autodesk, Inc. Method and system for determining the spacing of objects
US6317125B1 (en) 1998-06-19 2001-11-13 Interplay Entertainment Corp. Saxs video object generation engine
JP3432149B2 (en) * 1998-07-13 2003-08-04 株式会社島精機製作所 Image processing method and apparatus
US6377259B2 (en) * 1998-07-29 2002-04-23 Inxight Software, Inc. Presenting node-link structures with modification
US7536706B1 (en) 1998-08-24 2009-05-19 Sharp Laboratories Of America, Inc. Information enhanced audio video encoding system
AUPP557898A0 (en) * 1998-08-28 1998-09-24 Canon Kabushiki Kaisha Method and apparatus for orientating a character stroke
GB2342026B (en) * 1998-09-22 2003-06-11 Luvvy Ltd Graphics and image processing system
US6535213B1 (en) * 1998-09-22 2003-03-18 Sony Corporation Curve edition system, curve-loop detecting system, curve-loop removing system
US6201551B1 (en) * 1998-09-30 2001-03-13 Xerox Corporation PDL operator overloading for line width management
US6246419B1 (en) * 1998-09-30 2001-06-12 Xerox Corporation PDL operator overloading for line width management
US6331854B1 (en) * 1998-10-05 2001-12-18 Azi International Srl Method and apparatus for accelerating animation in a video graphics system
JP3427973B2 (en) * 1998-12-09 2003-07-22 日本電気株式会社 Object display description document conversion device and browser
JP4288449B2 (en) * 1999-02-16 2009-07-01 株式会社セガ Image display device, image processing device, and image display system
US7188353B1 (en) 1999-04-06 2007-03-06 Sharp Laboratories Of America, Inc. System for presenting synchronized HTML documents in digital television receivers
US6512522B1 (en) * 1999-04-15 2003-01-28 Avid Technology, Inc. Animation of three-dimensional characters along a path for motion video sequences
WO2000063843A1 (en) * 1999-04-16 2000-10-26 Avid Technology, Inc. A method and apparatus for hierarchically combining regions
US6870550B1 (en) * 1999-04-26 2005-03-22 Adobe Systems Incorporated Digital Painting
US6681043B1 (en) * 1999-08-16 2004-01-20 University Of Washington Interactive video object processing environment which visually distinguishes segmented video object
US6633300B1 (en) * 1999-12-22 2003-10-14 Adobe Systems Incorporated Method and apparatus for painting groups of objects
US7082436B1 (en) 2000-01-05 2006-07-25 Nugenesis Technologies Corporation Storing and retrieving the visual form of data
GB2360919A (en) * 2000-01-20 2001-10-03 Anthropics Technology Ltd Appearance modelling
TWI282957B (en) * 2000-05-09 2007-06-21 Sharp Kk Drive circuit, and image display device incorporating the same
US7647340B2 (en) 2000-06-28 2010-01-12 Sharp Laboratories Of America, Inc. Metadata in JPEG 2000 file format
US7079158B2 (en) * 2000-08-31 2006-07-18 Beautyriot.Com, Inc. Virtual makeover system and method
EP1187066A3 (en) * 2000-09-01 2004-04-21 Sony Computer Entertainment Inc. Method and apparatus for image enlargement/reduction
AU2001292202A1 (en) * 2000-09-19 2002-04-02 Technion Research And Development Foundation Ltd. Method and apparatus for shape deformation and placement
US7006694B1 (en) * 2000-10-05 2006-02-28 Coreco Imaging, Inc. System and method for pattern identification
EP1207498A3 (en) * 2000-11-15 2003-10-15 Sega Corporation Display object generation method in information processing equipment
US6765589B1 (en) * 2000-11-16 2004-07-20 Adobe Systems Incorporated Brush for warping and water reflection effects
CN1537300A (en) * 2000-12-22 2004-10-13 Communication system
US20040135788A1 (en) * 2000-12-22 2004-07-15 Davidson Colin Bruce Image processing system
GB2370709A (en) * 2000-12-28 2002-07-03 Nokia Mobile Phones Ltd Displaying an image and associated visual effect
NO313477B1 (en) * 2001-01-08 2002-10-07 Simsurgery As Method and system for simulating a thread in computer-based graphical simulations
US8750382B2 (en) 2001-01-23 2014-06-10 Kenneth Martin Jacobs System and method for calculating 3Deeps action specs motion estimation from the motion vectors in an MPEG file
US9781408B1 (en) 2001-01-23 2017-10-03 Visual Effect Innovations, Llc Faster state transitioning for continuous adjustable 3Deeps filter spectacles using multi-layered variable tint materials
US10742965B2 (en) 2001-01-23 2020-08-11 Visual Effect Innovations, Llc Faster state transitioning for continuous adjustable 3Deeps filter spectacles using multi-layered variable tint materials
US20020130872A1 (en) * 2001-03-15 2002-09-19 Elena Novikova Methods and systems for conflict resolution, summation, and conversion of function curves
US6963350B1 (en) * 2001-07-03 2005-11-08 Adobe Systems Incorporated Painting interface to computer drawing system curve editing
KR20010113584A (en) * 2001-11-08 2001-12-28 (주)시스튜디오 a method for providing comics-animation by computers and a computer-readable medium storing data of comics-animation
US7026960B2 (en) * 2001-11-27 2006-04-11 Samsung Electronics Co., Ltd. Method and apparatus for encoding and decoding key data
KR100480787B1 (en) * 2001-11-27 2005-04-07 삼성전자주식회사 Encoding/decoding method and apparatus for key value of coordinate interpolator node
US7336713B2 (en) * 2001-11-27 2008-02-26 Samsung Electronics Co., Ltd. Method and apparatus for encoding and decoding data
KR100426313B1 (en) * 2001-12-28 2004-04-06 한국전자통신연구원 Method for modifying posture of an articulated object in manufacturing picture
AU2003200347C1 (en) * 2002-02-07 2005-04-21 Canon Kabushiki Kaisha A Method for Stroking Flattened Paths
US7199805B1 (en) * 2002-05-28 2007-04-03 Apple Computer, Inc. Method and apparatus for titling
US6822653B2 (en) * 2002-06-28 2004-11-23 Microsoft Corporation Methods and system for general skinning via hardware accelerators
US6970169B1 (en) 2002-09-24 2005-11-29 Adobe Systems Incorporated Digitally synthesizing seamless texture having random variations
US7809204B2 (en) 2002-10-18 2010-10-05 Samsung Electronics Co., Ltd. Method and apparatus for encoding and decoding key value data of coordinate interpolator
US7319764B1 (en) * 2003-01-06 2008-01-15 Apple Inc. Method and apparatus for controlling volume
AU2003900809A0 (en) * 2003-02-24 2003-03-13 Aristocrat Technologies Australia Pty Ltd Gaming machine transitions
US7333111B2 (en) * 2003-04-25 2008-02-19 Honda Giken Kogyo Kabushiki Kaisha Joint component framework for modeling complex joint behavior
US7164423B1 (en) * 2003-04-30 2007-01-16 Apple Computer, Inc. Method and apparatus for providing an animated representation of a reorder operation
US7259764B2 (en) * 2003-05-14 2007-08-21 Pixar Defrobulated angles for character joint representation
GB2418475B (en) 2003-06-09 2007-10-24 Immersion Corp Interactive gaming systems with haptic feedback
US7372464B2 (en) * 2003-07-21 2008-05-13 Autodesk, Inc. Processing image data
US7317457B2 (en) * 2003-07-21 2008-01-08 Autodesk, Inc. Processing image data
US7164410B2 (en) * 2003-07-28 2007-01-16 Sig G. Kupka Manipulating an on-screen object using zones surrounding the object
GB2406028A (en) * 2003-09-11 2005-03-16 Autodesk Canada Inc Tangent handle adjustment for Bezier curves
US7593015B2 (en) * 2003-11-14 2009-09-22 Kyocera Wireless Corp. System and method for sequencing media objects
US8237712B2 (en) * 2004-03-18 2012-08-07 Apple Inc. Manipulation of image content using various image representations
US8121338B2 (en) * 2004-07-07 2012-02-21 Directsmile Gmbh Process for generating images with realistic text insertion
US20060093309A1 (en) * 2004-10-05 2006-05-04 Magix Ag System and method for creating a photo movie
US7376894B2 (en) * 2004-11-18 2008-05-20 Microsoft Corporation Vector path merging into gradient elements
US20060130679A1 (en) 2004-12-20 2006-06-22 Dubois Radford E Iii Automated cutting system for customized field stencils
US7920144B2 (en) * 2005-01-18 2011-04-05 Siemens Medical Solutions Usa, Inc. Method and system for visualization of dynamic three-dimensional virtual objects
JP4866013B2 (en) * 2005-03-31 2012-02-01 富士通株式会社 Character image generation program, system thereof, and method thereof
US7830384B1 (en) * 2005-04-27 2010-11-09 Image Metrics Limited Animating graphical objects using input video
US8260056B2 (en) * 2005-08-19 2012-09-04 Telefonaktiebolaget Lm Ericsson (Publ) Resizing video and aligning video image edges to block boundaries and picture boundaries
US20070091112A1 (en) * 2005-10-20 2007-04-26 Pfrehm Patrick L Method system and program for time based opacity in plots
US7782324B2 (en) * 2005-11-23 2010-08-24 Dreamworks Animation Llc Non-hierarchical unchained kinematic rigging technique and system for animation
JP5111772B2 (en) * 2006-03-24 2013-01-09 株式会社沖データ Printing device
US8281281B1 (en) * 2006-06-07 2012-10-02 Pixar Setting level of detail transition points
US8902233B1 (en) 2006-06-09 2014-12-02 Pixar Driving systems extension
US7965294B1 (en) * 2006-06-09 2011-06-21 Pixar Key frame animation with path-based motion
US8147315B2 (en) * 2006-09-12 2012-04-03 Aristocrat Technologies Australia Ltd Gaming apparatus with persistent game attributes
US8081187B2 (en) * 2006-11-22 2011-12-20 Autodesk, Inc. Pencil strokes for vector based drawing elements
US7697002B2 (en) * 2007-01-25 2010-04-13 Ricoh Co. Ltd. Varying hand-drawn line width for display
US7884834B2 (en) * 2007-04-13 2011-02-08 Apple Inc. In-context paint stroke characteristic adjustment
US20080291212A1 (en) * 2007-05-23 2008-11-27 Dean Robert Gary Anderson As Trustee Of D/L Anderson Family Trust Software for creating engraved images
US20080310747A1 (en) * 2007-05-23 2008-12-18 Dean Robert Gary Anderson As Trustee Of D/L Anderson Family Trust Software for creating engraved images
KR100917887B1 (en) * 2007-06-11 2009-09-16 삼성전자주식회사 Graphic processing method and apparatus for supporting line acceleration function
US8000529B2 (en) * 2007-07-11 2011-08-16 Hewlett-Packard Development Company, L.P. System and method for creating an editable template from a document image
US20090058863A1 (en) * 2007-09-04 2009-03-05 Apple Inc. Image animation with transitional images
US20090079743A1 (en) * 2007-09-20 2009-03-26 Flowplay, Inc. Displaying animation of graphic object in environments lacking 3d rendering capability
US8310483B2 (en) * 2007-11-20 2012-11-13 Dreamworks Animation Llc Tinting a surface to simulate a visual effect in a computer generated scene
US8134558B1 (en) 2007-12-06 2012-03-13 Adobe Systems Incorporated Systems and methods for editing of a computer-generated animation across a plurality of keyframe pairs
US8253728B1 (en) * 2008-02-25 2012-08-28 Lucasfilm Entertainment Company Ltd. Reconstituting 3D scenes for retakes
US20090284532A1 (en) * 2008-05-16 2009-11-19 Apple Inc. Cursor motion blurring
US20090295791A1 (en) * 2008-05-29 2009-12-03 Microsoft Corporation Three-dimensional environment created from video
JP4561883B2 (en) * 2008-06-19 2010-10-13 コニカミノルタビジネステクノロジーズ株式会社 Image forming apparatus, program, and image forming processing method
US8788963B2 (en) * 2008-10-15 2014-07-22 Apple Inc. Scrollable preview of content
DE102008057512A1 (en) * 2008-11-15 2010-07-01 Diehl Aerospace Gmbh Method for displaying line trains
WO2010083272A1 (en) * 2009-01-15 2010-07-22 Simquest Llc Interactive simulation of biological tissue
WO2010129263A2 (en) * 2009-04-27 2010-11-11 Sonoma Data Solutions Llc A method and apparatus for character animation
US8566721B2 (en) * 2009-04-30 2013-10-22 Apple Inc. Editing key-indexed graphs in media editing applications
US8286081B2 (en) 2009-04-30 2012-10-09 Apple Inc. Editing and saving key-indexed geometries in media editing applications
KR101080255B1 (en) * 2009-07-21 2011-11-08 (주)펜앤프리 Apparatus and method for inputting handwriting in accordance with the handwriting pattern
US9672646B2 (en) * 2009-08-28 2017-06-06 Adobe Systems Incorporated System and method for image editing using visual rewind operation
JP5476103B2 (en) 2009-11-27 2014-04-23 富士フイルム株式会社 Page description data processing apparatus, method and program
US20110276891A1 (en) * 2010-05-06 2011-11-10 Marc Ecko Virtual art environment
US8860734B2 (en) 2010-05-12 2014-10-14 Wms Gaming, Inc. Wagering game object animation
JP5494337B2 (en) 2010-07-30 2014-05-14 ソニー株式会社 Information processing apparatus, information processing method, and information processing program
JP5552947B2 (en) 2010-07-30 2014-07-16 ソニー株式会社 Information processing apparatus, display control method, and display control program
US9305398B2 (en) * 2010-10-08 2016-04-05 City University Of Hong Kong Methods for creating and displaying two and three dimensional images on a digital canvas
US8861890B2 (en) * 2010-11-24 2014-10-14 Douglas Alan Lefler System and method for assembling and displaying individual images as a continuous image
US8988461B1 (en) 2011-01-18 2015-03-24 Disney Enterprises, Inc. 3D drawing and painting system with a 3D scalar field
US9142056B1 (en) * 2011-05-18 2015-09-22 Disney Enterprises, Inc. Mixed-order compositing for images having three-dimensional painting effects
JP5741282B2 (en) * 2011-07-26 2015-07-01 カシオ計算機株式会社 Image processing apparatus, image processing method, and program
US8907957B2 (en) 2011-08-30 2014-12-09 Apple Inc. Automatic animation generation
US9164576B2 (en) 2011-09-13 2015-10-20 Apple Inc. Conformance protocol for heterogeneous abstractions for defining user interface behaviors
US8819567B2 (en) 2011-09-13 2014-08-26 Apple Inc. Defining and editing user interface behaviors
US9478058B2 (en) * 2012-08-06 2016-10-25 CELSYS, Inc. Object correcting apparatus and method and computer-readable recording medium
US8947216B2 (en) * 2012-11-02 2015-02-03 Immersion Corporation Encoding dynamic haptic effects
US9898084B2 (en) 2012-12-10 2018-02-20 Immersion Corporation Enhanced dynamic haptic effects
US20140198080A1 (en) * 2013-01-11 2014-07-17 Research In Motion Limited Method and Apparatus Pertaining to Pre-Associated Stylus-Input User Preferences
JP6472171B2 (en) * 2013-08-27 2019-02-20 キヤノン株式会社 Image processing apparatus and method
US8986691B1 (en) 2014-07-15 2015-03-24 Kymab Limited Method of treating atopic dermatitis or asthma using antibody to IL4RA
US8980273B1 (en) 2014-07-15 2015-03-17 Kymab Limited Method of treating atopic dermatitis or asthma using antibody to IL4RA
JP6307873B2 (en) * 2013-12-24 2018-04-11 富士通株式会社 Object line detection apparatus, method, and program
USD906348S1 (en) * 2014-11-26 2020-12-29 Intergraph Corporation Computer display screen or portion thereof with graphic
US10089291B2 (en) 2015-02-27 2018-10-02 Microsoft Technology Licensing, Llc Ink stroke editing and manipulation
US9792723B2 (en) * 2015-04-07 2017-10-17 Disney Enterprises, Inc. Method and system for progressively sculpting three-dimensional geometry
EP3286718A4 (en) 2015-04-23 2018-12-05 Hasbro, Inc. Context-aware digital play
US9740310B2 (en) * 2015-05-22 2017-08-22 Adobe Systems Incorporated Intuitive control of pressure-sensitive stroke attributes
US9741133B2 (en) * 2015-09-29 2017-08-22 Adobe Systems Incorporated Identifying shapes in an image by comparing Bézier curves
US20170236318A1 (en) * 2016-02-15 2017-08-17 Microsoft Technology Licensing, Llc Animated Digital Ink
JP6062589B1 (en) 2016-04-28 2017-01-18 株式会社Live2D Program, information processing apparatus, influence derivation method, image generation method, and recording medium
US10299750B2 (en) * 2016-08-05 2019-05-28 Toshiba Medical Systems Corporation Medical image processing apparatus and X-ray CT apparatus
CN106476479B (en) * 2016-09-20 2017-10-17 北京理工大学 A variable drawing-ratio drawing dolly supporting freehand drawing and SVG file import
JP6930091B2 (en) * 2016-11-15 2021-09-01 富士フイルムビジネスイノベーション株式会社 Image processing equipment, image processing methods, image processing systems and programs
WO2018115469A1 (en) * 2016-12-22 2018-06-28 Episurf Ip-Management Ab System and method for optimizing an implant position in an anatomical joint
US10712840B2 (en) * 2017-10-13 2020-07-14 Dell Products L.P. Active pen system
US10424086B2 (en) * 2017-11-16 2019-09-24 Adobe Inc. Oil painting stroke simulation using neural network
US10510186B2 (en) 2017-12-22 2019-12-17 Adobe Inc. Digital media environment for intuitive modifications of digital graphics
US10388045B2 (en) 2018-01-04 2019-08-20 Adobe Inc. Generating a triangle mesh for an image represented by curves
BR102018004967A2 (en) * 2018-03-13 2019-10-01 Samsung Eletrônica da Amazônia Ltda. METHOD FOR PROCESSING MOVEMENT OF VIRTUAL POINTERS
US10410317B1 (en) * 2018-03-26 2019-09-10 Adobe Inc. Digital image transformation environment using spline handles
RU2702498C1 (en) * 2018-05-15 2019-10-08 Юрий Александрович Акименко Method of converting main types into sets of axonometric views
US10832446B2 (en) 2019-01-07 2020-11-10 Adobe Inc. Bone handle generation
US10943375B2 (en) 2019-04-17 2021-03-09 Adobe Inc. Multi-state vector graphics
US11630504B2 (en) * 2021-03-16 2023-04-18 Htc Corporation Handheld input device and electronic system
US11631207B2 (en) 2021-09-09 2023-04-18 Adobe Inc. Vector object stylization from raster objects

Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO1989009458A1 (en) * 1988-03-22 1989-10-05 Strandberg Oerjan Method and device for computerized animation

Family Cites Families (35)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US3364382A (en) * 1967-01-03 1968-01-16 Control Image Corp Automatic generation and display of animated figures
BE793543A (en) * 1971-12-30 1973-04-16 Ibm MECHANISM POSITION CODING METHODS
US3898438A (en) * 1972-09-28 1975-08-05 Walt Disney Prod Programmable method for digital animation apparatus for assembling animation data
GB1437795A (en) * 1973-07-04 1976-06-03 Computer Image Corp Digitally controlled computer animation generating system
US4189743A (en) * 1976-12-20 1980-02-19 New York Institute Of Technology Apparatus and method for automatic coloration and/or shading of images
US4189744A (en) * 1976-12-20 1980-02-19 New York Institute Of Technology Apparatus for generating signals representing operator-selected portions of a scene
DE2806820C2 (en) * 1978-02-17 1982-02-25 Messerschmitt-Bölkow-Blohm GmbH, 8000 München Method for the synthetic generation of animated films
JPS5837543A (en) * 1981-08-31 1983-03-04 Meidensha Electric Mfg Co Ltd Analysis of curing agent in epoxy resin
NL8300872A (en) * 1983-03-10 1984-10-01 Philips Nv MULTIPROCESSOR CALCULATOR SYSTEM FOR PROCESSING A COLORED IMAGE OF OBJECT ELEMENTS DEFINED IN A HIERARCHICAL DATA STRUCTURE.
US4600919A (en) * 1982-08-03 1986-07-15 New York Institute Of Technology Three dimensional animation
US4582520A (en) * 1982-09-30 1986-04-15 Owens-Corning Fiberglas Corporation Methods and apparatus for measuring and controlling curing of polymeric materials
US4620287A (en) * 1983-01-20 1986-10-28 Dicomed Corporation Method and apparatus for representation of a curve of uniform width
US4646075A (en) * 1983-11-03 1987-02-24 Robert Bosch Corporation System and method for a data processing pipeline
US4739317A (en) * 1984-03-30 1988-04-19 International Business Machines Corporation Draw graphics capabilities
US4683468A (en) * 1985-03-11 1987-07-28 International Business Machines Corp. Method for manipulation of graphic sub-objects in an interactive draw graphic system
JPS62103540A (en) * 1985-10-30 1987-05-14 Mitsubishi Heavy Ind Ltd Method for measuring curing time of organic adhesive
US4764763A (en) * 1985-12-13 1988-08-16 The Ohio Art Company Electronic sketching device
EP0246340A1 (en) * 1986-05-17 1987-11-25 Andreas Dipl.-Math. Wetjen Method to simulate the movements of human dummies
US4760548A (en) * 1986-06-13 1988-07-26 International Business Machines Corporation Method and apparatus for producing a curve image
JPH0785271B2 (en) * 1986-06-27 1995-09-13 株式会社日立製作所 Shape modeling method
JPS63109581A (en) * 1986-10-27 1988-05-14 Video Toron Kk Animation picture processor
JPH0743774B2 (en) * 1986-12-05 1995-05-15 富士通株式会社 Animation creation processing device
US4897638A (en) * 1987-02-27 1990-01-30 Hitachi, Ltd. Method for generating character patterns with controlled size and thickness
AU2999189A (en) * 1988-02-15 1989-08-17 Information Concepts Pty. Ltd. Electronic drawing tools
EP0342752B1 (en) * 1988-05-20 1997-08-06 Koninklijke Philips Electronics N.V. A computer method and an aparatus for generating a display picture representing a set of object elements including a brush object element
DE3821322A1 (en) * 1988-06-24 1990-01-04 Rolf Prof Dr Walter Method of controlling a graphic output device
US5025394A (en) * 1988-09-09 1991-06-18 New York Institute Of Technology Method and apparatus for generating animated images
US4952051A (en) * 1988-09-27 1990-08-28 Lovell Douglas C Method and apparatus for producing animated drawings and in-between drawings
CA1329433C (en) * 1988-10-24 1994-05-10 Lemuel L. Davis Computer animation production system
US5233671A (en) * 1989-02-22 1993-08-03 Ricoh Company Ltd. Image coding method for coding characters using a modified Bezier curve
US5155805A (en) * 1989-05-08 1992-10-13 Apple Computer, Inc. Method and apparatus for moving control points in displaying digital typeface on raster output devices
US5214758A (en) * 1989-11-14 1993-05-25 Sony Corporation Animation producing apparatus
US5155813A (en) * 1990-01-08 1992-10-13 Wang Laboratories, Inc. Computer apparatus for brush styled writing
BE1004117A5 (en) * 1990-04-20 1992-09-29 Neurones Method and device for storing data and animation.
US5416899A (en) * 1992-01-13 1995-05-16 Massachusetts Institute Of Technology Memory based method and apparatus for computer graphics

Patent Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO1989009458A1 (en) * 1988-03-22 1989-10-05 Strandberg Oerjan Method and device for computerized animation

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
COMMUNICATIONS OF THE ASSOCIATION FOR COMPUTING MACHINERY. vol. 19, no. 10, October 1976, NEW YORK US pages 564 - 569; BURTNYK AND WEIN: 'INTERACTIVE SKELETON TECHNIQUES FOR ENHANCING MOTION DYNAMICS IN KEY-FRAME ANIMATION' *
COMPUTER GRAPHICS August 1980, FAREHAM GB pages 37 - 55; DIMENT: 'COMPUTER AIDED CHARACTER ANIMATION ACTIVITIES ON BOTH SIDES OF THE ATLANTIC' *

Cited By (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
GB2284524A (en) * 1993-12-02 1995-06-07 Fujitsu Ltd Graphic editing apparatus and method
US5642475A (en) * 1993-12-02 1997-06-24 Fujitsu Limited Apparatus and method for editing graphics or group graphics displayed on a screen and correlated to each other
GB2284524B (en) * 1993-12-02 1998-08-05 Fujitsu Ltd Graphic editing apparatus and method
US5854634A (en) * 1995-12-26 1998-12-29 Imax Corporation Computer-assisted animation construction system using source poses within a pose transformation space
US6373492B1 (en) 1995-12-26 2002-04-16 Imax Corporation Computer-assisted animation construction system and method and user interface
US6577315B1 (en) 1995-12-26 2003-06-10 Imax Corporation Computer-assisted animation construction system and method and user interface

Also Published As

Publication number Publication date
WO1992009965A1 (en) 1992-06-11
AU8932191A (en) 1992-06-25
US5692117A (en) 1997-11-25
WO1992021096A1 (en) 1992-11-26
JPH06505817A (en) 1994-06-30
EP0559714A1 (en) 1993-09-15
WO1992009966A1 (en) 1992-06-11
AU9015891A (en) 1992-06-25
EP0559708A1 (en) 1993-09-15
JPH06503663A (en) 1994-04-21
EP0585298A1 (en) 1994-03-09
US5611036A (en) 1997-03-11

Similar Documents

Publication Publication Date Title
WO1992021095A1 (en) Animation
US5598182A (en) Image synthesis and processing
Burtnyk et al. Interactive skeleton techniques for enhancing motion dynamics in key frame animation
EP0950988B1 (en) Three-Dimensional image generating apparatus
Fekete et al. TicTacToon: A paperless system for professional 2D animation
Reeves Inbetweening for computer animation utilizing moving point constraints
US5619628A (en) 3-Dimensional animation generating apparatus
JP3862759B2 (en) Computer system and process for defining and producing images using structured objects with variable edge characteristics
US5729704A (en) User-directed method for operating on an object-based model data structure through a second contextual image
US5652851A (en) User interface technique for producing a second image in the spatial context of a first image using a model-based operation
US6208360B1 (en) Method and apparatus for graffiti animation
US5680531A (en) Animation system which employs scattered data interpolation and discontinuities for limiting interpolation ranges
US7420574B2 (en) Shape morphing control and manipulation
Miranda et al. Sketch express: A sketching interface for facial animation
US8358311B1 (en) Interpolation between model poses using inverse kinematics
GB2258790A (en) Animation
JP2000149046A (en) Curve generation device and method, recording medium storing program and corresponding point setting method
Akeo et al. Computer Graphics System for Reproducing Three‐Dimensional Shape from Idea Sketch
Durand The “TOON” project: requirements for a computerized 2D animation system
US8228335B1 (en) Snapsheet animation visualization
JP3002972B2 (en) 3D image processing device
JPH0696186A (en) Editor for graphic changing by time/attribute value
Fei et al. 3d animation creation using space canvases for free-hand drawing
GB2256118A (en) Image synthesis and processing
GB2277856A (en) Computer generating animated sequence of pictures

Legal Events

Date Code Title Description
AK Designated states

Kind code of ref document: A1

Designated state(s): AT AU BB BG BR CA CH CS DE DK ES FI GB HU JP KP KR LK LU MG MN MW NL NO PL RO RU SD SE US

AL Designated countries for regional patents

Kind code of ref document: A1

Designated state(s): AT BE BF BJ CF CG CH CI CM DE DK ES FR GA GB GN GR IT LU MC ML MR NL SE SN TD TG

DFPE Request for preliminary examination filed prior to expiration of 19th month from priority date (pct application filed before 20040101)
WWE Wipo information: entry into national phase

Ref document number: 1992910474

Country of ref document: EP

ENP Entry into the national phase

Ref country code: US

Ref document number: 1994 142417

Date of ref document: 19940124

Kind code of ref document: A

Format of ref document f/p: F

WWP Wipo information: published in national office

Ref document number: 1992910474

Country of ref document: EP

REG Reference to national code

Ref country code: DE

Ref legal event code: 8642

NENP Non-entry into the national phase

Ref country code: CA

WWW Wipo information: withdrawn in national office

Ref document number: 1992910474

Country of ref document: EP