Software applications executing on processor-based computing systems are prevalent today, and such applications touch many aspects of life. For instance, users commonly interact with applications for performing such tasks as conducting business (e.g., authoring documents, authoring presentations, performing research, purchasing stocks, etc.), planning trips (e.g., making airline, hotel, and/or rental car reservations), conducting meetings (e.g., via web conference and/or video conference applications), accessing entertainment (e.g., video gaming, etc.), and many more. Many applications are accessible by users over a communication network, such as the Internet. For instance, web pages and other types of interactive applications are commonly accessible via the Internet. To provide greater service to users and/or otherwise improve the user experience, multimedia applications are ever-increasing in popularity. As one example, Rich Internet Applications (RIAs), which are interactive multimedia applications that may run on client-side players (for example, ADOBE® FLASH® PLAYER), are very popular.
A variety of authoring tools have been developed to enable developers to author (e.g., create, modify, etc.) software applications, such as multimedia applications. For instance, a variety of programs on the market allow a developer (e.g., a web developer) to create web pages, websites, interactive applications, and the like for use by end users (e.g., visitors to websites). Examples of such authoring tools include ADOBE® DREAMWEAVER™, DIRECTOR™, FIREWORKS®, FLASH®, FLEX®, and FLEX BUILDER®, etc.
DREAMWEAVER is an Integrated Development Environment (IDE) that allows web developers to design Hypertext Markup Language (HTML) web pages in a code editor and also in a graphical design time environment. In other words, DREAMWEAVER parses tags (e.g., HTML tags) and renders an interactive simulation of the application in the design time environment. That is, DREAMWEAVER provides a design view that mimics operation of a browser, presenting the author a What You See Is What You Get (WYSIWYG) view of the web page being authored. Thus, a user may edit the application (e.g., web page) being authored by directly editing the tags or by graphically manipulating the design time representation. As the user graphically manipulates the design time representation, DREAMWEAVER changes the tags to reflect the modification. DREAMWEAVER also allows the developer to design with more than just HTML; for example, the developer may use Active Server Page (ASP) and C# from Microsoft Corporation, COLDFUSION™ Markup Language (CFML™), and the like.
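To illustrate the tag-parsing step in general terms (this is not DREAMWEAVER's actual implementation, merely a minimal sketch using Python's standard `html.parser` module), a design-time environment must first extract the tags and their attributes from the markup before it can render a simulation of the page:

```python
from html.parser import HTMLParser

class TagCollector(HTMLParser):
    """Collects the start tags of an HTML fragment, roughly analogous to
    the first step a WYSIWYG design view performs before rendering."""
    def __init__(self):
        super().__init__()
        self.tags = []

    def handle_starttag(self, tag, attrs):
        # Record each tag name together with its attributes.
        self.tags.append((tag, dict(attrs)))

page = '<p style="color:red">Hello <b>world</b></p>'
parser = TagCollector()
parser.feed(page)
print(parser.tags)
# -> [('p', {'style': 'color:red'}), ('b', {})]
```

A design view would then walk such a parsed structure and draw each element, while edits made graphically would be written back as changes to the tags.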
FLEX BUILDER and FLASH are authoring tools for creating Rich Internet Applications (RIAs), which are interactive multimedia applications that may run on client-side run-time players, for example, ADOBE FLASH PLAYER. MXML™ is the native tag-based language for FLEX BUILDER, and ACTIONSCRIPT™ is a script-based procedural language for FLASH-based RIAs. MXML is an Extensible Markup Language (XML)-based language commonly used to create RIAs, and it looks similar in some ways to HTML. A developer may write code in a text editor or FLEX BUILDER and save the MXML. Then, a FLEX SDK may be used, which has command-line tools for compiling MXML and ACTIONSCRIPT into a run-time file, such as a SWF (“Small Web Format” or “Shock Wave Flash”) file, that can be downloaded and executed on a user's machine. FLEX BUILDER is an IDE that provides a graphical interface into the FLEX SDK. Thus, according to one development technique for developing RIAs using FLEX BUILDER, a developer may write MXML tags and ACTIONSCRIPT code and save it in an MXML file. The FLEX SDK may then be used either directly or indirectly (via FLEX BUILDER) to compile the MXML and ACTIONSCRIPT into ACTIONSCRIPT bytecodes in a run-time file, such as a SWF. SWF is the native file format for the FLASH PLAYER. Thus, as one example, a generated SWF file can be run on the FLASH PLAYER on a user's machine to present the corresponding multimedia output to the user.
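For illustration only, a minimal MXML file of the general form described above might resemble the following hypothetical sketch (the label, button, and function names are invented for this example; note the HTML-like tag syntax and the embedded ACTIONSCRIPT):

```xml
<?xml version="1.0" encoding="utf-8"?>
<mx:Application xmlns:mx="http://www.adobe.com/2006/mxml">
    <mx:Script>
        <![CDATA[
            // ACTIONSCRIPT procedural code embedded among the MXML tags
            private function greet():void {
                message.text = "Hello, world";
            }
        ]]>
    </mx:Script>
    <mx:Label id="message" text="Click the button"/>
    <mx:Button label="Greet" click="greet()"/>
</mx:Application>
```

Such a file could then be compiled (e.g., with the FLEX SDK's command-line compiler) into a SWF run-time file for playback on the FLASH PLAYER.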
Web sites, RIAs, and other multimedia applications generally include one or more output presentation components, such as visual components and/or audio components. Visual components are application elements that have a visual appearance on an end-user's screen. Visual components may include text paragraphs, images, animation (e.g., vector animation, bitmap animation, movies, videos, etc.), other graphics, and the like. Visual components may also include user-interactive graphics, such as graphics for receiving user input, such as check boxes, data grids, radio buttons, and the like. The visual components of a run-time multimedia application (e.g., a SWF file) are rendered by a run-time player, such as the FLASH PLAYER, to an end-user. Similarly, audio components are audible sound (e.g., music, spoken words, sound effects, etc.) that are output by a run-time player (e.g., via speakers).
FIG. 1 shows an exemplary system 10 that illustrates a common multimedia authoring scenario of the prior art. As shown, system 10 comprises a processor-based computer 11, such as a personal computer (PC), laptop computer, server computer, workstation computer, etc. In this example, an authoring tool 12 is executing on such a computer 11 with which a user may interact to author a multimedia application, such as an RIA. Authoring tool 12 comprises computer-executable software code stored to a computer-readable medium that is readable by a processor of computer 11 and, when executed by such processor, causes computer 11 to perform the various operations described further herein for such authoring tool 12. Examples of such an authoring tool 12 include FLASH, FLEX BUILDER, DIRECTOR, a visual Java IDE, a visual content producer for Silverlight (e.g., MICROSOFT's EXPRESSION BLEND), and other applications that produce SWF content, such as Namo FreeMotion and Tivity Software XTIVITY.
Additionally, in this example, a run-time media player 13 is also executing on computer 11, which may receive and play/output a run-time multimedia application (such as one that is authored by authoring tool 12). Run-time media player 13 comprises computer-executable software code stored to a computer-readable medium that is readable by a processor of computer 11 and, when executed by such processor, causes computer 11 to perform the various operations described further herein for such run-time media player 13. Examples of such a run-time media player 13 include the FLASH PLAYER, Java, Silverlight (i.e., WPF/E), and SVG players (e.g., ADOBE SVG Viewer, the SVG player embedded in the Opera browser, etc.).
A developer may interact with authoring tool 12 to author a multimedia application. Commonly, authoring tool 12 comprises editing tools 101 with which a user may interact to author (e.g., create, modify, etc.) a multimedia application. That is, authoring tool 12 may present a user interface to a developer, which may enable the developer access to various editing tools 101. Editing tools 101 may comprise any number of tools with which a user may interact to author a multimedia application. Examples of such tools 101 that are commonly included in such authoring tools 12 as FLEX BUILDER include 3D/2D transformation tools; object creation tools for shapes such as squares, circles, stars, paths, and smart shape objects like pie charts, bubbles, etc.; tweening and/or blending tools that create computer-generated intermediate objects between two other objects; and constraint-based editing tools like flowcharts, inverse kinematics, etc. Object creation tools generally translate the user's mouse gestures into objects. For example, a rectangle tool may create a rectangle based on the points from where the user pressed the mouse button to where the user released the mouse button. Also, generic path tools are often provided that may be used to create a shape based on the mouse movements while the mouse button is depressed. There are often object manipulation tools, which may be based on mouse gestures or on entering data into a user interface (UI). For example, a user may size, move, or rotate an object on screen by clicking and dragging certain handles associated with the object on screen or by entering explicit values into text fields in the application. Animation tools are often included that generally let a user set the state of an object (e.g., color, shape, size, etc.) at two different times and generate new objects that change the object from one state to the other (e.g., tweening, morphing, etc.).
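The tweening operation described above can be sketched in general terms (a minimal illustration, not any particular tool's implementation): given an object's state at two keyframes, the tool linearly interpolates to generate the intermediate frames.

```python
def tween(start, end, steps):
    """Linearly interpolate between two object states (here, (x, y)
    positions), generating the intermediate frames an animation tool
    would insert between two keyframes."""
    frames = []
    for i in range(steps + 1):
        t = i / steps          # interpolation parameter, 0.0 .. 1.0
        x = start[0] + (end[0] - start[0]) * t
        y = start[1] + (end[1] - start[1]) * t
        frames.append((x, y))
    return frames

# Move an object from (0, 0) to (100, 50) across four in-between steps.
print(tween((0, 0), (100, 50), 4))
# -> [(0.0, 0.0), (25.0, 12.5), (50.0, 25.0), (75.0, 37.5), (100.0, 50.0)]
```

The same interpolation scheme extends to other object state, such as color channels, size, or rotation angle.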
Also, tools are often included that present different author time views to allow the user to see the resulting animation in ways other than one frame at a time. For example, “onion skinning” is a view that allows the user to see multiple frames at a time so the user can easily see if the transition from one frame to another is not smooth. Other such editing tools include a selection tool that allows drag-and-drop movement of drawn objects, and a free transform tool that allows scale, rotate, and skew transformations.
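One way an onion-skin view may be realized (a hypothetical sketch; the window size and falloff rate are invented parameters, not taken from any particular tool) is to draw the current frame fully opaque and draw neighboring frames at an opacity that fades with their distance from the current frame:

```python
def onion_skin_alphas(current, window, falloff=0.3):
    """Compute a display opacity for each frame in an onion-skin view:
    the current frame is fully opaque, and neighboring frames within
    `window` frames fade out the further they are from it."""
    alphas = {}
    for frame in range(current - window, current + window + 1):
        if frame < 0:
            continue  # no frames before the start of the animation
        distance = abs(frame - current)
        alphas[frame] = max(0.0, 1.0 - falloff * distance)
    return alphas

# Viewing frame 5 with two ghost frames on either side.
print(onion_skin_alphas(current=5, window=2))
```

Rendering each ghost frame at its computed opacity lets the author judge at a glance whether the motion between frames is smooth.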
As described above, an authored multimedia application may be compiled into a run-time file 105, such as a SWF file, that is executable by a run-time player 13. Run-time player 13 generally comprises logic, such as output generation logic 106, for processing the received run-time file 105 and generating an output presentation 107 to an end user. Such output presentation 107 may comprise visual components presented to a display and/or audio components presented to an audio output device (e.g., speakers), for example. For instance, the visual components and/or other output presentation components (e.g., audio, etc.) of the run-time file 105 may be output by the run-time player 13, as output 107, to an end user.
When authoring a multimedia application, a developer may periodically compile the multimedia application into a run-time file 105, and then run the run-time file on a run-time player 13 to evaluate the output 107 to, for instance, determine whether the output 107 is as the developer desires for the multimedia application. If changes to the output 107 are desired by the developer, the developer may edit the multimedia application in authoring tool 12, and re-compile it into another run-time file 105 for re-evaluation using run-time player 13.
Of course, such a process of generating a run-time file 105 for the application being authored in the authoring tool 12 in order to evaluate its output using a separate run-time player 13 becomes unwieldy and inefficient. Thus, many authoring tools 12, such as FLEX BUILDER and FLASH, comprise an integrated stage (or “design surface”) 102 that presents a preview of the output of an application being authored. That is, authoring tool 12 may present a user interface to a developer, wherein such user interface includes a stage 102. In general, stage 102 represents an output area that mimics the output of a run-time player 13. Thus, the output of a multimedia application being authored in authoring tool 12 may be designed on stage 102. For example, a developer may interact with stage 102 to arrange the output presentation of the multimedia application being authored, such as by arranging/editing media objects 104 (which may comprise output presentation components, such as visual and/or audio components) that are to be presented in the output of the multimedia application. And, the developer can evaluate the output of the multimedia application being authored without being required to compile the application into a run-time file 105 and employ run-time player 13 for evaluating the output. Thus, stage 102 provides a design surface with which a developer may interact to design the desired output of an application being authored.
Typically, stage 102 provides a user-interactive authoring area with which a developer may interact to author (e.g., edit) media object(s) 104 of a multimedia application under development. For instance, a developer may use editing tools 101 to perform various editing operations on media object(s) 104, wherein the result of such editing operations may be reflected in the output presentation of such media object(s) 104 on stage 102. Further, editing tools 101 may enable a developer to view the media object(s) 104 presented on stage 102 in a desired authoring format, such as a multiple frame view (e.g., an “onion-skin view”), a keyframe-only view, a 3D view, a view that shows selection decorators for selected objects or highlighted selected text, etc., or another view that modifies or supplements the mimicked run-time output presentation to some output that aids the author in reviewing and/or editing the output. In this regard, the output presented on the stage 102 may be a hybrid of the mimicked run-time output (e.g., output 107) of the multimedia application being authored and an author-friendly output. For instance, the mimicked run-time output may be altered to present a view of the corresponding mimicked run-time output in some format that is author-friendly, such as an onion-skin view, etc., as opposed to a corresponding view of the output that would be presented as output 107 during run-time. Further, in some implementations, a user may interact directly with media object(s) 104 shown on stage 102 to, for example, drag-and-drop selected media object(s) to a desired location on stage 102, rotate selected media object(s) to a desired orientation, remove/delete selected media object(s) from the stage 102, etc.
In this way, stage 102 is similar to a document area commonly presented in a word processor application, such as MICROSOFT® WORD™, in which the portions of a document being authored (e.g., text, etc. that are to be included in the resulting document) are shown in such document area, while editing tools with which a user may interact to format the document (e.g., bold, underline, highlight, change text color, etc.) are presented outside the document area, as those editing tools are not actually part of the output of the document being authored. Similarly, stage 102 generally presents a preview of the output of a multimedia application being authored. That is, stage 102 presents a preview that attempts to mimic the output 107 that would be output by a run-time player 13 for the multimedia application being authored (if the multimedia application were compiled into a run-time file 105 and executed on run-time player 13). Also, some word processor applications offer the ability to present author-friendly information in the document area, such as symbols that indicate paragraphs, spacing, and/or other formatting information that is present in the document, wherein such author-friendly information (e.g., symbols) is provided in the document area solely for authoring assistance and is not output as part of the document being authored (e.g., the symbols may not be included in a print-out of the authored document). Similarly, as mentioned above, author-friendly information may likewise be presented for an output of a multimedia application on stage 102, such as an author-friendly view (e.g., onion-skin view, etc.).
Accordingly, it is generally desirable for stage 102 of authoring tool 12 to output a fairly accurate representation of the output 107 that will be presented during run-time of the multimedia application by run-time player 13. That is, it is desirable for a developer working with authoring tool 12 to receive, via stage 102, an accurate representation of the output 107 that will be generated for the multimedia application by run-time player 13 so that the developer can better evaluate the output and author/design the multimedia application to have a desired output. Traditionally, some logic 103 is implemented in authoring tool 12 for mimicking run-time player 13 in order to generate the output presentation to stage 102. That is, logic 103 is employed for generating stage 102 in a manner that attempts to mimic closely the output (e.g., output 107) that would be presented during run-time for the multimedia application by run-time player 13. A couple of different approaches have been employed within authoring tool 12 of the prior art for mimicking run-time player 13.
According to one approach, actual core code of run-time player 13 is integrated into authoring tool 12. That is, a portion of the underlying software code of run-time player 13 may be integrated within authoring tool 12, and modified to interact with the editing tools 101 and/or other features of the authoring tool 12 in order to render media objects 104 of an application being authored to stage 102. However, such an integration of the run-time player code into the authoring tool 12 has several disadvantages. First, this integration often results in an undesirably large authoring tool 12. That is, the underlying code for implementing the authoring tool 12 may become undesirably large. Additionally, if the code of the run-time player 13 (on which the authoring tool 12 is based in this example) is modified, then the corresponding mimicking code 103 in the authoring tool 12 may need to be modified in order for the authoring tool 12 to continue to accurately present output on stage 102 consistent with the output that would be generated by run-time player 13. For instance, if bugs in the underlying code implementing run-time player 13 are corrected, if new features are added to run-time player 13, and/or if run-time player 13 is overhauled/modified in any way, particularly with regard to how the run-time player 13 generates output 107 (e.g., modifications within the underlying code implementing output generation logic 106 of run-time player 13), it may become necessary to modify the mimicking code 103 of authoring tool 12.
In another approach, mimic logic 103 is independently developed, rather than integrating underlying code of the run-time player 13 into authoring tool 12. In these instances, mimic logic 103 attempts to render an output to stage 102 that is similar to the output that would be rendered by run-time player 13 for a run-time version of the multimedia file being authored, but the mimic logic 103 may achieve its representation of the output to stage 102 in a much different way than run-time player 13 does. That is, the underlying code and operations performed by mimic logic 103 in rendering an output to stage 102 may be much different than that of run-time player 13. Of course, such mimic logic 103 is limited in the accuracy of the output presentation that it generates to stage 102 as compared with an actual, run-time output generated by run-time player 13 for a run-time version of the multimedia application being authored.
In both of the traditional mimicking techniques mentioned above, to maintain the mimic logic 103 up-to-date with a most recent version of a run-time player (e.g., to ensure an accurate reflection on stage 102 of the output presentation 107 that would be generated by the player 13), mimic logic 103 must be periodically updated. Further, the traditional mimicking techniques employed by logic 103 restrict representation of the stage's output to that of a given run-time player 13 that is being mimicked. Different versions of a run-time player and/or different target run-time players for which a multimedia application may be intended are not represented in the authoring tool 12.
In view of the above, a desire exists for an improved system and method for rendering to a stage in an authoring tool a presentation output preview of a multimedia application that is being authored.