VRML

What is VRML?
The Virtual Reality Modeling Language (VRML) is a file format for describing interactive 3D
worlds and objects. It may be used in conjunction with the World Wide Web, and it can be used to
create two- and three-dimensional representations of complex scenes such as illustrations,
product definitions, virtual reality presentations, interactive worlds, games and many others.
Why should I use it?
- VRML can represent static and animated objects and can contain hyperlinks to other
media such as sound, movies, and images. Interpreters (browsers) for VRML are widely available
for many different platforms, as well as authoring tools for the creation of VRML files.
- VRML is designed to be used on the Internet, intranets, and local client systems. It is also
intended to be a universal interchange format for integrated 3D graphics and multimedia.
- VRML may be used in a variety of application areas such as engineering and scientific
visualization, multimedia presentations, entertainment and educational titles, web pages, and
shared virtual worlds.
Play VRML files (.wrl) locally
Install GPAC
- Read a short description about the GPAC framework.
- Install the GPAC framework executable for Windows or the APK for Android devices
(alternatively, you can visit ARAF's binaries page in order to download the software).
- Download the ZIP archive with all the VRML examples and the related images and unzip it
anywhere you like. You can also find all the ARAF material in MyMultimediaWorld's repository.
1. Use the standalone player, Osmo, in order to load the .wrl files:
   - open with… or
   - drag the .wrl file over the Osmo executable icon and release, or
   - start the application and use the menu: File -> Open file (or Ctrl + O)
2. Use mp4client:
   - Open a command prompt: Windows -> 'cmd'
   - Change the path to the VRML files directory: cd 'path/to/dir'
   - Use the following command to load a file: mp4client 'filename.wrl'
   - mp4client will play the file in the same player (without GUI)
   - Enable the logs for displaying the print messages and/or error messages in the console:
     add Logs=all@warning:console@info in the [General] section of the config file, which may
     be found at C:\Users\<your_windows_username>\AppData\Roaming\GPAC\GPAC.cfg
Let’s see some VRML examples first:
Use the above instructions and play the files in the given order.
1. Ex1.sphere
2. Ex2.cylinder
3. Ex3.shelter
4. Ex4.spacecraft
5. Ex5.text
6. Ex6.spaceSystem
7. Ex7.fan
8. Ex8.interactiveBG – click the scene shapes in order to interact with the scene
9. Ex9.dungeon – navigation types (switch to 'walk' and move around)
10. Ex10.dungeonVP – multiple viewpoints, navigation set to 'walk' inside the scene. Use the
player menu to change the viewpoint and the keyboard arrows to move the avatar through the scene.
11. Ex11.rabbit
12. Ex12.argame
13. EXTRA: Regression tests (BIFS)
History
VRML was initiated because its creators felt the need for a more human-oriented interface for
information and interaction. While I don't blindly agree with the "more human oriented" part, it is
interesting from an HCI point of view to investigate this for various applications.
Late in 1993, Mark Pesce and Tony Parisi initiated the work that started VRML on its way. After a
mailing list was started to discuss the development of VRML, it was not long before a basis for VRML
was chosen in the form of SGI's Open Inventor file format. A subset was taken, with some extensions to
support networking. Although HTML was already well known and standardized, it was decided not to
create VRML as an extension of HTML, but as a completely independent language. This way HTML and
VRML could develop separately.
VRML 2.0 extends the possibilities of v1.0 by allowing properties of objects to be changed through
interaction with the user or by using scripts to animate objects in time. Multi-user support is not
directly included, but it is explicitly mentioned in the specification as a possible extension. The
specification also has a more formal tone, which indicates that VRML is accepted as a serious
extension to the Web.
(http://margo.student.utwente.nl/simon/finished/hci/chapter1.html)
Read more (history + VRML introduction):
http://www.mitra.biz/vrml/vrml2/marrin_history.htm
http://www.comp.lancs.ac.uk/~rodger/Work/index.php/WebPages/VRMLHistory
http://www.agocg.ac.uk/train/vrml2rep/part1/guide1.htm
http://www.scintillatinggraphics.com.au/VRMLmagic/VRMLInfo.html
Getting started
Basics
A VRML file is essentially a collection of objects called nodes, which can be something physical
(Sphere, Cylinder, Box, etc.) or non-physical (Viewpoints, Hyperlinks, Transforms, Sound, etc.).
Each node contains fields which hold the data of the node.
Some nodes are container nodes or grouping nodes, which can contain other nodes.
Nodes are arranged in hierarchical structures called scene graphs. Scene graphs are more than just a
collection of nodes; they also define an ordering for the nodes. A scene graph has a notion of state; for
example, nodes defined earlier in the scene can affect nodes that appear later in the world.
Open the first VRML example (1.sphere.wrl) found in the examples.zip with your preferred text editor
and read the comments in order to find out which of the defined keywords are nodes.
A comment line in VRML starts with a hash character (#). The first line of a VRML file should be the
header line #VRML V2.0 utf8, which tells the player what kind of data it has to handle. VRML indicates
the file content is a VRML description, V2.0 specifies the VRML version and utf8 indicates the character
encoding of the file.
As can be noticed in the file, a node instantiation is composed of a field name (e.g. appearance) and the
node type name (e.g. Appearance). Please also notice the layout of the file content: the hierarchy of
nodes composes a tree structure. This is possible because of the grouping nodes, which allow nesting
other nodes.
Observation 1: In order to easily follow a VRML content file, it is wise to follow the same
structure as in the examples, or something similar.
Observation 2: VRML is case-sensitive.
For a better understanding of the first example (ex1.sphere.wrl) let’s first review the definition of the
Shape node, which is the top level node of our scene:
Shape {
  exposedField SFNode appearance NULL
  exposedField SFNode geometry   NULL
}
The full list of the node definitions may be found at
http://www.web3d.org/x3d/specifications/vrml/ISO-IEC-14772-VRML97/part1/nodesRef.html
The Shape node has two exposedFields of type SFNode, appearance and geometry, both initialized to
NULL. The appearance field contains an Appearance node specifying the visual attributes to be applied
to the geometry node which is a Sphere in our example.
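Putting these pieces together, ex1.sphere.wrl is built around something like the following minimal scene (a sketch; the exact field values in the example file may differ):

```vrml
#VRML V2.0 utf8
# A Shape node with both of its SFNode fields filled in
Shape {
  appearance Appearance {
    material Material {
      diffuseColor 1 0 0   # a red surface
    }
  }
  geometry Sphere {
    radius 1
  }
}
```

Note how each field name (appearance, material, geometry) is followed by an instance of a node type (Appearance, Material, Sphere), forming the tree structure described above.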
Exercise1. Remove the geometry node and play the file. What happens? Explain.
Exercise2. Undo the changes and replace the Sphere node with a Box node. Use the node reference
documentation in order to find out the field name of the Box which allows changing the size of the
object.
Exercise3. Change the color of the Box to green. Use the diffuseColor field of the Material node to
achieve this.
Well done!
Now let's see what every keyword in the Shape definition means, and what the other predefined
keywords are, using the example line below:
exposedField SFNode geometry NULL
1. An exposedField is capable of receiving events via an eventIn to change its value(s), and
generating events via an eventOut when its value(s) change. Therefore the other two access
modes are eventIn and eventOut. An eventIn behaves as a receptor which receives events while
the eventOut is an output terminal from which the events are sent. For more on this read the
VRML97 glossary.
2. The SFNode keyword specifies a VRML node. In our example we started with a Sphere node and
replaced it with a Box node afterwards. An MFNode would have specified zero or more VRML
nodes.
a. A field type starting with S (single) specifies a single value, while a field type starting with M
(multiple) specifies an array of values, a list.
b. Another field example is SFInt32, which specifies one 32-bit integer. Other field types:
MFString, SFRotation and SFBool.
Exercise4. Specify at least 3 more field types based on the ones already mentioned.
c. A full list of fields is available at web3d field and event reference.
3. The geometry field associates a geometry node to it. Review the full list of geometry nodes.
4. The last parameter is the default value of the field, which may be a single object, a list of objects
or a Boolean. Initially there is nothing, therefore NULL is used.
Let's review what we know so far:
- We have learnt about some of the VRML nodes like Shape, Sphere, Material and Appearance.
After reading the node reference we should have basic knowledge of all the nodes provided
by VRML.
- After reading the field reference we should know what kinds of fields are supported and what
their meaning is.
- We can recognize the geometry and appearance nodes among the entire list of nodes.

In order to differentiate other types of nodes, let's review the Grouping and children nodes
section. As you will notice, most of the grouping nodes have a children field of type MFNode
which is an exposedField. Any node containing this field is a grouping node, simply because the
field can hold other node instances. In the same section you may also find which node types can
be children nodes and which cannot.
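As a sketch of how a grouping node is used (a hypothetical scene, not one of the tutorial examples), a Transform groups its children and applies a translation and a rotation to all of them at once:

```vrml
#VRML V2.0 utf8
Transform {
  translation 0 3 0          # move the whole group 3 units up the Y axis
  rotation    0 0 1 1.5708   # 90 degrees (in radians) around the Z axis
  children [
    Shape { geometry Cone { bottomRadius 1 height 2 } }
    Shape { geometry Sphere { radius 0.5 } }
  ]
}
```

Note that VRML rotations are expressed as an axis (x y z) followed by an angle in radians, not degrees.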
Exercise5.
a. Using the links provided, find the correct nodes in order to create a fresh VRML file containing a
red Cylinder with a radius of 3.0 and no bottom, and a Box of size 2 1 1. The box should be
translated by 3 on the Y axis. Hint: you may think of using a grouping node which has a translation
field in order to translate the Box.
b. Rotate both the Cylinder and the Box by 90 degrees on the Z axis as a single object. Hint: use the
rotation field of a grouping node to achieve this.
Exercise6. Open the second example, add a new image texture to the Cylinder and change its
transparency to 0.5.
Re-use with DEF and USE
An important feature of VRML is being able to reuse an object by giving it a name and making copies of
it. You can give a certain node a name with DEF and re-use it later on as in the below example:
DEF COLUMN Shape {
  appearance Appearance {...}
  geometry Cylinder {
    radius 3
    height 6
  }
}
…
# later in the file
Transform {
  translation 1 1 1
  children [ USE COLUMN ]
}
Even if you don't plan on reusing objects, it is good practice to name at least the major nodes because it
implicitly adds documentation to your file, and you never know whether you will later want to reuse an
object in bigger scenes and/or modify its contents with a script.
Try to understand the usage of DEF by examining the 4th and 5th examples.
Re-use with PROTO
Prototyping is a mechanism that allows the set of node types to be extended from within a VRML file. It
allows the encapsulation and parameterization of geometry, behaviors, or both. In other words, a
PROTO is another way of reusing a node and besides the DEF/USE functionality a PROTO also allows
creating new nodes with the desired field types which can be parameterized. A new node can be
created by instantiating the PROTO. The fields’ values of the instance are the default ones if they are not
specified when creating the instance. To conclude, PROTOs allow you to extend the language by adding
your own nodes with the desired structure and functionality.
General syntax:
PROTO ProtoName [
  field type parameter_name value
  …  # other field declarations
] {
  # object definition
}
The IS statement
The IS statement tells what eventIn, eventOut, or exposedField from the PROTO abstract interface
connects with what part of the inside of the PROTO.
PROTO MyProtoExample [
  exposedField SFVec3f objPosition 10 5 -10
] {
  Transform {
    translation IS objPosition
  }
}
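A complete, self-contained sketch of defining and instantiating a PROTO looks like this (ColoredBall, ballColor and ballRadius are names invented for this illustration; they are not part of the tutorial examples):

```vrml
#VRML V2.0 utf8
PROTO ColoredBall [
  field SFColor ballColor  1 0 0   # default: red
  field SFFloat ballRadius 1
] {
  Shape {
    appearance Appearance {
      material Material { diffuseColor IS ballColor }
    }
    geometry Sphere { radius IS ballRadius }
  }
}

ColoredBall { }   # uses the default values: red, radius 1
Transform {
  translation 3 0 0
  children [
    ColoredBall { ballColor 0 0 1 ballRadius 2 }   # blue, twice as big
  ]
}
```

Each instantiation creates an independent copy of the geometry, with the IS statements wiring the instance's field values into the PROTO body.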
Exercise7. Examine the ex.PROTO.wrl file example and try to understand it. Identify the fields’
declaration and the object definition. Explain why there are 2 groups of 2 intersected ellipses. Add
another two groups of blue ellipses with half the radius of the initial ones.
Exercise8. Create a new VRML file which has a 4x4 grid of cubes. The size of each cube is between 1 and
4 and the distance between the lines is 1. The size of the cubes should increase from top to bottom. All
the cubes of a horizontal line should have the same texture but a different one from the above/below
line. Hint: Create a PROTO displaying a line of 4 Boxes. The size field and the URL of the ImageTexture of
the Box are mandatory parameters of the PROTO.
Moving, interactive VRML
The interpolators
The interpolator nodes can be used to animate a 3D object or to move any other graphical object along
a given piecewise-linear path. The available interpolator nodes are the following:
- ColorInterpolator
- CoordinateInterpolator
- NormalInterpolator
- OrientationInterpolator
- PositionInterpolator
- ScalarInterpolator
All of them share a common set of fields and semantics. Please read the Interpolator nodes section in
order to find out how these nodes can animate your scene and when you should use them. Another way
of achieving the same result will be presented later in this tutorial.
Exercise9. Play again the spaceSystem.wrl and the fan.wrl examples. Guess what kinds of interpolators
are used to create the animation for each case. Don’t cheat! Use a ScalarInterpolator to resize one of
the planets while still gravitating.
The viewpoint
Probably one of the most important VRML nodes, it defines a specific location in the coordinate system
from which the user may visualize the scene. Viewpoint is a bindable node, therefore the topmost
Viewpoint node on the bind stack is the active one until its set_bind event is set to FALSE or another
Viewpoint's set_bind input event receives a TRUE value. A newly bound Viewpoint node is pushed onto
the top of the stack and becomes the current active Viewpoint, while the previously active node moves
down one entry. The cycle continues each time a set_bind event is received. Consider reading more
about the Viewpoint node.
Exercise10. Play the ex10.dungeonVP.wrl file and use the player’s menu (‘View’) in order to switch
through the available viewpoints. Examine the file content and find the corresponding piece of code for
each of the options. Change the type of the NavigationInfo to any other available value and play the file
again. Use the keyboard arrows to move around. Notice how the camera is moving for each of the cases.
The TimeSensor
Another important VRML node is the TimeSensor which is a time-dependent node.
A TimeSensor node generates cyclic events that may be used to create loops, to control various
activities or to initialize or trigger another event.
Because of its importance, please read the whole description of the TimeSensor node and try to
understand how it works.
Did you get the idea? A TimeSensor is like a stopwatch: there is a start time, a stop time and a cycle.
The length of the cycle can also be set by the user. A TimeSensor becomes active when the startTime is
reached and it begins to cycle (if loop == TRUE) until the global time of the scene reaches the stopTime
of the sensor.
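In VRML terms, the stopwatch above looks like this (the field values are only illustrative):

```vrml
DEF TIMER TimeSensor {
  startTime     0     # becomes active when the scene time reaches this value
  stopTime      -1    # a stopTime <= startTime is ignored, so it never stops
  cycleInterval 5     # one cycle lasts 5 seconds
  loop          TRUE  # keep cycling instead of running only once
}
# While active, its fraction_changed eventOut emits values in [0, 1]
# over each cycle, which is what usually drives the interpolators.
```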
Read about the other sensor nodes before going further.
In order for the time sensors to be useful their output events have to be somehow used and propagated
throughout the nodes of the scene. One way of achieving this is by using the ROUTE statement.
The ROUTE statement
In order to connect a node generating a message (event) with another node receiving a message a
ROUTE statement can be used. A node that produces events of a given type can be routed to a node
that receives events of the same type with the following syntax:
ROUTE NodeName.eventOutName TO NodeName.eventInName
Observation: you can only connect fields of nodes that are of exactly the same type; otherwise you will
have to write a script in between, as you will learn later.
Remember what an event is. We said there are two kinds of events, eventIn and eventOut, plus another
type which is basically both an input and an output event at the same time, the exposedField.
Open the sixth exercise (ex6.spaceSystem.wrl) and try to understand how it works. Notice the
TimeSensors and the ROUTEs at the end of the file.
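The general pattern used there is a TimeSensor driving an interpolator, which in turn drives a Transform. A simplified, hypothetical version (the node names are invented here, not taken from the example file) looks like this:

```vrml
#VRML V2.0 utf8
DEF PLANET Transform {
  children [ Shape { geometry Sphere { radius 1 } } ]
}
DEF TIMER TimeSensor { cycleInterval 10 loop TRUE }
DEF SPIN OrientationInterpolator {
  key      [ 0, 0.5, 1 ]
  keyValue [ 0 1 0 0,  0 1 0 3.1416,  0 1 0 6.2832 ]  # a full turn around Y
}
# the sensor's fraction in [0,1] selects a rotation along the key path
ROUTE TIMER.fraction_changed TO SPIN.set_fraction
ROUTE SPIN.value_changed     TO PLANET.set_rotation
```

Both ROUTEs connect events of matching types: SFFloat to SFFloat, SFRotation to SFRotation.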
Exercise10. Identify the input and the output events and the nodes used to create the moving scene.
Explain the behavior. Change the trajectory and the speed of the planets.
Interaction
If you read about the other sensor nodes then you must have noticed the TouchSensor, which is as
important a sensor node as the TimeSensor. Please read the TouchSensor description again.
Usually the events are triggered by sensors. The sensor nodes you read about allow you to detect what
happens and when. Play the ex8.interactiveBG.wrl file and click the geometric shapes inside the scene.
As you'll notice, the background of the scene changes when one of the shapes is clicked. In other
words, the click event of the TouchSensor attached to a geometric shape is routed to a Background
input event, set_bind. You should also consider reading the Background description to find out how
set_bind works and what the other available fields are.
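The mechanism just described can be sketched as follows (a simplified, hypothetical scene, not the actual content of ex8.interactiveBG.wrl):

```vrml
#VRML V2.0 utf8
DEF BG1 Background { skyColor [ 0 0 0 ] }   # bound first: black sky
DEF BG2 Background { skyColor [ 0 0 1 ] }   # blue sky, initially unbound
Transform {
  children [
    # a sensor affects the sibling geometry in the same grouping node
    DEF TOUCH TouchSensor { }
    Shape { geometry Box { size 1 1 1 } }
  ]
}
# pressing the box sends TRUE to set_bind, binding the blue background
ROUTE TOUCH.isActive TO BG2.set_bind
```

Both ends of the ROUTE carry SFBool values, so the types match as required.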
Exercise11.
- Use some other TouchSensor output event to achieve the same result.
- Add a new functionality that translates the sphere whenever the mouse moves over the box
geometry. You may think of creating a new ROUTE linking the hitPoint_changed output event
of the TouchSensor to the sphere.
- Add a TimeSensor to the scene and change the background using an output event of the
TimeSensor. Notice the events can be triggered either by user actions (click, mouse over,
mouse move) or automatically by using TimeSensor nodes.
- Try to use one of the TimeSensor's output events to switch the scene background each
second. Is it possible? Of course it is! Read the next section.
Scripting
As you may have noticed in the previous exercise, not everything can be achieved by using pure VRML
code. As you will find out while trying to create new VRML worlds, a lot of things will be just too
complex for the built-in sensors and interpolators. This is why Scripting was introduced.
While reading the node reference documentation you may have seen the Script node. This is the magic
node which can be used to program the desired behavior in your scene. The Script node provides the
missing flexibility to expand your behavior requirements almost limitlessly. The Script node allows you
to define behaviors using programming languages such as Java, JavaScript, or the more recently
proposed VRMLScript by Silicon Graphics. You should also consider reading the Scripting concept
before going further.
A Script can specify an action each time it receives an event (you use ROUTE to send an event to a
Script as with any other node). Furthermore you can define initialization and shutdown procedures. In
each action a script can generate multiple events.
As opposed to other nodes, in the Script node you can define the events the node receives and sends. A
Script node can also have fields. Fields work as persistent local variables to the script and can be used to
define an internal state.
A script can receive any number of events; the values of these events are stored in eventIn-type
variables. The names of the variables specify the names of the events the Script can receive. These
variables are read-only, i.e. you can't set a value on one of these variables, you can only read its current
value. There is also no limit to the number of events a Script can generate. The values associated with
these events are stored in eventOut-type variables. These variables, as well as fields, are persistent.
Observation: you can't define an exposedField in a Script. In order to define the equivalent of an
exposedField you'll have to define a field, an eventIn and an eventOut. For instance, the following
declaration (using SFInt32 as an example type) is equivalent to the definition of an exposed field:

field    SFInt32 myField
eventIn  SFInt32 set_myField
eventOut SFInt32 myField_changed
Further you will learn how to use JavaScript in order to add scripting functionality to your VRML scene.
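As a taste of what is coming, a Script node with a JavaScript body looks roughly like this (TOUCH and MAT are assumed to be a DEF'ed TouchSensor and Material defined elsewhere in the scene; the event and field names are invented for this sketch):

```vrml
DEF SCR Script {
  eventIn  SFTime  touched          # receives TouchSensor.touchTime
  eventOut SFColor color_changed    # routed to Material.set_diffuseColor
  field    SFBool  isRed TRUE       # persistent internal state
  url "javascript:
    function touched(value) {
      // toggle the color on every click
      if (isRed) color_changed = new SFColor(0, 0, 1);
      else       color_changed = new SFColor(1, 0, 0);
      isRed = !isRed;
    }"
}
ROUTE TOUCH.touchTime   TO SCR.touched
ROUTE SCR.color_changed TO MAT.set_diffuseColor
```

The function name matches the eventIn name, assigning to the eventOut variable sends the event, and the field keeps its value between invocations.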
Exercise12. Play ex.SCRIPT.wrl and examine the code afterwards. What is the main difference between
ex8.interactiveBG.wrl and ex.SCRIPT.wrl? Add a new Box to the existing scene; start rotating it when
the sphere is clicked and stop the rotation when the cone is clicked. Use ROUTEs. Do the same using
scripting. Highlight the advantages and disadvantages of the two implementations.
Exercise13. Explain the code snippet found in ex.SCRIPT_PIECE.wrl. Notice the touchTime event and
explain its purpose. What about the enabled event? Integrate the code into a valid VRML file and
demonstrate the script functionality.
Well done!
Right now you should have enough knowledge of VRML in order for you to start writing fairly complex
VRML code.
Of course, there is more to VRML! This tutorial provides only the most relevant information, so that you
are not obliged to learn everything about VRML but will still be able to finalize the project.
For the eager ones, there are other tutorials and online books throughout the internet. You may
find some helpful links at the end of this tutorial.
We have just ended the first part of our journey.
The second part will introduce you to BIFS, the Binary Format for Scenes.
BIFS
What is BIFS?
Binary Format for Scenes (BIFS) is a binary format for two- or three-dimensional audiovisual content. It
is based on VRML and is specified in part 11 of the MPEG-4 standard. MPEG-4 Part 11 was published as
ISO/IEC 14496-11 in 2005 and also covers XMT and MPEG-J. In short, BIFS is MPEG's standardized
scene description, based on another ISO standard, VRML97, the one you just read about.
Why BIFS instead of VRML?
It's not mandatory to start using BIFS, but it's worth presenting the advantages of using it and the extra
features which have been added on top of VRML. You will eventually find out the benefits of using BIFS
over VRML, simply because the former supports every feature of the latter while adding major
improvements. We could just say that BIFS extends VRML. BIFS builds on VRML and adds:
Additional features
- 2D Vector Graphics
- Improved Text handling
- Improved Media Management
- Binarization
- Streamability
- XML formats: XMT
- Access with Java Code (MPEG-J)
Besides other features, you may notice several nodes have been added. Here is a list of the most
important ones: Transform2D, PlaneSensor2D, Layer2D, Layer3D, Material2D. You may find all of these
and others in the BIFS standard itself, ISO_14496-11_SDAE (BIFS).pdf, attached to the tutorial. As the
document is rather dense, you will have to figure out exactly what you need to know and choose parts
of the document wisely to study; otherwise you'll probably not enjoy reading a 500-page standard.
File extension and other changes
Although the language is the same, there are a few slight differences compared to VRML. The first one,
of course, is the file extension: .bt.
As you may notice in ex2.TexturedCylinder.bt, the file does not start with the same comment line as
before, simply because the file content is not a VRML description but a BIFS one. In return, we have to
use an InitialObjectDescriptor, which should be the first node of your BT file, as you may notice in the
example file presented before. For the moment you should not worry about the InitialObjectDescriptor's
field values except pixelWidth and pixelHeight. Set these values in order to resize the application
window to your desired dimensions (e.g. 640x480).
In the same example you will find that the url value of the ImageTexture node is a number, not a
path to the media. If you look carefully, you will observe that the same number (11 in our example) is
the value of an objectDescriptorID whose fileName is set to "images/floor.jpg":
texture ImageTexture {
  url "11"
}

# later in the file…
ObjectDescriptor {
  objectDescriptorID 11
  esDescr [
    ES_Descriptor {
      ES_ID 11
      muxInfo MuxInfo {
        fileName "images/floor.jpg"
      }
    }
  ]
}
The objectDescriptorID value can be any number as long as it does not conflict with any other
objectDescriptorID within your file. It has to be unique.
You should already know that the ImageTexture is used to map an image to a geometric shape, which is
why the value of the url is normally a path to an image. The following example is also valid:
texture ImageTexture {
  url "images/floor.jpg"
}
There is a huge advantage of using the first example syntax: referencing a media file through an object
descriptor allows us to actually import the media file (e.g. image, sound, video) into the scene. Let’s see
how that is possible and what the advantage is.
Exercise14. Open the ex2.TexturedCylinder.bt and examine the content. You should be familiar with
what’s inside except the first and last nodes. Play both the .wrl and the .bt files; you should not see any
difference. Now move both of them outside the ‘example’ folder. Play them again. What do you notice?
Explain.
Encapsulation
Using the object descriptor didn’t change anything so far. As it was mentioned before, the object
descriptor allows importing the referenced media to the file. Go back to the ‘examples’ directory, open a
‘cmd’ and write the following command:
mp4box -mp4 path/to/your/TexturedCylinder.bt
Basically this command encapsulates everything that is defined in the BT file into an MP4 file. Therefore
the MP4 file acts like a container for all the graphical elements defined in the BT file and all the media
(audio, video, image, sound) referenced in the BT file through the object descriptors. Right now the
mp4box application should have created the ex2.TexturedCylinder.mp4 in the same directory with the
BT source file.
The reverse operation of mp4box -mp4 btFile.bt is mp4box -bt mp4File.mp4. This command will
generate the corresponding BT file; however, the media files will not be extracted. They will be
referenced in the new BT file using their initial objectDescriptorID as follows:
muxInfo MuxInfo {
  fileName "mp4File.mp4#initialObjectDescriptorID"
  streamFormat "MP4"
}
Exercise15. Copy the MP4 file result outside the folder and play it. What do you notice? Compare the
size of the files (.wrl, .bt, .mp4) and explain.
You should now be aware of the advantages of creating BT files and converting them to MP4 files
afterwards. Have you noticed the ex12.argame.mp4 in the beginning of this tutorial? It’s a single file
which contains multiple sounds and images as well as graphical elements. Of course, its corresponding
BT file exists too and you have to be able to create something similar on your own! 
The BIFS regression tests present a lot of examples of how to use each and every node. Don't hesitate to
examine the files and to ask questions!
It's time to move on to some real examples, therefore I suggest you download an MPEG AR Game. You
will find the MP4 game file, the BT source file and all the media associated with the game. Please make
sure you can run both the MP4 and the BT files locally in mp4client. Once this is done, let's start to
examine and understand all that is necessary for you to create your own game.
Plan:
- Overview
- File structure
- PROTOs: especially MAP, Marker, Overlay, ARView, ARObject
- Functionalities
- Questions?
VRML online books
- Introduction to VRML 97: http://www.sdsc.edu/~moreland/courses/Siggraph98/vrml97/slides/mt0000.htm
- The Annotated VRML97 Reference Manual: http://www.cs.vu.nl/~eliens/documents/vrml/reference/BOOK.HTM
- VRML Primer and Tutorial: http://tecfa.unige.ch/guides/vrml/vrmlman/vrmlman.html
- VRML Interactive Tutorial: http://www.lighthouse3d.com/vrml/tutorial/
- VRML 2.0 Specification (ISO/IEC 14772-1): http://www.web3d.org/x3d/specifications/vrml/ISO-IEC-14772-VRML97/
Other information sources
- Organisation for the VRML standard: http://www.web3d.org/
- VRML 2 Spec: http://www.web3d.org/technicalinfo/specifications/vrml97/
- Books: e.g. The VRML 2 Handbook
- BIFS Regression Tests: https://github.com/golgol7777/gpac/tree/master/regression_tests/bifs