This is adapted from SGI's proposal.
Mitra <mitra@mitra.biz>, 4 Dec 95.
General notes: Given a "thick enough" API for Script nodes, Sensors could all be implemented as prototyped Script nodes. Although it may make sense to define such a rich API eventually, in the short term we believe it makes more sense to build that functionality into pre-defined classes.
To allow Sensors to be composed in the VRML file separately from the object they are monitoring, they each have an eventOn field which can reference a target object to be monitored. If this field is left as the default, then it refers to the Separator that the sensor is a child of, with the effect that:
Separator {
    ClickSensor { ... }
    Cube { ... }
    Cone { ... }
}

will generate events whenever the Cube or Cone is clicked on. The reason for this is to allow ClickSensors to be added later by behaviors without requiring the scene graph to be edited.
When a sensor is triggered, it generates an event to its actionObject, which defaults to the object it is attached to. This event contains a pointer back to the sensor, which the script can access (via the API) to query particular values. This has the big advantage of leaving it up to the script how much information it needs, or whether to ignore the event entirely; it also avoids having to route all the "eventOut" fields of a sensor to a LogicNode field. The actionMethod field can be used to call a particular event on a destination ScriptNode. The actionParameters field can be used by the destination ScriptNode to distinguish between sending sensors, or to pass parameters to scripts.
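As an illustrative sketch of these routing fields (the DEF names and the script's "Open" method are hypothetical, not part of this proposal):

```vrml
DEF DOOR Separator {
    Cube { width 1 height 2 depth 0.1 }   # geometry to be monitored
}
DEF OPENER ScriptNode {
    # hypothetical script exposing an "Open" method
}
ClickSensor {
    eventOn      USE DOOR     # monitor the DOOR geometry rather than this sensor's parent
    actionObject USE OPENER   # deliver the event to the OPENER script
    actionMethod "Open"       # method invoked on the script when the sensor fires
}
```

The OPENER script can then query the sensor through the API for whatever values it needs, rather than having every eventOut routed explicitly.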
Each sensor has a set of eventFlags; these control how events are processed (see the "HANDLED" and "IGNOREHANDLED" flags below).
Proximity sensors are nodes that generate events when the viewpoint enters, exits, and moves inside a space. A proximity sensor can be activated or deactivated by sending it an "enable" event with a value of TRUE or FALSE. enter and exit events are generated when the viewpoint enters or exits the region and contain the time of entry/exit (ideally, implementations will interpolate viewpoint positions and compute exactly when the viewpoint first intersected the volume). As the viewpoint moves inside the region, position and orientation events are generated that report the position and orientation of the viewpoint in the local coordinate system of the proximity sensor.
There are two types of proximity sensors, BoxProximitySensor and PointProximitySensor, differing only in the shape of the region that they detect.
Issue: Providing position and orientation when the user is outside the region would kill scalability and composability, so those should NOT be provided (authors can create proximity sensors that enclose the entire world if they want to track the viewpoint wherever it is in the world). Position and orientation at the time the user entered/exited the region might also be useful, but I'm not convinced they're useful enough to add (and besides, you could write a Logic node with internal state that figured these out...).
The BoxProximitySensor node reports when the camera enters and leaves the volume defined by the fields center and size (an object-space axis-aligned box).
A BoxProximitySensor that surrounds the entire world will have an enter time equal to the time that the world was entered, and can be used to start up animations or behaviors as soon as a world is loaded.
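For example (the size and script name here are illustrative assumptions), a sensor large enough to enclose the whole world can trigger a startup script on entry:

```vrml
BoxProximitySensor {
    center       0 0 0
    size         10000 10000 10000    # assumed large enough to enclose the entire world
    actionObject USE STARTUP-SCRIPT   # hypothetical script run when the world is entered
}
```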
FILE FORMAT/DEFAULTS

    BoxProximitySensor {
        center           0 0 0         # SFVec3f
        size             0 0 0         # SFVec3f
        enabled          TRUE          # SFBool
        eventOn          Parent        # SFNode
        actionObject     Parent        # SFNode
        actionMethod     "Proximity"   # SFString
        actionParameters SFNode Self   # MFFields
        # eventIn  SFBool     setEnabled
        # eventOut SFTime     enter
        # eventOut SFTime     exit
        # eventOut SFVec3f    position
        # eventOut SFRotation orientation
    }
The PointProximitySensor reports when the camera enters and leaves the sphere defined by the fields center and radius.
FILE FORMAT/DEFAULTS

    PointProximitySensor {
        center           0 0 0         # SFVec3f
        radius           0             # SFFloat
        enabled          TRUE          # SFBool
        eventOn          Parent        # SFNode
        actionObject     Parent        # SFNode
        actionMethod     "Proximity"   # SFString
        actionParameters SFNode Self   # MFFields
        # eventIn  SFBool     setEnabled
        # eventOut SFTime     enter
        # eventOut SFTime     exit
        # eventOut SFVec3f    position
        # eventOut SFRotation orientation
    }
The CollisionSensor node reports when the camera collides with the object specified by its eventOn field. It complements the CollideStyle node, which controls at a coarse level whether the user can enter the object being collided with.
FILE FORMAT/DEFAULTS

    CollisionSensor {
        enabled         TRUE          # SFBool
        eventOn         Parent        # SFNode
        actionObject    Parent        # SFNode
        actionMethod    "Proximity"   # SFString
        actionParameter ""            # SFString
        # eventIn  SFBool     setEnabled
        # eventOut SFTime     enter
        # eventOut SFTime     exit
        # eventOut SFVec3f    position
        # eventOut SFRotation orientation
    }
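A sketch of composing a CollisionSensor with existing geometry (the DEF names are hypothetical):

```vrml
DEF WALL Separator {
    Cube { width 10 height 3 depth 0.2 }
}
CollisionSensor {
    eventOn      USE WALL         # monitor collisions with the WALL geometry
    actionObject USE BUMP-SCRIPT  # hypothetical script notified when the camera hits the wall
}
```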
A PointingDeviceSensor is a node which tracks the pointing device with respect to its child geometry. A PointingDeviceSensor can be made active or inactive by being sent enable events. There are two types of PointingDeviceSensors: ClickSensors and DragSensors.
moveChildren controls whether the children's geometry should be moved as the pointer moves.
Open issue: It may be useful to add an SFBitField that controls which of the events are sent.
Events are first processed by looking at a set of queues of Sensors. The first place to check is the "GlobalEvents" pseudo-object, which allows for modal situations such as "the next place clicked is the destination". Then the object on which the event occurred is checked (e.g. the hand), then that object's parent in the scene graph, and so on up the hierarchy.
Finer control over the event is provided through two flags, "HANDLED" and "IGNOREHANDLED". If a Sensor is marked "HANDLED", then once an event has been handled by this Sensor, the event is marked as "HANDLED" as it goes up the hierarchy. If a Sensor is marked "IGNOREHANDLED", then an event that has already been "HANDLED" will be ignored by it.
The ClickSensor generates events as the pointing device passes over its child geometry. When the pointing device is over the sensed object, it will also generate button press and release events for the button associated with the pointing device. Typically, the pointing device is a mouse and the button is a mouse button.
An enter event is generated when the pointing device passes over any of the shape nodes contained underneath the ClickSensor and contains the time at which the event occurred. Likewise, an exit event is generated when the pointing device is no longer over the sensed object. isOver events are generated when enter/exit events are generated; an isOver event with a TRUE value is generated at the same time as enter events, and an isOver FALSE event is generated with exit events.
Issue: Should we say anything about what happens if the cursor stays still but the geometry moves out from underneath the ClickSensor? If we do say something, we should probably be conservative and only require enter/exit events when the pointing device moves...
Issue: enter/exit is for locate-highlighting (changing color or shape when the cursor passes over an object to indicate that it may be picked). Is that too much to ask of implementations?
If the user presses the button associated with the pointing device while the cursor is located over its children, the ClickSensor will grab all further motion events from the pointing device until the button is released (other Click or Drag sensors will not generate events during this time). A press event is generated when the button is pressed over the ClickSensor's children, followed by a release event when it is released. isActive TRUE/FALSE events are generated along with the press/release events. Motion of the pointing device while it has been grabbed by a ClickSensor is referred to as a "drag".
As the user drags the cursor over the ClickSensor's child geometry, the point on that geometry which lies directly underneath the cursor is determined. When isOver and isActive are TRUE, hitPoint, hitNormal, and hitTexture events are generated whenever the pointing device moves. hitPoint events contain the 3D point on the surface of the underlying geometry, given in the ClickSensor's coordinate system. hitNormal events contain the surface normal at the hitPoint. hitTexture events contain the texture coordinates of that surface at the hitPoint, which can be used to support the 3D equivalent of an image map.
FILE FORMAT/DEFAULTS

    ClickSensor {
        enabled          TRUE          # SFBool
        moveChildren     FALSE         # SFBool
        eventOn          Parent        # SFNode
        actionObject     Parent        # SFNode
        actionMethod     "Click"       # SFString
        actionParameters SFNode Self   # MFFields
        # eventIn  SFBool   setEnabled
        # eventOut SFTime   enter
        # eventOut SFTime   exit
        # eventOut SFBool   isOver
        # eventOut SFTime   press
        # eventOut SFTime   release
        # eventOut SFBool   isActive
        # eventOut SFVec3f  hitPoint
        # eventOut SFVec3f  hitNormal
        # eventOut SFVec2f  hitTexture
    }
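A minimal sketch of the common case, where the sensor monitors its sibling geometry under the same Separator (the script name is hypothetical):

```vrml
Separator {
    ClickSensor {
        actionObject USE BUTTON-SCRIPT   # hypothetical script receiving the "Click" method calls
    }
    Cube { width 1 height 1 depth 1 }    # clicking this geometry triggers the sensor
}
```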
A DragSensor tracks pointing and clicking over its child geometry just like the ClickSensor; however, DragSensors track dragging in a manner suitable for continuous controllers such as sliders, knobs, and levers. When the pointing device is pressed and dragged over the node's child geometry, the pointing device's position is mapped onto idealized 3D geometry.
DragSensors extend the ClickSensor's interface; enabled, enter, exit, isOver, press, release and isActive are implemented identically. hitPoint, hitNormal, and hitTexture events are only updated upon the initial click down on the DragSensor's child geometry. There are five types of DragSensors: LineSensor and PlaneSensor support translation-oriented interfaces, while DiscSensor, CylinderSensor and SphereSensor establish rotation-oriented interfaces.
The LineSensor maps dragging motion into a translation in one dimension, along the x axis of its local space. It could be used to implement the 3-dimensional equivalent of a 2D slider or scrollbar.
FILE FORMAT/DEFAULTS

    LineSensor {
        minPosition      0             # SFFloat
        maxPosition      0             # SFFloat
        enabled          TRUE          # SFBool
        moveChildren     FALSE         # SFBool
        eventOn          Parent        # SFNode
        actionObject     Parent        # SFNode
        actionMethod     "Line"        # SFString
        actionParameters SFNode Self   # MFFields
        # eventIn  SFBool   setEnabled
        # eventOut SFTime   enter
        # eventOut SFTime   exit
        # eventOut SFBool   isOver
        # eventOut SFTime   press
        # eventOut SFTime   release
        # eventOut SFBool   isActive
        # eventOut SFVec3f  hitPoint
        # eventOut SFVec3f  hitNormal
        # eventOut SFVec2f  hitTexture
        # eventOut SFVec3f  trackPoint
        # eventOut SFVec3f  translation
    }
minPosition and maxPosition may be set to clamp translation events to a range of values as measured from the origin of the x axis. If minPosition is greater than or equal to maxPosition (as in the defaults), translation events are not clamped. trackPoint events provide the unclamped drag position along the x axis.
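As a sketch (range values illustrative), a slider handle whose translation events are clamped to 0..10 along the x axis:

```vrml
Separator {
    LineSensor {
        minPosition 0    # minPosition < maxPosition, so translation is clamped to 0..10
        maxPosition 10
    }
    Cube { width 1 height 0.5 depth 0.5 }   # the draggable handle geometry
}
```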
The PlaneSensor maps dragging motion into a translation in two dimensions, in the x-y plane of its local space.
FILE FORMAT/DEFAULTS

    PlaneSensor {
        minPosition      0 0           # SFVec2f
        maxPosition      0 0           # SFVec2f
        enabled          TRUE          # SFBool
        moveChildren     FALSE         # SFBool
        eventOn          Parent        # SFNode
        actionObject     Parent        # SFNode
        actionMethod     "Plane"       # SFString
        actionParameters SFNode Self   # MFFields
        # eventIn  SFBool   setEnabled
        # eventOut SFTime   enter
        # eventOut SFTime   exit
        # eventOut SFBool   isOver
        # eventOut SFTime   press
        # eventOut SFTime   release
        # eventOut SFBool   isActive
        # eventOut SFVec3f  hitPoint
        # eventOut SFVec3f  hitNormal
        # eventOut SFVec2f  hitTexture
        # eventOut SFVec3f  trackPoint
        # eventOut SFVec3f  translation
    }
minPosition and maxPosition may be set to clamp translation events to a range of values as measured from the origin of the x-y plane. If the x or y component of minPosition is greater than or equal to the corresponding component of maxPosition, translation events are not clamped in that dimension. trackPoint events provide the unclamped drag position in the x-y plane.
The DiscSensor maps dragging motion into a rotation around the z axis of its local space. The feel of the rotation is as if you were 'scratching' on a record turntable.
FILE FORMAT/DEFAULTS

    DiscSensor {
        minAngle         0             # SFFloat (radians)
        maxAngle         0             # SFFloat (radians)
        enabled          TRUE          # SFBool
        moveChildren     FALSE         # SFBool
        eventOn          Parent        # SFNode
        actionObject     Parent        # SFNode
        actionMethod     "Disk"        # SFString
        actionParameters SFNode Self   # MFFields
        # eventIn  SFBool     setEnabled
        # eventOut SFTime     enter
        # eventOut SFTime     exit
        # eventOut SFBool     isOver
        # eventOut SFTime     press
        # eventOut SFTime     release
        # eventOut SFBool     isActive
        # eventOut SFVec3f    hitPoint
        # eventOut SFVec3f    hitNormal
        # eventOut SFVec2f    hitTexture
        # eventOut SFVec3f    trackPoint
        # eventOut SFRotation rotation
    }
minAngle and maxAngle may be set to clamp rotation events to a range of values as measured in radians about the z axis. If minAngle is greater than or equal to maxAngle, rotation events are not clamped. trackPoint events provide the unclamped drag position in the x-y plane.
The CylinderSensor maps dragging motion into a rotation around the y axis of its local space. The feel of the rotation is as if you were turning a rolling pin.
FILE FORMAT/DEFAULTS

    CylinderSensor {
        minAngle         0             # SFFloat (radians)
        maxAngle         0             # SFFloat (radians)
        enabled          TRUE          # SFBool
        moveChildren     FALSE         # SFBool
        eventOn          Parent        # SFNode
        actionObject     Parent        # SFNode
        actionMethod     "Cylinder"    # SFString
        actionParameters SFNode Self   # MFFields
        # eventIn  SFBool     setEnabled
        # eventOut SFTime     enter
        # eventOut SFTime     exit
        # eventOut SFBool     isOver
        # eventOut SFTime     press
        # eventOut SFTime     release
        # eventOut SFBool     isActive
        # eventOut SFVec3f    hitPoint
        # eventOut SFVec3f    hitNormal
        # eventOut SFVec2f    hitTexture
        # eventOut SFVec3f    trackPoint
        # eventOut SFRotation rotation
        # eventOut SFBool     onCylinder
    }
minAngle and maxAngle may be set to clamp rotation events to a range of values as measured in radians about the y axis. If minAngle is greater than or equal to maxAngle, rotation events are not clamped.
Upon the initial click down on the CylinderSensor's child geometry, the hitPoint determines the radius of the cylinder used to map pointing device motion while dragging. trackPoint events always reflect the unclamped drag position on the surface of this cylinder, or in the plane perpendicular to the view vector if the cursor moves off of the cylinder. An onCylinder TRUE event is generated at the initial click down; thereafter, onCylinder FALSE/TRUE events are generated as the pointing device is dragged off of/onto the cylinder.
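A sketch of a knob built on a CylinderSensor, clamped to half a turn about the y axis (the angle range and geometry are illustrative):

```vrml
Separator {
    CylinderSensor {
        minAngle 0          # minAngle < maxAngle, so rotation is clamped to 0..pi radians
        maxAngle 3.14159
    }
    Cylinder { radius 0.5 height 0.2 }   # the knob geometry being dragged
}
```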
The SphereSensor maps dragging motion into a free rotation about its center. The feel of the rotation is as if you were rolling a ball.
FILE FORMAT/DEFAULTS

    SphereSensor {
        enabled          TRUE          # SFBool
        moveChildren     FALSE         # SFBool
        eventOn          Parent        # SFNode
        actionObject     Parent        # SFNode
        actionMethod     "Sphere"      # SFString
        actionParameters SFNode Self   # MFFields
        # eventIn  SFBool     setEnabled
        # eventOut SFTime     enter
        # eventOut SFTime     exit
        # eventOut SFBool     isOver
        # eventOut SFTime     press
        # eventOut SFTime     release
        # eventOut SFBool     isActive
        # eventOut SFVec3f    hitPoint
        # eventOut SFVec3f    hitNormal
        # eventOut SFVec2f    hitTexture
        # eventOut SFVec3f    trackPoint
        # eventOut SFRotation rotation
        # eventOut SFBool     onSphere
    }
The free rotation of the SphereSensor is always unclamped.
Upon the initial click down on the SphereSensor's child geometry, the hitPoint determines the radius of the sphere used to map pointing device motion while dragging. trackPoint events always reflect the unclamped drag position on the surface of this sphere, or in the plane perpendicular to the view vector if the cursor moves off of the sphere. An onSphere TRUE event is generated at the initial click down; thereafter, onSphere FALSE/TRUE events are generated as the pointing device is dragged off of/onto the sphere.
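A trackball-style sketch: the SphereSensor's rotation events are delivered via actionObject to a script that spins the model (the script name is hypothetical):

```vrml
Separator {
    SphereSensor {
        actionObject USE SPIN-SCRIPT   # hypothetical script applying rotation events to the model
    }
    Sphere { radius 1 }                # dragging on this sphere generates free rotation
}
```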