Data Property: enabled ('X3DSensorNode field enabled is implemented by multiple nodes.')
Annotations (12)
- rdfs:label "Enables/disables node operation."
- rdfs:label "Enables/disables the sensor node."
- rdfs:label "X3DNBodyCollidableNode field enabled is implemented by multiple nodes."
- rdfs:label "X3DNBodyCollisionSpaceNode field enabled is implemented by multiple nodes."
- rdfs:label "X3DParticlePhysicsModelNode field enabled is implemented by multiple nodes."
- rdfs:label "X3DSensorNode field enabled is implemented by multiple nodes."
- rdfs:label "X3DSoundNode field enabled is implemented by multiple nodes."
- rdfs:label "X3DSoundProcessingNode field enabled is implemented by multiple nodes."
- rdfs:label "X3DSoundSourceNode field enabled is implemented by multiple nodes."
- rdfs:label "X3DVolumeRenderStyleNode field enabled is implemented by multiple nodes."
- enabledDefault "false"^^xsd:boolean
- enabledDefault "true"^^xsd:boolean
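
The two enabledDefault annotations indicate that the default value of enabled varies by node type: most nodes default to true, a few to false. As a rough illustration only, a minimal Turtle sketch of how such a declaration might look is shown below; the x3do: prefix, the enabledDefault annotation property name, and the label selection are assumptions drawn from this listing, not verbatim ontology source.

```turtle
@prefix owl:  <http://www.w3.org/2002/07/owl#> .
@prefix rdfs: <http://www.w3.org/2000/01/rdf-schema#> .
@prefix xsd:  <http://www.w3.org/2001/XMLSchema#> .
@prefix x3do: <https://example.org/X3dOntology#> .   # hypothetical prefix for illustration

# Sketch: the 'enabled' data property with representative labels and a default-value annotation.
x3do:enabled
    a owl:DatatypeProperty ;
    rdfs:label "Enables/disables node operation." ;
    rdfs:label "X3DSensorNode field enabled is implemented by multiple nodes." ;
    rdfs:range xsd:boolean ;
    x3do:enabledDefault true .   # assumed annotation property; most node types enable by default
```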

Domains (87)
- 'AcousticProperties specifies the interaction of sound waves with characteristics of geometric objects in the scene.'
- 'All fields fully supported except shadows supported with at least Phong shading at level 3. All fields fully supported with at least Phong shading and Henyey-Greenstein phase function, shadows fully supported at level 4.'
- 'An HAnimMotion node supports discrete frame-by-frame playback for H-Anim motion data animation.'
- 'Analyser provides real-time frequency and time-domain analysis information, without any change to the input.'
- 'AudioClip provides audio data used by parent Sound nodes.'
- 'AudioDestination node represents the final audio destination and is what the user ultimately hears, typically from the speakers of the user device.'
- 'Base type for all drag-style pointing device sensors.'
- 'Base type for all pointing device sensors.'
- 'Base type for all sensor node types that operate using key devices.'
- 'Base type for all sensors.'
- 'Base type for all sound destination nodes, which represent the final destination of an audio signal and are what the user can ultimately hear.'
- 'Base type for all sound destination nodes, which represent the final destination of an audio signal and are what the user can ultimately hear.'
- 'Base type for all sound nodes.'
- 'Base type for all sound processing nodes, which are used to enhance audio with filtering, delaying, changing gain, etc.'
- 'Base type for all touch-style pointing device sensors.'
- 'Base type for the environmental sensor nodes ProximitySensor, TransformSensor and VisibilitySensor.'
- 'Base type for all sensors that generate events based on network activity.'
- 'BiquadFilter node is an AudioNode processor implementing common low-order filters.'
- 'BlendedVolumeStyle combines rendering of two voxel data sets into one by blending voxel values.'
- 'BoundaryEnhancementVolumeStyle provides boundary enhancement for the volume rendering style.'
- 'BoundedPhysicsModel provides user-defined geometrical boundaries for particle motion.'
- 'BufferAudioSource node represents a memory-resident audio asset that can contain one or more channels.'
- 'CartoonVolumeStyle generates cartoon-style non-photorealistic rendering of associated volumetric data.'
- 'ChannelMerger unites different input channels into a single output channel.'
- 'ChannelSelector selects a single channel output from all input channels.'
- 'ChannelSplitter separates the different channels of a single audio source into a set of monophonic output channels.'
- 'ClipPlane specifies a single plane equation used to clip (i.e. cull or hide) displayed geometry.'
- 'CollidableOffset repositions geometry relative to center of owning body.'
- 'CollidableShape connects the collision detection system, the rigid body model, and the renderable scene graph.'
- 'Collision detects camera-to-object contact using current view and NavigationInfo avatarSize.'
- 'CollisionCollection holds a collection of objects that can be managed as a single entity for resolution of inter-object collisions.'
- 'CollisionSensor generates collision-detection events.'
- 'CollisionSpace holds collection of objects considered together for resolution of inter-object collisions.'
- 'ComposedVolumeStyle allows compositing multiple rendering styles into single rendering pass.'
- 'Convolver performs a linear convolution on a given AudioBuffer, often used to achieve a reverberation effect.'
- 'CylinderSensor converts pointer motion (for example, a mouse or wand) into rotation values using an invisible cylinder aligned with local Y-axis.'
- 'Delay causes a time delay between the arrival of input data and subsequent propagation to the output.'
- 'DynamicsCompressor node implements a dynamics compression effect, lowering volume of loudest parts of signal and raising volume of softest parts.'
- 'EdgeEnhancementVolumeStyle specifies edge enhancement for the volume rendering style.'
- 'EspduTransform is a networked Transform node that can contain most nodes.'
- 'ForcePhysicsModel applies a constant force value to the particles.'
- 'GeoProximitySensor generates events when the viewer enters, exits and moves within a region of space (defined by a box).'
- 'GeoTouchSensor returns geographic coordinates for the object being selected.'
- 'If a non-uniform scale is applied to the pick sensor, correct results may require level 3 support.'
- 'KeySensor generates events as the user presses keys on the keyboard.'
- 'LinePickSensor uses one or more pickingGeometry line segments to compute intersections with pickTarget shapes.'
- 'ListenerPointSource node represents position and orientation of a person listening to virtual sound in the audio scene, and provides single or multiple sound channels as output.'
- 'LoadSensor generates events as watchList child nodes are either loaded or fail to load.'
- 'LocalFog simulates atmospheric effects by blending distant objects with fog color.'
- 'MicrophoneSource captures input from a physical microphone in the real world.'
- 'MovieTexture applies a 2D movie image to surface geometry, or provides audio for a Sound node.'
- 'Nodes implementing X3DSoundSourceNode provide signal inputs to the audio graph.'
- 'OpacityMapVolumeStyle specifies that volumetric data is rendered using opacity mapped to a transfer function texture.'
- 'OscillatorSource node represents an audio source generating a periodic waveform, providing a constant tone.'
- 'ParticleSystem specifies a complete particle system.'
- 'PeriodicWave defines a periodic waveform that can be used to shape the output of an Oscillator.'
- 'PlaneSensor converts pointing device motion into 2D translation parallel to the local Z=0 plane.'
- 'PointPickSensor tests one or more pickingGeometry points in space as lying inside the provided pickTarget geometry.'
- 'ProjectionVolumeStyle uses voxel data to directly generate output color.'
- 'ProximitySensor generates events when the viewer enters, exits and moves within a region of space (defined by a box).'
- 'ReceiverPdu is a networked Protocol Data Unit (PDU) information node that transmits the state of radio frequency (RF) receivers modeled in a simulation.'
- 'RigidBody describes a collection of shapes with a mass distribution that is affected by the physics model.'
- 'RigidBodyCollection represents a system of bodies that interact within a single physics model.'
- 'SignalPdu is a networked Protocol Data Unit (PDU) information node that communicates the transmission of voice, audio or other data modeled in a simulation.'
- 'SilhouetteEnhancementVolumeStyle specifies that volumetric data is rendered with silhouette enhancement.'
- 'SphereSensor converts pointing device motion into a spherical rotation about the origin of the local coordinate system.'
- 'StreamAudioDestination node represents the final audio destination via a media stream.'
- 'StreamAudioSource operates as an audio source whose media is received from a MediaStream obtained using the WebRTC or Media Capture and Streams APIs.'
- 'StringSensor generates events as the user presses keys on the keyboard.'
- 'The Gain node amplifies or deamplifies the input signal.'
- 'The Sound node controls the 3D spatialization of sound playback by a child AudioClip or MovieTexture node.'
- 'The SpatialSound node controls the 3D spatialization of sound playback by a child AudioClip or MovieTexture node.'
- 'The X3DComposableVolumeRenderStyleNode abstract node type is the base type for all node types that allow rendering styles to be sequentially composed together to form a single renderable output.'
- 'The X3DNBodyCollidableNode abstract node type represents objects that act as the interface between the rigid body physics, collision geometry proxy, and renderable objects in the scene graph hierarchy.'
- 'The X3DNBodyCollisionSpaceNode abstract node type represents objects that act as a self-contained spatial collection of objects that can interact through collision detection routines.'
- 'The X3DParticlePhysicsModelNode abstract type represents any node that applies a form of constraints on the particles after they have been generated.'
- 'The X3DPickSensorNode abstract node type is the base node type that represents the lowest common denominator of picking capabilities.'
- 'The X3DVolumeRenderStyleNode abstract node type is the base type for all node types that specify a specific visual rendering style to be used when rendering volume data.'
- 'TimeSensor continuously generates events as time passes.'
- 'ToneMappedVolumeStyle specifies that volumetric data is rendered with Gooch shading model of two-toned warm/cool coloring.'
- 'TouchSensor tracks location and state of the pointing device, detecting when a user points at or selects (activates) geometry.'
- 'TransformSensor generates output events when its targetObject enters, exits, and moves within a region in space (defined by a box).'
- 'TransmitterPdu is a networked Protocol Data Unit (PDU) information node that provides detailed information about a radio transmitter modeled in a simulation.'
- 'VisibilitySensor detects when user can see a specific object or region as they navigate the world.'
- 'VolumePickSensor tests picking intersections using the pickingGeometry against the pickTarget geometry volume.'
- 'WaveShaper node represents a nonlinear distorter that applies a wave-shaping distortion curve to the signal.'
- 'WindPhysicsModel applies a wind effect to the particles.'
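
Because the same enabled field appears on many node types, the property's domain spans all 87 classes listed above. One common way to express such a domain in OWL is as a class union; the sketch below assumes that modelling style (abridged to a few of the listed classes, reusing the hypothetical x3do: prefix from the earlier sketch).

```turtle
@prefix owl:  <http://www.w3.org/2002/07/owl#> .
@prefix rdfs: <http://www.w3.org/2000/01/rdf-schema#> .
@prefix x3do: <https://example.org/X3dOntology#> .   # hypothetical prefix for illustration

# Abridged sketch: the domain of 'enabled' expressed as a union of implementing node types.
x3do:enabled rdfs:domain [
    a owl:Class ;
    owl:unionOf ( x3do:X3DSensorNode
                  x3do:X3DSoundNode
                  x3do:X3DNBodyCollidableNode
                  x3do:X3DParticlePhysicsModelNode
                  x3do:X3DVolumeRenderStyleNode
                  x3do:TimeSensor
                  # ... remaining classes from the list above
                ) ] .
```

Whether the ontology actually uses a single owl:unionOf domain or per-class declarations is not shown in this listing; the union form is only one plausible reading of the 87 domain entries.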