libpxcclr.unity This interface forms the base of all SDK interface definitions. The interface overrides the class delete operator to work with the SDK dispatching mechanism, and exposes a custom (RTTI-free) dynamic casting mechanism with the QueryInstance method. The application that implements any PXCMBase derived interface must derive from one of the PXCMBaseImpl class templates. The function disposes the object instance. Certain objects are reference counted, in which case the function reduces the reference count by 1. The function checks if the object instance supports a specific interface. If so, it returns the instance of the interface. Otherwise, it returns NULL. The interface identifier. The interface instance, or NULL. The function checks if the object instance supports a specific interface. If so, it returns the instance of the interface. Otherwise, it returns null. The interface instance, or null. Query the current PXCM3DScan Configuration PXCM3DScan Configuration structure Set the PXCM3DScan configuration. The configuration object to be set. the status code Query the current PXCM3DScan Area Set the PXCM3DScan Area PXCM3DScan preview image access The preview image is a rendered approximation of the scanning volume from the perspective of the camera. A different image is available each time a frame is processed. A preview image Return the visible extent from the perspective of the most recent frame. Values are in normalized coordinates (0.0-1.0). Determine if the scan has started PXCM3DScan generation of standard mesh formats from the scanning volume. PXCM3DScan utility to convert FileFormat value to a string constructors and misc Reconstruction options Bit-OR'ed values Scanning modes Configuration structure Area structure PXCM3DScan mesh formats supported by Reconstruct AlertEvent Enumerate all supported alert events. AlertData Describe the alert parameters. Allocate and return a copy of the module's most recent segmented image.
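The RTTI-free casting and disposal pattern described above can be sketched as follows. This is a minimal sketch: `QueryInstance<T>` and `Dispose` are taken from the descriptions in the text, and `capture` is a hypothetical PXCMBase-derived instance obtained elsewhere:

```csharp
// Hedged sketch: RTTI-free downcasting via QueryInstance.
// 'capture' stands in for any PXCMBase-derived object.
PXCMBase baseObj = capture;
PXCM3DScan scan = baseObj.QueryInstance<PXCM3DScan>(); // null if unsupported
if (scan != null)
{
    // ... use the PXCM3DScan interface ...
    scan.Dispose(); // reduces the reference count on ref-counted objects
}
```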
The returned object's Release method can be used to deallocate it. constructor Subscribe with an alert callback Set a frame skip interval AlertEvent Enumerate all supported alert events. AlertData Describe the alert parameters. The interface adds a reference count to the supported object. Increase the reference counter of the underlying object. The increased reference counter value. constructors and misc @brief Return the audio sample information. @return the audio sample information in the AudioInfo structure. @brief Lock to access the internal storage of a specified format. The function will perform format conversion if unmatched. @param[in] access The access mode. @param[in] format The requested sample format. @param[in] options The option flags @param[out] data The sample data storage, to be returned. @return PXCM_STATUS_NO_ERROR Successful execution. @brief Unlock the previously acquired buffer. @param[in] data The sample data storage previously acquired. @return PXCM_STATUS_NO_ERROR Successful execution. Get the audio format string representation The Audio format enumerator. the string representation of the audio format. Return the audio sample size. the sample size in bytes Return the audio sample time stamp. the time stamp, in 100ns. Return the audio sample option flags. the option flags. Set the sample time stamp. The time stamp value, in 100ns. Copy data from another audio sample. The audio sample to copy data from. PXCM_STATUS_NO_ERROR Successful execution. Lock to access the internal storage of a specified format. The function will perform format conversion if unmatched. The access mode. The requested sample format. The sample data storage, to be returned. PXCM_STATUS_NO_ERROR Successful execution. Lock to access the internal storage of a specified format. The access mode. The sample data storage, to be returned. PXCM_STATUS_NO_ERROR Successful execution. Increase a reference count of the sample. Properties AudioData Describes the audio storage details.
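The lock/unlock access pattern for an audio sample can be sketched as below. This assumes the managed method names `AcquireAccess`/`ReleaseAccess` correspond to the Lock/Unlock operations described above; `audio` is a hypothetical PXCMAudio sample:

```csharp
// Hedged sketch of the lock/unlock audio-sample access pattern.
// AcquireAccess performs format conversion if the requested format differs.
PXCMAudio.AudioData data;
pxcmStatus sts = audio.AcquireAccess(PXCMAudio.Access.ACCESS_READ,
                                     PXCMAudio.AudioFormat.AUDIO_FORMAT_PCM,
                                     out data);
if (sts == pxcmStatus.PXCM_STATUS_NO_ERROR)
{
    // ... read the converted PCM samples from 'data' ...
    audio.ReleaseAccess(data); // unlock the previously acquired buffer
}
```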
AudioFormat Describes the audio sample format ChannelMask Describes the channels of the audio source. AudioInfo Describes the audio sample details. Option Describes the audio options. Access Describes the audio sample access mode. Scan the available audio devices. the number of audio devices. Get the number of available audio devices previously scanned. the number of audio devices. Enumerate the audio devices previously scanned. The zero-based index to enumerate all devices. The DeviceInfo structure to return the device information. PXCM_STATUS_NO_ERROR Successful execution. PXCM_STATUS_ITEM_UNAVAILABLE No more devices. Get the current working device The working device info PXCM_STATUS_NO_ERROR Successful execution. Set the audio device for subsequent module processing. The audio source PXCM_STATUS_NO_ERROR Successful execution. Get the audio device volume the volume from 0 (min) to 1 (max). Set the audio device volume The audio volume from 0 (min) to 1 (max). PXCM_STATUS_NO_ERROR Executed successfully. Constructors and Disposers. DeviceInfo This structure describes the audio device information. the constructor clears all fields. @class PXCMBlobConfiguration Retrieve the current configuration of the blob module and set new configuration values. @note Changes to PXCBlobConfiguration are applied only when ApplyChanges() is called. Apply the configuration changes to the blob module. This method must be called in order for any configuration changes to apply. PXCM_STATUS_NO_ERROR - successful operation. PXCM_STATUS_DATA_NOT_INITIALIZED - the configuration was not initialized. Restore configuration settings to the default values. PXCM_STATUS_NO_ERROR - successful operation. PXCM_STATUS_DATA_NOT_INITIALIZED - the configuration was not initialized. Retrieve the blob module's current configuration settings. PXCM_STATUS_NO_ERROR - successful operation. PXCM_STATUS_DATA_NOT_INITIALIZED - failed to retrieve current configuration.
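The scan-then-enumerate flow for audio devices can be sketched as below. Method names `ScanDevices`/`QueryDeviceNum`/`QueryDeviceInfo` are assumptions matching the descriptions above; `source` is a hypothetical PXCMAudioSource instance:

```csharp
// Hedged sketch: scan available audio devices, then enumerate them.
source.ScanDevices();
for (int i = 0; i < source.QueryDeviceNum(); i++)
{
    PXCMAudioSource.DeviceInfo dinfo;
    if (source.QueryDeviceInfo(i, out dinfo) != pxcmStatus.PXCM_STATUS_NO_ERROR)
        break; // PXCM_STATUS_ITEM_UNAVAILABLE: no more devices
    // ... inspect dinfo and select a working device ...
}
```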
Set the smoothing strength for the segmentation image. - a value between 0 (no smoothing) and 1 (very smooth). PXCM_STATUS_NO_ERROR - successful operation. PXCM_STATUS_PARAM_UNSUPPORTED - invalid smoothing value. Get the segmentation image smoothing value. The segmentation image smoothing value. Set the contour smoothing value. - a value between 0 (no smoothing) and 1 (very smooth). PXCM_STATUS_NO_ERROR - successful operation. PXCM_STATUS_PARAM_UNSUPPORTED - invalid smoothing value. Get the contour smoothing value. The contour smoothing value. Get the maximal number of blobs that can be detected. The maximal number of blobs that can be detected. Set the maximal distance in meters of a detected blob from the sensor. Only objects that are within this limit will be identified as blobs. - the maximal distance in meters of a blob from the sensor PXCM_STATUS_NO_ERROR - successful operation PXCM_STATUS_PARAM_UNSUPPORTED - invalid maxDistance value (in this case, the last valid value will be retained). Get the maximal distance in meters of a detected blob from the sensor. The maximal distance of a detected blob from the sensor. Set the maximal depth in meters of a blob (maximal distance between closest and farthest points in the blob). - the maximal depth in meters of the blob. PXCM_STATUS_NO_ERROR - successful operation. PXCM_STATUS_PARAM_UNSUPPORTED - invalid maxDepth value (in this case, the last valid value will be retained). Get the maximal depth in meters of a blob. The maximal depth in meters of a blob. Set the minimal blob size in pixels. Only objects that are larger than this size are identified as blobs. - the minimal blob size in pixels (cannot be more than a quarter of the image size in pixels). PXCM_STATUS_NO_ERROR - successful operation. PXCM_STATUS_PARAM_UNSUPPORTED - invalid minBlobSize value (in this case, the last valid value will be retained). Get the minimal blob size in pixels. The minimal blob size in pixels.
Enable extraction of the segmentation image. - set to true if the segmentation image should be extracted; otherwise set to false. PXCM_STATUS_NO_ERROR - successful operation. Return the segmentation image extraction flag. The segmentation image extraction flag. Enable extraction of the contour data. - set to true if contours should be extracted; otherwise set to false. PXCM_STATUS_NO_ERROR - successful operation. Return the contour extraction flag. The contour extraction flag. Set the minimal contour size in points. Objects with external contours that are smaller than the limit are not identified as blobs. - the minimal contour size in points. PXCM_STATUS_NO_ERROR - successful operation. PXCM_STATUS_PARAM_UNSUPPORTED - invalid minContourSize value (in this case, the last valid value will be retained). Get the minimal contour size in points. The minimal contour size in points. Enable or disable the stabilization feature. Enabling this feature produces smoother tracking of the extremity points, ignoring small shifts and "jitters". @note In some cases the tracking may be less sensitive to minor movements and some blob pixels may be outside of the extremities. - true to enable the stabilization; false to disable it. PXCM_STATUS_NO_ERROR - operation succeeded. Return blob stabilizer activation status. true if blob stabilizer is enabled, false otherwise. Set the maximal blob size in pixels. Only objects that are smaller than this size are identified as blobs. - the maximal blob size in pixels. PXCM_STATUS_NO_ERROR - successful operation. PXCM_STATUS_PARAM_UNSUPPORTED - invalid maxBlobSize value (in this case, the last valid value will be retained). Get the maximal blob size in pixels. The maximal blob size in pixels. Set the maximal blob area in square meters. Only objects that are smaller than this area are identified as blobs. - the maximal blob area in square meters. PXCM_STATUS_NO_ERROR - successful operation.
PXCM_STATUS_PARAM_UNSUPPORTED - invalid maxBlobArea value (in this case, the last valid value will be retained). Get the maximal blob area in square meters. The maximal blob area in square meters. Set the minimal blob area in square meters. Only objects that are larger than this area are identified as blobs. - the minimal blob area in square meters. PXCM_STATUS_NO_ERROR - successful operation. PXCM_STATUS_PARAM_UNSUPPORTED - invalid minBlobArea value (in this case, the last valid value will be retained). Get the minimal blob area in square meters. The minimal blob area in square meters. Set the smoothing strength for the blob extraction. - a value between 0 (no smoothing) and 1 (strong smoothing). PXCM_STATUS_NO_ERROR - successful operation. PXCM_STATUS_PARAM_UNSUPPORTED - invalid smoothing value (in this case, the last valid value will be retained). Get the segmentation blob smoothing value. The segmentation image smoothing value. Enable blob data extraction correlated to the color stream. - set to true if color mapping should be extracted; otherwise set to false. PXCM_STATUS_NO_ERROR - successful operation. Return the color mapping extraction flag. The color mapping extraction flag. @class PXCMBlobData A class that contains extracted blob and contour line data. The extracted data refers to the sensor's frame image at the time PXCBlobModule.CreateOutput() was called. Updates the extracted blob data to the latest available output. PXCM_STATUS_NO_ERROR - successful operation. PXCM_STATUS_DATA_NOT_INITIALIZED - when the BlobData is not available. Get the number of extracted blobs. The number of extracted blobs. Retrieve an IBlob object using a specific AccessOrder and index (that relates to the given order). - the zero-based index of the requested blob (between 0 and QueryNumberOfBlobs()-1 ). - the order in which the blobs are enumerated. - contains the extracted blob and contour line data. PXCM_STATUS_NO_ERROR - successful operation.
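The configure-then-apply pattern described above can be sketched as follows. `ApplyChanges` is named in the text; the setter names `SetMaxDistance`/`SetSegmentationSmoothing` are assumptions matching the descriptions, and `config` is a hypothetical PXCMBlobConfiguration instance:

```csharp
// Hedged sketch: changes take effect only when ApplyChanges() is called.
config.SetMaxDistance(0.6f);           // meters; invalid values are rejected
config.SetSegmentationSmoothing(0.5f); // 0 (no smoothing) .. 1 (very smooth)
pxcmStatus sts = config.ApplyChanges();
if (sts != pxcmStatus.PXCM_STATUS_NO_ERROR)
{
    // PXCM_STATUS_DATA_NOT_INITIALIZED: the configuration was not initialized
}
```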
PXCM_STATUS_PARAM_UNSUPPORTED - index >= configured maxBlobs. PXCM_STATUS_DATA_UNAVAILABLE - index >= number of detected blobs. Retrieve an IBlob object using a specific AccessOrder and index (that relates to the given order). - the zero-based index of the requested blob (between 0 and QueryNumberOfBlobs()-1 ). - the image type which the blob will be mapped to. To get data mapped to color see PXCBlobConfiguration::EnableColorMapping. - the order in which the blobs are enumerated. - contains the extracted blob data. PXCM_STATUS_NO_ERROR - successful operation. PXCM_STATUS_PARAM_UNSUPPORTED - index >= configured maxBlobs. PXCM_STATUS_DATA_UNAVAILABLE - index >= number of detected blobs or blob data is invalid. @class IContour An interface that provides access to the contour line data. A contour is represented by an array of 2D points, which are the vertices of the contour's polygon. Get the point array representing a contour line. - the size of the array allocated for the contour points. - the contour points stored in the user-allocated array. PXCM_STATUS_NO_ERROR - successful operation. Return true for the blob's outer contour; false for inner contours. true for the blob's outer contour; false for inner contours. Get the contour size (number of points in the contour line). This is the size of the points array that you should allocate. The contour size (number of points in the contour line). @class IBlob An interface that provides access to the blob and contour line data. Retrieves the 2D segmentation image of a tracked blob. In the segmentation image, each pixel occupied by the blob is white (value of 255) and all other pixels are black (value of 0). - the segmentation image of the tracked blob. PXCM_STATUS_NO_ERROR - successful operation. PXCM_STATUS_DATA_UNAVAILABLE - the segmentation image is not available. Get an extremity point location using a specific ExtremityType. - the extremity type to be retrieved. The extremity point location data. 
Get the number of pixels in the blob. The number of pixels in the blob. Get the number of contour lines extracted (both external and internal). The number of contour lines extracted. Get the point array representing a contour line. A contour is represented by an array of 2D points, which are the vertices of the contour's polygon. - the zero-based index of the requested contour line (the maximal value is QueryNumberOfContours()-1). - the size of the array allocated for the contour points. - the contour points stored in the user-allocated array. PXCM_STATUS_NO_ERROR - successful operation. PXCM_STATUS_PARAM_UNSUPPORTED - the given index or the user-allocated array is invalid. PXCM_STATUS_ITEM_UNAVAILABLE - processImage() is running. Return true for the blob's outer contour; false for inner contours. - the zero-based index of the requested contour. true for the blob's outer contour; false for inner contours. Get the contour size (number of points in the contour line). This is the size of the points array that you should allocate. - the zero-based index of the requested contour line. The contour size (number of points in the contour line). Retrieve an IContour object using index (that relates to the given order). - the zero-based index of the requested contour (between 0 and QueryNumberOfContours()-1 ). - contains the extracted contour line data. PXCM_STATUS_NO_ERROR - successful operation. PXCM_STATUS_DATA_UNAVAILABLE - index >= number of detected contours. Return the location and dimensions of the blob, represented by a 2D bounding box (defined in pixels). The location and dimensions of the 2D bounding box. Return the location and dimensions of the blob, represented by a 3D bounding box. The location and dimensions of the 3D bounding box. AccessOrderType Each AccessOrderType value indicates the order in which the extracted blobs can be accessed. Use one of these values when calling QueryBlobByAccessOrder(). 
From the nearest to the farthest blob in the scene From the largest to the smallest blob in the scene ExtremityType The identifier of an extremity point of the extracted blob. 6 extremity points are identified (see values below). Use one of the extremity types when calling IBlob.QueryExtremityPoint(). The closest point to the sensor in the tracked blob The left-most point of the tracked blob The right-most point of the tracked blob The top-most point of the tracked blob The bottom-most point of the tracked blob SegmentationImageType Each SegmentationImageType value indicates the extracted blobs data mapping. Use one of these values when calling QueryBlob(). The blob data mapped to the depth image @class PXCMBlobExtractor A utility for extracting mask images (blobs) of objects in front of the camera. Given a depth image, this utility will find the largest objects that are "close enough" to the camera. For each object a segmentation mask image will be created, that is, all object pixels will be "white" (255) and all other pixels will be "black" (0). The maximal number of blobs that can be extracted is 2. The order of the blobs will be from the largest to the smallest (in number of pixels) Initialize the PXCMBlobExtractor instance for a specific image type (size) definition of the images that should be processed Extract the 2D image mask of the blob in front of the camera. In the image mask, each pixel occupied by the blob is white (set to 255) and all other pixels are black (set to 0). the depth image to be segmented PXCM_STATUS_NO_ERROR if a valid depth exists and could be segmented; otherwise, return the following error: PXCM_STATUS_DATA_UNAVAILABLE - if image is not available or PXCM_STATUS_ITEM_UNAVAILABLE if processImage is running or PXCMBlobExtractor was not initialized.
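Iterating the extracted blobs with an access order can be sketched as below. `QueryNumberOfBlobs` and `QueryBlobByAccessOrder` are named in the text; the enumerator value `ACCESS_ORDER_NEAR_TO_FAR` and the accessor `QueryPixelCount` are assumptions, and `blobData` is a hypothetical PXCMBlobData instance:

```csharp
// Hedged sketch: refresh the output, then walk blobs nearest-first.
blobData.Update(); // update to the latest available output
int n = blobData.QueryNumberOfBlobs();
for (int i = 0; i < n; i++)
{
    PXCMBlobData.IBlob blob;
    if (blobData.QueryBlobByAccessOrder(i,
            PXCMBlobData.AccessOrderType.ACCESS_ORDER_NEAR_TO_FAR,
            out blob) == pxcmStatus.PXCM_STATUS_NO_ERROR)
    {
        int pixels = blob.QueryPixelCount(); // assumed accessor name
        // ... query extremity points, contours, bounding boxes ...
    }
}
```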
Retrieve the 2D image mask of the blob and its associated blob data The blobs are ordered from the largest to the smallest (in number of pixels) the zero-based index of the requested blob (has to be between 0 and the number of blobs) the 2D image mask of the requested blob the data of the requested blob PXCM_STATUS_NO_ERROR if index is valid and processImage terminated successfully; otherwise, return the following error: PXCM_STATUS_PARAM_UNSUPPORTED if index is invalid or PXCM_STATUS_ITEM_UNAVAILABLE if processImage is running or PXCMBlobExtractor was not initialized. Set the maximal number of blobs that can be detected The default number of blobs that will be detected is 1 the maximal number of blobs that can be detected (limited to 2) PXCM_STATUS_NO_ERROR if maxBlobs is valid; otherwise, return the following error: PXCM_STATUS_PARAM_UNSUPPORTED, maxBlobs will remain the last valid value Get the maximal number of blobs that can be detected the maximal number of blobs that can be detected Get the number of detected blobs the number of detected blobs Set the maximal distance limit from the camera. Blobs will be objects that appear between the camera and the maxDistance limit.
the maximal distance from the camera (has to be a positive value) PXCM_STATUS_NO_ERROR if maxDistance is valid; otherwise, return the following error: PXCM_STATUS_PARAM_UNSUPPORTED, maxDistance will remain the last valid value Get the maximal distance from the camera, in which an object can be detected and segmented maximal distance from the camera Set the maximal depth of a blob (maximal distance between closest and furthest points on the blob) the maximal depth of the blob (has to be a positive value) PXCM_STATUS_NO_ERROR if maxDepth is valid; otherwise, return the following error: PXCM_STATUS_PARAM_UNSUPPORTED, maxDepth will remain the last valid value Get the maximal depth of the blob that can be detected and segmented maximal depth of the blob Set the smoothing level of the shape of the blob The smoothing level ranges from 0 to 1, where 0 means no smoothing and 1 implies a very smooth contour the smoothing level PXCM_STATUS_NO_ERROR if smoothing is valid; otherwise, return the following error: PXCM_STATUS_PARAM_UNSUPPORTED, smoothing level will remain the last valid value Get the smoothing level of the blob (0-1) smoothing level of the blob Set the minimal blob size in pixels Any blob that is smaller than the threshold will be cleared during "ProcessImage". the minimal blob size in pixels (cannot be more than a quarter of the image size) PXCM_STATUS_NO_ERROR if minBlobSize is valid; otherwise, return the following error: PXCM_STATUS_PARAM_UNSUPPORTED, minimal blob size will remain the last valid size Get the minimal blob size in pixels minimal blob size BlobData Contains the parameters that define a blob Image coordinates of the closest point in the blob Image coordinates of the left-most point in the blob Image coordinates of the right-most point in the blob Image coordinates of the top point in the blob Image coordinates of the bottom point in the blob Image coordinates of the center of the blob @class PXCMBlobModule The main interface to the blob module's classes.
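The PXCMBlobExtractor flow above (initialize, process a depth image, query masks largest-first) can be sketched as below. The method names `Init`/`ProcessImage`/`QueryBlobData` are assumptions based on the descriptions; `extractor`, `depthImage`, and `maskImage` are hypothetical instances:

```csharp
// Hedged sketch of the blob-extractor flow described above.
extractor.Init(depthImage.QueryInfo()); // fix the image type/size first
if (extractor.ProcessImage(depthImage) == pxcmStatus.PXCM_STATUS_NO_ERROR)
{
    PXCMBlobExtractor.BlobData data;
    // index 0 = largest blob; the segmentation mask is written to maskImage
    extractor.QueryBlobData(0, maskImage, out data);
}
```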
The blob module allows you to extract "blobs" (silhouettes of objects identified by the sensor) and their contour lines. Use the PXCBlobModule interface to access the module's configuration and blob and contour line data. Create a new instance of the blob module's active configuration. Use the PXCBlobConfiguration object to examine the current configuration or to set new configuration values. A pointer to the configuration instance. Create a new instance of the blob module's output data (extracted blobs and contour lines). A pointer to a PXCBlobData instance. Query camera calibration and transformation data for a sensor. The stream type which is produced by the sensor. The intrinsics calibration parameters of the sensor. The extrinsics parameters from the sensor to the camera coordinate system origin. PXCM_STATUS_NO_ERROR Successful execution. Query camera calibration and transformation data for a sensor according to user defined options. The stream type which is produced by the sensor. The options that select specific calibration and transformation data which is produced by the sensor. The intrinsics calibration parameters of the sensor. The extrinsics transformation parameters from the sensor to the camera coordinate system origin. PXCM_STATUS_NO_ERROR Successful execution. This interface provides member functions to create instances of and query stream capture devices. @brief Return the device information structure for a given device. @param[in] didx The zero-based device index. @param[out] dinfo The pointer to the DeviceInfo structure, to be returned. @return PXCM_STATUS_NO_ERROR Successful execution. @return PXCM_STATUS_ITEM_UNAVAILABLE The specified index does not exist. @brief The function subscribes a handler for Capture callbacks. Supports subscription of multiple callbacks. @param[in] handler The handler instance for Capture callbacks. @return the status of the handler subscription. @brief The function unsubscribes a handler from the Capture callbacks.
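The blob-module factory methods described earlier (creating the active configuration and the output data) combine as sketched below. `CreateActiveConfiguration`/`CreateOutput` are assumed managed names for those factories; `blobModule` is a hypothetical PXCMBlobModule instance:

```csharp
// Hedged sketch: obtain configuration and output objects from the module.
PXCMBlobConfiguration config = blobModule.CreateActiveConfiguration();
PXCMBlobData output = blobModule.CreateOutput();
// ... set configuration values, ApplyChanges(), process frames,
// then output.Update() to refresh the extracted blob data ...
config.Dispose();
output.Dispose(); // the returned objects must be released when done
```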
The rest of the handlers will still be triggered. @param[in] handler The handler instance of Capture callbacks to unsubscribe. @return the status of the handler unsubscription. Get the stream type string representation The stream type The corresponding string representation Get the stream type from an index number. The stream index. The corresponding stream type. Get the stream index number The stream type The stream index number. Return the number of devices. the number of available devices Activate the device and return the video device instance. The zero-based device index The device instance. @brief Return the device information structure of the current device. @param[out] dinfo The pointer to the DeviceInfo structure, to be returned. Get the number of valid stream configurations for the streams of interest. The bit-OR'ed value of stream types of interest. the number of valid profile combinations. Check if a profile set is valid. The stream profile set to check true if the stream profile is valid. false if the stream profile is invalid. Query the unique configuration parameters for the selected video streams (types). The bit-OR'ed value of stream types of interest. Zero-based index to retrieve all valid profiles. The stream profile set. PXCM_STATUS_NO_ERROR for successful execution. PXCM_STATUS_ITEM_UNAVAILABLE if index out of range. Query for the active stream configuration parameters (during streaming). The stream profile set, to be returned. PXCM_STATUS_NO_ERROR for successful execution. Set the active profile for all video streams. The application must configure all streams before streaming. The stream profile set. PXCM_STATUS_NO_ERROR successful execution. Reset the device properties to their factory default values. The stream type to reset properties, or STREAM_TYPE_ANY for all streams. Restore all device properties to what the application sets. Call this function upon receiving windows focus. Get the color stream exposure value.
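Enumerating valid profile combinations for the streams of interest and activating one can be sketched as below. The method names `QueryStreamProfileSetNum`/`QueryStreamProfileSet`/`IsStreamProfileSetValid`/`SetStreamProfileSet` are assumptions matching the descriptions above; `device` is a hypothetical capture device instance:

```csharp
// Hedged sketch: configure all streams before streaming.
PXCMCapture.StreamType types = PXCMCapture.StreamType.STREAM_TYPE_COLOR |
                               PXCMCapture.StreamType.STREAM_TYPE_DEPTH;
int num = device.QueryStreamProfileSetNum(types);
for (int i = 0; i < num; i++)
{
    PXCMCapture.Device.StreamProfileSet profiles;
    if (device.QueryStreamProfileSet(types, i, out profiles) !=
        pxcmStatus.PXCM_STATUS_NO_ERROR)
        break; // PXCM_STATUS_ITEM_UNAVAILABLE: index out of range
    if (device.IsStreamProfileSetValid(profiles))
    {
        device.SetStreamProfileSet(profiles); // activate this combination
        break;
    }
}
```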
The color stream exposure, in log base 2 seconds. Get the color stream exposure property information. The color stream exposure property information Set the color stream exposure value. The color stream exposure value, in log base 2 seconds. PXCM_STATUS_NO_ERROR for successful execution. PXCM_STATUS_ITEM_UNAVAILABLE if the device property is not supported. Set the color stream auto exposure mode. True to enable auto exposure. PXCM_STATUS_NO_ERROR for successful execution. PXCM_STATUS_ITEM_UNAVAILABLE if the device property is not supported. Query the color stream auto exposure value. The exposure auto mode status. Get the color stream brightness value. The color stream brightness from -10,000 (pure black) to 10,000 (pure white). Get the color stream brightness property information. The color stream brightness property information Set the color stream brightness value. The color stream brightness from -10,000 (pure black) to 10,000 (pure white). PXCM_STATUS_NO_ERROR successful execution. PXCM_STATUS_ITEM_UNAVAILABLE if the device property is not supported. Get the color stream contrast value. The color stream contrast, from 0 to 10,000. Get the color stream contrast property information. The color stream contrast property information Set the color stream contrast value. The color stream contrast, from 0 to 10,000. PXCM_STATUS_NO_ERROR for successful execution. PXCM_STATUS_ITEM_UNAVAILABLE if the device property is not supported. Get the color stream saturation value. The color stream saturation, from 0 to 10,000. Get the color stream saturation property information. The color stream saturation property information. Set the color stream saturation value. The color stream saturation, from 0 to 10,000. PXCM_STATUS_NO_ERROR for successful execution. PXCM_STATUS_ITEM_UNAVAILABLE if the device property is not supported. Get the color stream hue value. The color stream hue, from -180,000 to 180,000 (representing -180 to 180 degrees.) Get the color stream Hue property information.
The color stream Hue property information. Set the color stream hue value. The color stream hue, from -180,000 to 180,000 (representing -180 to 180 degrees.) PXCM_STATUS_NO_ERROR for successful execution. PXCM_STATUS_ITEM_UNAVAILABLE if the device property is not supported. Get the color stream gamma value. The color stream gamma, from 1 to 500. Get the color stream gamma property information. The color stream gamma property information Set the color stream gamma value. The color stream gamma, from 1 to 500. PXCM_STATUS_NO_ERROR for successful execution. PXCM_STATUS_ITEM_UNAVAILABLE if the device property is not supported. Get the color stream white balance value. The color stream balance, as a color temperature in degrees Kelvin. Get the color stream white balance property information. The color stream white balance property information. Set the color stream white balance value. The color stream balance, as a color temperature in degrees Kelvin. PXCM_STATUS_NO_ERROR for successful execution. PXCM_STATUS_ITEM_UNAVAILABLE if the device property is not supported. Query the color stream auto white balance mode status. White Balance auto mode status. Set the color stream auto white balance mode. The flag to tell if the auto is enabled or not. PXCM_STATUS_NO_ERROR for successful execution. PXCM_STATUS_ITEM_UNAVAILABLE if the device property is not supported. Get the color stream sharpness value. The color stream sharpness, from 0 to 100. Get the color stream Sharpness property information. The color stream Sharpness property information Set the color stream sharpness value. The color stream sharpness, from 0 to 100. PXCM_STATUS_NO_ERROR for successful execution. PXCM_STATUS_ITEM_UNAVAILABLE if the device property is not supported. Get the color stream back light compensation status. The color stream back light compensation status. Get the color stream back light compensation property information. The color stream back light compensation property information. 
Set the color stream back light compensation status. The color stream back light compensation status. PXCM_STATUS_NO_ERROR for successful execution. PXCM_STATUS_ITEM_UNAVAILABLE if the device property is not supported. Get the color stream gain value. The color stream gain adjustment, with negative values darker, positive values brighter, and zero as normal. Get the color stream gain property information. The color stream gain property information Set the color stream gain value. The color stream gain adjustment, with negative values darker, positive values brighter, and zero as normal. PXCM_STATUS_NO_ERROR for successful execution. PXCM_STATUS_ITEM_UNAVAILABLE if the device property is not supported. Get the color stream power line frequency value. The power line frequency in Hz. Set the color stream power line frequency value. The power line frequency in Hz. PXCM_STATUS_NO_ERROR successful execution. PXCM_STATUS_ITEM_UNAVAILABLE the device property is not supported. Get the color stream power line frequency property information. The power line frequency default value. Query the color stream power line frequency auto value. The power line frequency auto mode status. Set the color stream auto power line frequency. The power line frequency auto mode. PXCM_STATUS_NO_ERROR successful execution. PXCM_STATUS_ITEM_UNAVAILABLE the device property is not supported. Get the color stream field of view. The color-sensor horizontal and vertical field of view parameters, in degrees. Get the color stream focal length. The color-sensor focal length in pixels. The parameters vary with the resolution setting. Get the color stream focal length in mm. The color-sensor focal length in mm. Get the color stream principal point. The color-sensor principal point in pixels. The parameters vary with the resolution setting. Get the depth stream low confidence value.
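The get/set pattern shared by the color-stream controls above can be sketched as follows. `QueryColorExposure`/`SetColorAutoExposure` are assumed managed names matching the descriptions; `device` is a hypothetical capture device instance:

```csharp
// Hedged sketch: read a color property, then enable its auto mode.
int exposure = device.QueryColorExposure(); // in log base 2 seconds
pxcmStatus sts = device.SetColorAutoExposure(true);
if (sts == pxcmStatus.PXCM_STATUS_ITEM_UNAVAILABLE)
{
    // the device does not support this property
}
```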
The special depth map value to indicate that the corresponding depth map pixel is of low-confidence. Get the depth stream confidence threshold. The confidence threshold that is used to floor the depth map values. The range is from 1 to 32767. Get the depth stream confidence threshold information. The property information for the confidence threshold that is used to floor the depth map values. The range is from 0 to 15. Set the depth stream confidence threshold. The confidence threshold that is used to floor the depth map values. The range is from 1 to 32767. PXCM_STATUS_NO_ERROR successful execution. PXCM_STATUS_ITEM_UNAVAILABLE the device property is not supported. Get the depth stream unit value. The unit of depth values in micrometers. Set the depth stream unit value. The unit of depth values in micrometers PXCM_STATUS_NO_ERROR successful execution. PXCM_STATUS_ITEM_UNAVAILABLE the device property is not supported. Get the depth stream field of view. The depth-sensor horizontal and vertical field of view parameters, in degrees. Get the depth stream sensor range. The depth-sensor, sensing distance parameters, in millimeters. Get the depth stream focal length. The depth-sensor focal length in pixels. The parameters vary with the resolution setting. Get the depth stream focal length in mm. The depth-sensor focal length in mm. Get the depth stream principal point. The depth-sensor principal point in pixels. The parameters vary with the resolution setting. Get the device allow profile change status. If true, allow resolution change and throw PXCM_STATUS_STREAM_CONFIG_CHANGED. Set the device allow profile change status. If true, allow resolution change and throw PXCM_STATUS_STREAM_CONFIG_CHANGED. PXCM_STATUS_NO_ERROR successful execution. PXCM_STATUS_ITEM_UNAVAILABLE the device property is not supported. Get the mirror mode. The mirror mode Set the mirror mode. The mirror mode PXCM_STATUS_NO_ERROR successful execution.
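The depth-stream properties above follow the same get/set shape; a sketch using the assumed names `QueryDepthConfidenceThreshold`/`SetDepthConfidenceThreshold`/`QueryDepthUnit` on a hypothetical `device` instance:

```csharp
// Hedged sketch: floor the depth map by confidence, then query the unit.
ushort threshold = device.QueryDepthConfidenceThreshold(); // 1..32767
device.SetDepthConfidenceThreshold(threshold); // rejected if unsupported
float unit = device.QueryDepthUnit(); // micrometers per depth value
```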
PXCM_STATUS_ITEM_UNAVAILABLE the device property is not supported. Get the IVCAM laser power. The laser power value from 0 (minimum) to 16 (maximum). Get the IVCAM laser power property information. The laser power property information. Set the IVCAM laser power. The laser power value from 0 (minimum) to 16 (maximum). PXCM_STATUS_NO_ERROR successful execution. PXCM_STATUS_ITEM_UNAVAILABLE the device property is not supported. Get the IVCAM accuracy. The accuracy value. Set the IVCAM accuracy. The accuracy value. PXCM_STATUS_NO_ERROR successful execution. PXCM_STATUS_ITEM_UNAVAILABLE the device property is not supported. Get the IVCAM accuracy property information. The accuracy value property information Get the IVCAM filter option (smoothing aggressiveness) ranged from 0 (close range) to 7 (far range). The filter option value. Get the IVCAM filter option property information. The IVCAM filter option property information. Set the IVCAM filter option (smoothing aggressiveness) ranged from 0 (close range) to 7 (far range). The filter option value. PXCM_STATUS_NO_ERROR successful execution. PXCM_STATUS_ITEM_UNAVAILABLE the device property is not supported. Get the IVCAM motion range trade off option, ranged from 0 (short range, better motion) to 100 (far range, long exposure). The motion range trade off option. Get the IVCAM motion range trade off property information. The IVCAM motion range trade off property information. Set the IVCAM motion range trade off option, ranged from 0 (short range, better motion) to 100 (far range, long exposure). The motion range trade off option. PXCM_STATUS_NO_ERROR successful execution. PXCM_STATUS_ITEM_UNAVAILABLE the device property is not supported. Get the DS Left Right Cropping status. true if enabled. Enable/disable the DS Left Right Cropping. The setting value. PXCM_STATUS_NO_ERROR successful execution. PXCM_STATUS_ITEM_UNAVAILABLE the device property is not supported. 
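Several properties above report a sensor's focal length in pixels alongside its field of view in degrees; under the pinhole-camera model the two are directly related. A minimal sketch of that relation (the function name is illustrative, not an SDK call):

```python
import math

def fov_degrees(image_size_px: float, focal_px: float) -> float:
    """Relate a focal length in pixels to the field of view along one
    axis: fov = 2 * atan(size / (2 * focal)).  Use the image width for
    the horizontal FOV and the height for the vertical FOV."""
    return math.degrees(2.0 * math.atan(image_size_px / (2.0 * focal_px)))

# e.g. a 640-pixel-wide image with a 320-pixel focal length has a 90-degree horizontal FOV
```

This also explains why the focal-length parameters "vary with the resolution setting": the value in pixels scales with the chosen image size while the physical FOV stays fixed.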
Get the DS emitter status. true if enabled. Enable/disable the DS Emitter. The setting value. PXCM_STATUS_NO_ERROR successful execution. PXCM_STATUS_ITEM_UNAVAILABLE the device property is not supported. Get the DS Disparity (inverse distance) Output status. true if enabled. Enable/disable DS Disparity Output, which switches the range output mode between distance (Z) and disparity (inverse distance). The setting value. PXCM_STATUS_NO_ERROR successful execution. PXCM_STATUS_ITEM_UNAVAILABLE the device property is not supported. Gets the disparity scale factor used when in disparity output mode. Default value is 32. the disparity scale factor. Sets the disparity scale factor used when in disparity output mode. the disparity scale factor. PXCM_STATUS_NO_ERROR successful execution. PXCM_STATUS_ITEM_UNAVAILABLE the device property is not supported. Gets the disparity shift used when in disparity output mode. the disparity shift. Sets the disparity shift used when in disparity output mode. the disparity shift. PXCM_STATUS_NO_ERROR successful execution. PXCM_STATUS_ITEM_UNAVAILABLE the device property is not supported. Get the current min and max limits of the z units. The min and max z values. Set the min and max limits of the z units. The setting value. PXCM_STATUS_NO_ERROR successful execution. PXCM_STATUS_ITEM_UNAVAILABLE the device property is not supported. Get the color rectification status. true if enabled. Get the depth rectification status. true if enabled. Get DS left and right streams exposure value. DS left and right streams exposure, in log base 2 seconds. Get DS left and right streams exposure property information. The DS left and right streams exposure property information Set DS left and right streams exposure value. DS left and right streams exposure value, in log base 2 seconds. PXCM_STATUS_NO_ERROR successful execution. PXCM_STATUS_ITEM_UNAVAILABLE the device property is not supported. Set DS left and right streams auto exposure. True to enable auto exposure. 
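The disparity output mode above reports inverse distance; with a stereo baseline and focal length, disparity converts back to depth via Z = baseline * focal / disparity. The following sketch illustrates that relationship together with the scale factor (fixed-point default of 32) and disparity shift mentioned above; the exact arrangement of the scale and shift terms is an assumption, not taken from the SDK:

```python
def disparity_to_depth_mm(raw: float, baseline_mm: float, focal_px: float,
                          scale: float = 32.0, shift: float = 0.0) -> float:
    """Sketch: undo the fixed-point scale factor (default 32, i.e. 5
    fractional bits, which is an assumed interpretation), apply the
    disparity shift, then invert via the stereo relation
    Z = baseline * focal / disparity."""
    disparity_px = raw / scale + shift
    if disparity_px <= 0:
        return 0.0  # no valid measurement at this pixel
    return baseline_mm * focal_px / disparity_px
```

For example, a raw value of 320 with the default scale of 32 represents 10 pixels of disparity; with a 70 mm baseline and 600-pixel focal length that corresponds to 4200 mm of depth.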
PXCM_STATUS_NO_ERROR successful execution. PXCM_STATUS_ITEM_UNAVAILABLE the device property is not supported. Get DS left and right streams gain value. DS left and right streams gain adjustment, with negative values darker, positive values brighter, and zero as normal. Get DS left and right streams gain property information. DS left and right streams gain property information. Set DS left and right streams gain value. DS left and right streams gain adjustment, with negative values darker, positive values brighter, and zero as normal. PXCM_STATUS_NO_ERROR successful execution. PXCM_STATUS_ITEM_UNAVAILABLE the device property is not supported. Read the selected streams asynchronously. The function returns immediately. The application must synchronize SP to get the stream samples. The application can read more than a single stream using the scope parameter, provided that all streams have the same frame rate. Otherwise, the function will return an error. The bit-OR'ed value of stream types of interest. The output sample. The pointer to the SP to be returned. PXCM_STATUS_NO_ERROR successful execution. PXCM_STATUS_DEVICE_LOST the device is disconnected. PXCM_STATUS_PARAM_UNSUPPORTED the streams are of different frame rates. Read all the configured streams asynchronously. The function returns immediately. The application must synchronize SP to get the stream samples. The configured streams must have the same frame rate or the function will return an error. The output sample. The pointer to the SP to be returned. PXCM_STATUS_NO_ERROR successful execution. PXCM_STATUS_DEVICE_LOST the device is disconnected. PXCM_STATUS_PARAM_UNSUPPORTED the streams are of different frame rates. Read the selected streams synchronously. The function blocks until all stream samples are ready. The application can read more than a single stream using the scope parameter, provided that all streams have the same frame rate. Otherwise, the function will return an error. 
The bit-OR'ed value of stream types of interest. The output sample. PXCM_STATUS_NO_ERROR successful execution. PXCM_STATUS_DEVICE_LOST the device is disconnected. PXCM_STATUS_PARAM_UNSUPPORTED the streams are of different frame rates. Return the instance of the PXCMProjection interface. Must be called after initialization. the PXCMProjection instance. Return the device information structure of the current device. PowerLineFrequency Describes the power line compensation filter values. MirrorMode Describes the mirroring options. IVCAMAccuracy Describes the IVCAM accuracy. Property Describes the device properties. Use the inline functions to access specific device properties. Option Describes the stream options. StreamProfile Describes the video stream configuration parameters StreamProfileSet The set of StreamProfile that describes the configuration parameters of all streams. Access the configuration parameters by the stream type. The stream type. The StreamProfile instance. PropertyInfo Describes the property value range and attributes. @class Sample The capture sample that contains multiple streams. The constructor zeros the image instance array Return the image element by StreamType. The stream type. The image instance. The Capture calls this function when a capture device is added or removed StreamType Bit-OR'ed values of stream types, physical or virtual streams. Describes device details. Get the available stream numbers. the number of streams. DeviceModel Describes the device type (0xFFF00000) and model (0xFFFFF) DeviceOrientation Describes the device orientation ConnectionType Describes the connection type of the device The CaptureManager interface provides the following features: (1) Locate an I/O device that meets all module input needs. (2) Record any streaming data to a file and play it back from the file. @brief Add the module input needs to the CaptureManager device search. 
The application must call this function for all modules before the LocateStreams function, where the CaptureManager performs the device match. @param[in] mid The module identifier. The application can use any unique value to later identify the module. @param[in] inputs The module input descriptor. @return PXCM_STATUS_NO_ERROR Successful execution. @brief The function locates an I/O device that meets any module input needs previously specified by the RequestStreams function. The device and its streams are configured upon a successful return. @param[in] handler The optional handler instance for callbacks during the device match. @return PXCM_STATUS_NO_ERROR Successful execution. @brief Close the streams. @brief Return the stream resolution of the specified stream type. @param[in] type The stream type. @return the stream resolution. The function adds the specified DeviceInfo to the DeviceInfo filter list. The DeviceInfo structure to be added to the filter list, or NULL to clean up the filter list. The function adds the specified device information to the DeviceInfo filter list. The optional device friendly name. The optional device symbolic name. The optional device index. The function adds the specified StreamProfile to the StreamProfile filter list. The stream configuration to be added to the filter list, or NULL to clean up the filter list. The function adds the specified StreamProfile to the StreamProfile filter list. The stream type. The optional image width. The optional image height. The optional frame rate. The function locates an I/O device that meets any module input needs previously specified by the RequestStreams function. The device and its streams are configured upon a successful return. PXCM_STATUS_NO_ERROR Successful execution. Return the capture instance. the capture instance. Return the device instance. the device instance. Read the image samples for a specified module. The module identifier. The captured sample, to be returned. 
The SP, to be returned. PXCM_STATUS_NO_ERROR Successful execution. Set up file recording or playback. The file name. If true, the file is opened for recording. Otherwise, the file is opened for playback. PXCM_STATUS_NO_ERROR Successful execution. Set up to record or play back certain stream types. The bit-OR'ed stream types. Pause/Resume recording or playing back. True for pause and false for resume. Set the realtime playback mode. True to play back in real time, or false to play back as fast as possible. Reset the playback position by the frame index. The frame index. Return the current playback position in frame index. The frame index. Reset the playback position by the nearest time stamp. The time stamp, in 100ns. Return the current playback frame time stamp. The time stamp, in 100ns. Return the frame number in the recorded file. The number of frames. Enable detection of device rotation. Call the function PXCMImage.QueryRotation on the current image to query rotation data. If true, enable detection of device rotation, otherwise disable. Query if device rotation is enabled. This is the PXCMCaptureManager callback interface. The CaptureManager calls this function when creating a device instance. The I/O module descriptor. The device instance. The CaptureManager aborts the device match if the status is an error. The CaptureManager calls this function when configuring the device streams. The device instance. The bit-OR'ed value of all streams. The CaptureManager aborts the device match if the status is an error. The CaptureManager calls this function when the current device failed to meet the algorithm needs. If the function returns any error, the CaptureManager performs the current device match again, allowing the application to try multiple configurations on the same device. The device instance. The CaptureManager repeats the match on the same device if the status code is any error, or goes on to the next device if the status code is no error. 
@class PXCMContourExtractor A utility for extracting contour lines from blob (mask) images. Given a mask image, in which the blob pixels are white (255) and the rest are black (0), this utility will extract the contour lines of the blob. The contour lines are all the lines that define the borders of the blob. Inner contour lines (i.e. "holes" in the blob) are defined by an array of clockwise points. The outer contour line (i.e. the external border) is defined by an array of counter-clockwise points. Initialize the PXCMContourExtractor instance for a specific image type (size) the definition of the images that should be processed @see PXCMImage.ImageInfo Extract the contour of the blob in the given image Given an image of a blob, in which object pixels are white (set to 255) and all other pixels are black (set to 0), extract the contour lines of the blob. Note that there might be multiple contour lines, if the blob contains "holes". the blob-image to be processed PXCM_STATUS_NO_ERROR if a valid blob image exists and could be processed; otherwise, return the following error: PXCM_STATUS_DATA_UNAVAILABLE - if the image is not available or PXCM_STATUS_ITEM_UNAVAILABLE if processImage is running or PXCMContourExtractor was not initialized. Get the data of the contour line A contour is composed of a line, an array of 2D points describing the contour path the zero-based index of the requested contour size of the allocated array for the contour points points stored in the user allocated array PXCM_STATUS_NO_ERROR if terminated successfully; otherwise, return the following error: PXCM_STATUS_PARAM_UNSUPPORTED if the index is invalid or the user allocated array is invalid PXCM_STATUS_ITEM_UNAVAILABLE if processImage is running or PXCMContourExtractor was not initialized. 
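The orientation convention above (outer contours as counter-clockwise point arrays, inner "hole" contours as clockwise arrays) can be verified with the signed shoelace area. A self-contained sketch, independent of the SDK:

```python
def signed_area(points):
    """Shoelace formula: positive for counter-clockwise point order in a
    y-up coordinate system, negative for clockwise."""
    area = 0.0
    n = len(points)
    for i in range(n):
        x1, y1 = points[i]
        x2, y2 = points[(i + 1) % n]
        area += x1 * y2 - x2 * y1
    return area / 2.0

def is_outer_contour(points) -> bool:
    # Per the convention above: outer contour -> counter-clockwise order.
    return signed_area(points) > 0
```

Note that image coordinates are usually y-down, which flips the sign of the shoelace result; the convention as stated assumes a consistent coordinate handedness.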
Get the contour size (number of points in the contour) This is the size of the points array that the user should allocate the contour size (number of points in the contour) the zero-based index of the requested contour Return true if it is the blob's outer contour, false for internal contours. true if it is the blob's outer contour, false for internal contours. the zero-based index of the requested contour Get the number of contours extracted the number of contours extracted Set the smoothing level of the shape of the contour The smoothing level ranges from 0 to 1, where 0 means no smoothing and 1 implies a very smooth contour Note that a larger smoothing level will reduce the number of points, while "cleaning" small holes in the line the smoothing level PXCM_STATUS_NO_ERROR if the smoothing is valid; otherwise, return the following error: PXCM_STATUS_PARAM_UNSUPPORTED, and the smoothing level will remain the last valid value Get the smoothing level of the contour (0-1) the smoothing level of the contour @class PXCMCursorConfiguration @brief Handles all the configuration options of the hand cursor module. Use this interface to configure the tracking, alerts, gestures and output options. @note Updated configuration is applied only when ApplyChanges is called. Apply the configuration changes to the module. This method must be called in order to apply the current configuration changes. PXCM_STATUS_NO_ERROR - operation succeeded. PXCM_STATUS_DATA_NOT_INITIALIZED - configuration was not initialized. Restore configuration settings to the default values. PXCM_STATUS_NO_ERROR - operation succeeded. PXCM_STATUS_DATA_NOT_INITIALIZED - configuration was not initialized. Read current configuration settings from the module into this object. PXCM_STATUS_NO_ERROR - operation succeeded. PXCM_STATUS_DATA_NOT_INITIALIZED - configuration was not read. Set the boundaries of the tracking area. 
The tracking boundaries create a frustum shape in which the hand is tracked. (A frustum is a truncated pyramid, with 4 side planes and two rectangular bases.) When the tracked hand reaches one of the boundaries (near, far, left, right, top, or bottom), the appropriate alert is fired. @param[in] trackingBounds - the struct that defines the tracking boundaries. @note The frustum base centers are directly opposite the sensor. PXCM_STATUS_NO_ERROR - operation succeeded. PXCM_STATUS_PARAM_UNSUPPORTED - invalid argument. @see PXCMCursorData.TrackingBounds Get the values defining the tracking boundaries frustum. PXCMCursorData.TrackingBounds Enable/Disable Cursor engagement mode. The cursor engagement mode retrieves an indication that the hand is ready to interact with the user application - a boolean to turn off/on the feature. PXCM_STATUS_NO_ERROR - operation succeeded. Set the duration time in milliseconds for engagement of the Cursor. The duration is the loading time from when the algorithm recognizes the cursor until the completion of the gesture - time duration in milliseconds (min 32) PXCM_STATUS_NO_ERROR - operation succeeded. PXCM_STATUS_VALUE_OUT_OF_RANGE - time duration is under 32 milliseconds. @brief Get the duration time in milliseconds for engagement of the Cursor. The duration is the time needed for the hand to be in front of the camera and static. @param[out] timeInMilliseconds - time duration in milliseconds. Enable alert messaging for a specific event. - the ID of the event to be enabled. PXCM_STATUS_NO_ERROR - operation succeeded. PXCM_STATUS_PARAM_UNSUPPORTED - invalid alert type. @see PXCMCursorData.AlertType Enable all alert messaging events. PXCM_STATUS_NO_ERROR - operation succeeded. Test the activation status of the given alert. - the ID of the event to be tested. true if the alert is enabled, false otherwise. @see PXCMCursorData.AlertType Disable alert messaging for a specific event. 
- the ID of the event to be disabled. PXCM_STATUS_NO_ERROR - operation succeeded. PXCM_STATUS_PARAM_UNSUPPORTED - unsupported parameter. PXCM_STATUS_DATA_NOT_INITIALIZED - data was not initialized. @see PXCMCursorData.AlertType Disable messaging for all alerts. PXCM_STATUS_NO_ERROR - operation succeeded. PXCM_STATUS_DATA_NOT_INITIALIZED - data was not initialized. Register an event handler object for the alerts. The event handler's OnFiredAlert method is called each time an alert fires. - a pointer to the event handler. PXCM_STATUS_NO_ERROR - operation succeeded. PXCM_STATUS_PARAM_UNSUPPORTED - null alertHandler pointer. Unsubscribe an alert handler object. - a pointer to the event handler to unsubscribe. PXCM_STATUS_NO_ERROR - operation succeeded. PXCM_STATUS_PARAM_UNSUPPORTED - illegal alertHandler (null pointer). Enable a gesture, so that events are fired when the gesture is identified. - the ID of the gesture to be enabled. PXCM_STATUS_NO_ERROR - operation succeeded. PXCM_STATUS_PARAM_UNSUPPORTED - invalid parameter. Enable all gestures, so that events are fired for every gesture identified. PXCM_STATUS_NO_ERROR - operation succeeded. Check whether a gesture is enabled. - the ID of the gesture to be tested. true if the gesture is enabled, false otherwise. Deactivate identification of a gesture. Events will no longer be fired for this gesture. - the ID of the gesture to deactivate. PXCM_STATUS_NO_ERROR - operation succeeded. PXCM_STATUS_PARAM_UNSUPPORTED - invalid gesture name. Deactivate identification of all gestures. Events will no longer be fired for any gesture. PXCM_STATUS_NO_ERROR - operation succeeded. Register an event handler object to be called on gesture events. The event handler's OnFiredGesture method will be called each time a gesture is identified. - a pointer to the gesture handler. PXCM_STATUS_NO_ERROR - operation succeeded. PXCM_STATUS_PARAM_UNSUPPORTED - null gesture handler. Unsubscribe a gesture event handler object. 
After this call no callback events will be sent to the given gestureHandler. - a pointer to the event handler to unsubscribe. PXCM_STATUS_NO_ERROR - operation succeeded. PXCM_STATUS_PARAM_UNSUPPORTED - null gesture handler. @class PXCMCursorData This class holds all the output of the hand cursor tracking process. Each instance of this class holds the information of a specific frame. Updates the cursor data to the most current output. Return the number of fired alerts in the current frame. Get the details of the fired alert with the given index. - the zero-based index of the requested fired alert. - the information for the fired event. @note The index must be between 0 and [QueryFiredAlertsNumber() - 1]. PXCM_STATUS_NO_ERROR - operation succeeded. PXCM_STATUS_PARAM_UNSUPPORTED - invalid input parameter. @see AlertData @see QueryFiredAlertsNumber Return whether the specified alert is fired in the current frame, and retrieve its data. - the alert type. - the information for the fired event. true if the alert is fired, false otherwise. @see AlertType @see AlertData Return whether the specified alert is fired for a specific hand in the current frame, and retrieve its data. - the alert type. - the ID of the hand whose alert should be retrieved. - the information for the fired event. true if the alert is fired, false otherwise. @see AlertType @see AlertData Return the number of gestures fired in the current frame. Get the details of the fired gesture with the given index. - the zero-based index of the requested fired gesture. - the information for the fired gesture. @note The gesture index must be between 0 and [QueryFiredGesturesNumber() - 1] PXCM_STATUS_NO_ERROR - operation succeeded. PXCM_STATUS_PARAM_UNSUPPORTED - invalid input parameter. @see GestureData @see QueryFiredGesturesNumber Return whether the specified gesture is fired for a specific hand in the current frame, and if so retrieve its data. - the event type of the gesture to be checked. 
- the information for the fired gesture. true if the gesture was fired, false otherwise. @see GestureType @see GestureData Return whether the specified gesture is fired for a specific hand in the current frame, and if so retrieve its data. - the name of the gesture to be checked. - the ID of the hand whose alert should be retrieved. - the information for the fired gesture. true if the gesture was fired, false otherwise. @see GestureData Return the number of hands detected in the current frame. Retrieve the cursor object data using a specific AccessOrder and related index. - the order in which the cursors are enumerated (accessed). - the index of the cursor to be retrieved, based on the given AccessOrder. - the information for the cursor. PXCM_STATUS_NO_ERROR - operation succeeded. PXCM_STATUS_DATA_UNAVAILABLE - index >= number of detected hands. @see AccessOrder @see ICursor Retrieve the cursor object data by its unique Id. - the unique ID of the requested hand. - the information for the cursor. PXCM_STATUS_NO_ERROR - operation succeeded. PXCM_STATUS_DATA_UNAVAILABLE - there is no output data. PXCM_STATUS_PARAM_UNSUPPORTED - there is no cursor data for the given cursor ID. @see ICursor Reset the adaptive point. - the unique ID of the requested cursor. - the position of the new point. It should be between 0 and 1 in every axis. PXCM_STATUS_NO_ERROR - operation succeeded. PXCM_STATUS_DATA_UNAVAILABLE - there is no output data. PXCM_STATUS_PARAM_UNSUPPORTED - there is no cursor data for the given cursor ID. @class ICursor Contains all the properties of the cursor that were calculated by the tracking algorithm Return the cursor's unique identifier. Return the time-stamp in which the collection of the cursor data was completed. Return the side of the body to which the hand belongs (when known). @note This information is available only in full-hand tracking mode (TRACKING_MODE_FULL_HAND). 
@see PXCMHandConfiguration.SetTrackingMode Get the geometric position of the cursor in 3D world coordinates, in meters. Return the cursor point in world coordinates. Get the geometric position of the cursor in 2D image coordinates, in pixels. (Note: the Z coordinate is the point's depth in millimeters.) @return the cursor point in image coordinates. The module defines a bounding box around a hand in the world coordinate system; the adaptive point is a normalized point inside the bounding box with values between 0 and 1. Using this point allows an easy way to map the hand cursor to a screen of any resolution. Return a 3D point with values between 0 and 1. Get the hand cursor engagement loading percentage. A value of 100 percent indicates a full engagement state. A value of -1 indicates that the feature wasn't enabled. @see PXCMCursorConfiguration.EnableEngagement @see AlertType.CURSOR_ENGAGED @see AlertType.CURSOR_DISENGAGED BodySideType The side of the body to which a hand belongs. @note Body sides are reported from the player's point-of-view, not the sensor's. The side was not determined AccessOrderType Orders in which the hands can be accessed. From oldest to newest hand in the scene AlertType Identifiers for the events that can be detected and fired by the cursor module. Cursor is detected GestureType Identifiers for the events that can be detected and fired by the cursor module. Cursor click - hold your hand facing the camera, then close and open your hand in a smooth motion. Cursor clockwise circle - move your hand in a clockwise circle while the hand is facing the camera. Cursor counter-clockwise circle - move your hand in a counter-clockwise circle while the hand is facing the camera. Cursor hand closing - hold an open hand towards the camera and close your hand. 
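Because the adaptive point above is normalized to the 0-1 range inside the hand's bounding box, mapping the cursor onto a display of any resolution is a simple scale and clamp. A minimal sketch (the function name is illustrative, not an SDK call):

```python
def adaptive_to_screen(adaptive_x: float, adaptive_y: float,
                       screen_w: int, screen_h: int) -> tuple:
    """Map a normalized (0-1) adaptive cursor point to pixel coordinates
    on a screen of the given resolution."""
    # Clamp first: the point is only defined inside the 0-1 box.
    x = min(max(adaptive_x, 0.0), 1.0)
    y = min(max(adaptive_y, 0.0), 1.0)
    return round(x * (screen_w - 1)), round(y * (screen_h - 1))
```

The same adaptive point therefore lands on the same relative screen position regardless of whether the application runs at 720p or 4K, which is the point of the normalization.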
TrackingBounds Defines the tracking boundaries frustum. nearest tracking distance (distance of small frustum base from sensor) farthest tracking distance (distance of large frustum base from sensor) width of small frustum base AlertData Defines the properties of an alert event The time-stamp in which the event occurred The number of the frame in which the event occurred (relevant for recorded sequences) The ID of the hand that triggered the alert, if relevant and known GestureData Defines the properties of a gesture. The time-stamp in which the event occurred The number of the frame in which the event occurred (relevant for recorded sequences) The ID of the hand that triggered the alert, if relevant and known @class PXCMDataSmoothing A utility that allows smoothing data of different types, using a variety of algorithms Create Stabilizer smoother instance for single floats The stabilizer keeps the smoothed data point stable as long as it has not moved more than a given threshold The stabilizer smoother strength, default value is 0.5f The stabilizer smoother radius, default value is 0.03f a pointer to the created Smoother, or NULL in case of illegal arguments Create Stabilizer smoother instance for single floats The stabilizer keeps the smoothed data point stable as long as it has not moved more than a given threshold a pointer to the created Smoother, or NULL in case of illegal arguments Create the Weighted algorithm for single floats The Weighted algorithm applies a (possibly) different weight to each of the previous data samples If the weights vector is not assigned (NULL) all the weights will be equal (1/numWeights) The Weighted smoother number of weights The Weighted smoother weight values a pointer to the created Smoother, or NULL in case of illegal arguments Create the Weighted algorithm for single floats The Weighted algorithm applies a (possibly) different weight to each of the previous data samples If the weights vector is not assigned (NULL) all the 
weights will be equal (1/numWeights) The Weighted smoother number of weights a pointer to the created Smoother, or NULL in case of illegal arguments Create the Quadratic algorithm for single floats The Quadratic smoother smooth strength a pointer to the created Smoother, or NULL in case of illegal arguments Create the Quadratic algorithm for single floats a pointer to the created Smoother, or NULL in case of illegal arguments Create the Spring algorithm for single floats The Spring smoother smooth strength a pointer to the created Smoother, or NULL in case of illegal arguments Create the Spring algorithm for single floats a pointer to the created Smoother, or NULL in case of illegal arguments Create Stabilizer smoother instance for 2-dimensional points The stabilizer keeps the smoothed data point stable as long as it has not moved more than a given threshold The stabilizer smoother strength, default value is 0.5f The stabilizer smoother radius, default value is 0.03f a pointer to the created Smoother, or NULL in case of illegal arguments Create Stabilizer smoother instance for 2-dimensional points The stabilizer keeps the smoothed data point stable as long as it has not moved more than a given threshold a pointer to the created Smoother, or NULL in case of illegal arguments Create the Weighted algorithm for 2-dimensional points The Weighted algorithm applies a (possibly) different weight to each of the previous data samples If the weights vector is not assigned (NULL) all the weights will be equal (1/numWeights) The Weighted smoother weight values a pointer to the created Smoother, or NULL in case of illegal arguments Create the Weighted algorithm for 2-dimensional points The Weighted algorithm applies a (possibly) different weight to each of the previous data samples If the weights vector is not assigned (NULL) all the weights will be equal (1/numWeights) The Weighted smoother number of weights a pointer to the created Smoother, or NULL in case of illegal 
arguments Create the Quadratic algorithm for 2-dimensional points The Quadratic smoother smooth strength a pointer to the created Smoother, or NULL in case of illegal arguments Create the Quadratic algorithm for 2-dimensional points a pointer to the created Smoother, or NULL in case of illegal arguments Create the Spring algorithm for 2-dimensional points The Spring smoother smooth strength a pointer to the created Smoother, or NULL in case of illegal arguments Create the Spring algorithm for 2-dimensional points a pointer to the created Smoother, or NULL in case of illegal arguments Create Stabilizer smoother instance for 3-dimensional points The stabilizer keeps the smoothed data point stable as long as it has not moved more than a given threshold The stabilizer smoother strength, default value is 0.5f The stabilizer smoother radius, default value is 0.03f a pointer to the created Smoother, or NULL in case of illegal arguments Create Stabilizer smoother instance for 3-dimensional points The stabilizer keeps the smoothed data point stable as long as it has not moved more than a given threshold a pointer to the created Smoother, or NULL in case of illegal arguments Create the Weighted algorithm for 3-dimensional points The Weighted algorithm applies a (possibly) different weight to each of the previous data samples If the weights vector is not assigned (NULL) all the weights will be equal (1/numWeights) The Weighted smoother weight values a pointer to the created Smoother, or NULL in case of illegal arguments Create the Weighted algorithm for 3-dimensional points The Weighted algorithm applies a (possibly) different weight to each of the previous data samples If the weights vector is not assigned (NULL) all the weights will be equal (1/numWeights) The Weighted smoother number of weights a pointer to the created Smoother, or NULL in case of illegal arguments Create the Quadratic algorithm for 3-dimensional points The Quadratic smoother smooth strength a pointer to 
the created Smoother, or NULL in case of illegal arguments Create the Quadratic algorithm for 3-dimensional points a pointer to the created Smoother, or NULL in case of illegal arguments Create the Spring algorithm for 3-dimensional points The Spring smoother smooth strength a pointer to the created Smoother, or NULL in case of illegal arguments Create the Spring algorithm for 3-dimensional points a pointer to the created Smoother, or NULL in case of illegal arguments @class Smoother1D Handles the smoothing of a stream of floats, using a specific smoothing algorithm Add a new data sample to the smoothing algorithm the latest data sample PXCM_STATUS_NO_ERROR Add a new data sample to the smoothing algorithm the latest data sample PXCM_STATUS_NO_ERROR Gets the smoothed value from the smoothing algorithm smoothed value in pxcF32 format Gets the smoothed value from the smoothing algorithm smoothed value in pxcF32 format Add a new data sample to the smoothing algorithm the latest data sample PXCM_STATUS_NO_ERROR Add a new data sample to the smoothing algorithm the latest data sample PXCM_STATUS_NO_ERROR Gets the smoothed value from the smoothing algorithm smoothed value in PXCPointF32 format Gets the smoothed value from the smoothing algorithm smoothed value in PXCPointF32 format Add a new data sample to the smoothing algorithm the latest data sample PXCM_STATUS_NO_ERROR Add a new data sample to the smoothing algorithm the latest data sample PXCM_STATUS_NO_ERROR Gets the smoothed value from the smoothing algorithm smoothed value in PXCPoint3DF32 format Gets the smoothed value from the smoothing algorithm smoothed value in PXCPoint3DF32 format Init: the function initializes the Depth Mask function with the photo that needs processing. photo: 2D+depth input image. returns PXCMStatus. ComputeFromCoordinate: convenience function that creates a mask directly from a depth coordinate. coord: input (x,y) coordinates on the depth map. 
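The Weighted algorithm described above applies a (possibly different) weight to each of the previous samples; with equal weights it reduces to a moving average. A self-contained sketch of that idea (the class and method names are illustrative, not the SDK's Smoother1D):

```python
from collections import deque

class WeightedSmoother:
    """Minimal sketch of the Weighted algorithm: the smoothed value is a
    weighted average of the most recent samples, with weights[0] applied
    to the newest sample."""

    def __init__(self, weights=None, num_weights=5):
        # If no weights vector is given, all weights are equal
        # (1/numWeights), matching the description above.
        self.weights = list(weights) if weights else [1.0 / num_weights] * num_weights
        self.samples = deque(maxlen=len(self.weights))

    def add_sample(self, value: float) -> None:
        """Add a new data sample to the smoothing algorithm."""
        self.samples.appendleft(value)

    def get_sample(self) -> float:
        """Get the current smoothed value."""
        if not self.samples:
            return 0.0
        used = list(self.samples)           # newest first
        w = self.weights[:len(used)]
        return sum(wi * si for wi, si in zip(w, used)) / sum(w)
```

Renormalizing by `sum(w)` keeps the output unbiased while the history is still shorter than the weight vector; whether the SDK does the same during warm-up is not documented here, so that detail is an assumption.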
Returns a mask in the form of PXCMImage for blending with the current photo frame. Note: this function simply calls ComputeMaskFromThreshold underneath. ComputeFromThreshold: calculates a mask from the given depth threshold. depthThreshold: depth threshold. frontObjectDepth: foreground depth. backObjectDepth: background depth. Returns a mask in the form of PXCMImage for blending with the current photo frame. Notes: For every pixel p with depth in the range [POIdepth - frontObjectDepth, POIdepth + backObjectDepth], mask[p] = -1. For every pixel p with depth in the range [POIdepth - frontObjectDepth - nearFalloffDepth, POIdepth - frontObjectDepth], mask[p] equals the "smoothstep" function value. For every pixel p with depth in the range [POIdepth + backObjectDepth, POIdepth + backObjectDepth + farFallOffDepth], mask[p] equals the "smoothstep" function value. For every pixel p with any other depth value, mask[p] = 1. Init: the function initializes the 6DoF parallax function with the photo that needs processing. photo: 2D+depth input image. returns PXCMStatus. Apply: the function applies a 6DoF parallax effect, which is the difference in the apparent position of an object when it is viewed from two different positions or viewpoints. motion[3]: the right, up, and forward motion when positive, and the left, down, and backward motion when negative. motion[0]: + right / - left. motion[1]: + up / - down. motion[2]: + forward / - backward. rotation[3]: the pitch, yaw, and roll rotations in degrees, in the range 0-360. rotation[0]: pitch. rotation[1]: yaw. rotation[2]: roll. zoomFactor: + zoom in / - zoom out. PXCMImage: the returned parallaxed image. Init: the function initializes the Depth Refocus function with the photo that needs processing. sample: 2D+depth input image. returns PXCMStatus. Apply: refocuses the image at the input focusPoint by using depth data. focusPoint: the selected point for refocusing. 
aperture: range of the blur area [1-100]. Note: The application must release the returned refocused image. EnhanceDepth: enhances the depth image quality by filling holes and denoising, and outputs the enhanced depth image. photo: input color, depth photo, and calibration data. depthQuality: depth fill quality: HIGH or LOW, for post-processing or real-time processing respectively. Note: The application must release the returned enhanced depth photo. PreviewEnhanceDepth: enhances the depth image quality by filling holes and denoising, and outputs the enhanced depth image. sample: the PXCMCapture.Sample instance from the SenseManager QuerySample(). depthQuality: depth fill quality: HIGH or LOW, for post-processing or real-time processing respectively. Note: The application must release the returned enhanced depth image. DepthQuality: returns the quality of the depth map. photo: input color, raw depth map, and calibration data. depthQuality: BAD, FAIR, GOOD. Crop: the primary image, i.e. the camera[0] RGB and depth images, is cropped and the intrinsic/extrinsic info is updated. photo: input image color+depth. rect: top-left corner (x,y) plus width and height of the window to keep; all the rest is cropped. Returns a photo that has all its images cropped and its metadata fixed accordingly. Note: Returns null if the function fails. UpScaleDepth: changes the size of the enhanced depth map. This function preserves aspect ratio, so only the new width is required. photo: input image color+depth. width: the new width. enhancementType: if the inPhoto has no enhanced depth, then do this type of depth enhancement before resizing. Returns a depth map that has the same aspect ratio as the color image resolution. Note: Returns null if the aspect ratio between color and depth is not preserved. PhotoResize: changes the size of the reference (primary) image. This function preserves aspect ratio, so only the new width is required. Only the primary image is resized. photo: input photo. width: the new width. 
Returns a photo with the reference (primary) color image resized while maintaining the aspect ratio. Note: Returns null when the function fails. CommonFOV: matches the Field of View (FOV) of color and depth in the photo. Useful for still images. photo: input image color+depth. Returns a photo with the primary, unedited color images, and depth maps cropped to the common FOV, with the camera metadata recalculated accordingly. Note: Returns null if the function fails. PreviewCommonFOV: matches the Field of View (FOV) of color and depth in a depth photo. Useful for a live stream. Use the returned roi to crop the photo. photo: input image color+depth. rect: output. Returns the roi in the color image that matches the FOV of the depth image, suitable for all photos in the live stream. @return pxcmStatus: PXCM_STATUS_NO_ERROR for successful operation; PXCM_STATUS_DATA_UNAVAILABLE otherwise. PreviewCommonFOV [Deprecated]: matches the Field of View (FOV) of color and depth in a depth photo. Useful for a live stream. Use the returned roi to crop the photo. photo: input image color+depth. rect: output. Returns the roi in the color image that matches the FOV of the depth image, suitable for all photos in the live stream. @return pxcStatus: PXC_STATUS_NO_ERROR for successful operation; PXC_STATUS_DATA_UNAVAILABLE otherwise. Input param for depth fill quality: HIGH: better quality, slow execution, mostly used for post processing (image). LOW: lower quality, fast execution, mostly used for real-time processing (live video sequence). DepthMapQuality: output param for depth map quality: BAD, FAIR and GOOD. ObjectSegment: generates an initial mask for any object selected by the bounding box. The mask can then be refined by hints supplied by the user via the RefineMask() function. photo: input color and depth photo. topLeftCoord: top-left corner of the object to segment. bottomRightCoord: bottom-right corner of the object to segment. 
Returns a mask in the form of PXCMImage with detected pixels set to 255 and undetected pixels set to 0. ObjectSegment: generates an initial mask for any object selected by the bounding box. The mask can then be refined by hints supplied by the user via the RefineMask() function. photo: input color and depth photo. Returns a mask in the form of PXCMImage with detected pixels set to 255 and undetected pixels set to 0. RefineMask: refines the mask generated by the ObjectSegment() function by using hints. hints: input mask with hints. hint values: 0 = no hint, 1 = foreground, 2 = background. Returns a mask in the form of PXCMImage with detected pixels set to 255 and undetected pixels set to 0. RefineMask: refines the mask generated by the ObjectSegment() function by using hints. points: input arrays with the hints' coordinates. length: length of the array. isForeground: bool set to true if the input hint locations are foreground, and false if background. Returns a mask in the form of PXCMImage with detected pixels set to 255 and undetected pixels set to 0. Undo: undoes the last hints. Returns a mask in the form of PXCMImage with detected pixels set to 255 and undetected pixels set to 0. Redo: redoes the previously undone hint. Returns a mask in the form of PXCMImage with detected pixels set to 255 and undetected pixels set to 0. SetPhoto: sets the photo that needs to be processed. photo: photo to be processed [color + depth]. pasteMode: indicates whether pasteOnPlane or pasteOnSurface. Returns PXCM_STATUS_NO_ERROR if successful, PXCM_STATUS_PROCESS_FAILED if the process failed. GetPlanesMap: returns the plane indices map for the current SetPhoto. Returns a PXCMImage of the plane indices in the form of a mask. AddSticker: adds a sticker that will be pasted with all needed configurations and paste effects. sticker: the image to paste onto the photo (foreground image). coord: insertion coordinates. stickerData: the sticker size, paste location and anchor point. pasteEffects: the pasting effects. 
Returns a stickerID number that can be used as input to the UpdateSticker(), RemoveSticker(), and PreviewSticker() functions. A negative return value indicates failure. RemoveSticker: removes the sticker represented by stickerID. stickerID: the ID of the sticker to remove. Returns PXCM_STATUS_NO_ERROR for success, PXCM_STATUS_ITEM_UNAVAILABLE if the stickerID was not valid. RemoveAllStickers: removes all stickers from the scene. If there are no stickers in the scene, this function has no effect. UpdateSticker: updates the configuration and paste effects for the given sticker. You can pass null for any argument that you do not wish to update. sticker: the image to paste onto the photo (foreground image). coord: insertion coordinates. stickerData: the sticker size, paste location and anchor point. pasteEffects: the pasting effects. Returns PXCM_STATUS_NO_ERROR for success, PXCM_STATUS_ITEM_UNAVAILABLE if the stickerID was not valid. SetSticker: sets the sticker that will be pasted with all needed configurations and paste effects. sticker: the image to paste onto the photo (foreground image). coord: insertion coordinates. stickerData: the sticker size, paste location and anchor point. pasteEffects: the pasting effects. Returns PXCM_STATUS_NO_ERROR if successful, PXCM_STATUS_PROCESS_FAILED if the process failed. PreviewSticker: returns a sticker mask showing the location of the pasted sticker. Returns a PXCMImage with values of 0, 1, 2: 2 ∪ 1 - the region where the sticker could be pasted if there were no constraints; 1 - the appropriate region to paste the sticker considering constraints, e.g. a plane; 0 - all other pixels. GetStickerROI: gives a bounding box showing the location of the sticker. Returns PXCM_STATUS_NO_ERROR if the operation succeeds. Paste: pastes a smaller 2D image (sticker) onto a bigger color + depth image (background). The smaller foreground image is rendered according to a user-specified position and an auto-detected plane orientation onto the background image. 
The auto-oriented foreground image and the color data of the background image are composited together according to the alpha channel of the foreground image. Returns the embedded foreground with the background image. PasteOnPlane: this function is provided for texturing a smaller 2D image (foreground) onto a bigger color + depth image (background). The smaller foreground image is rendered according to a user-specified position and an auto-detected plane orientation onto the background image. The auto-oriented foreground image and the color data of the background image are composited together according to the alpha channel of the foreground image. imbedIm: the image to embed in the photo (foreground image). topLeftCoord, bottomLeftCoord: the top-left corner and the bottom-left corner of where the user wants to embed the image. Returns the embedded foreground with the background image. PasteEffects: matchIllumination: matches the sticker illumination to the global RGB scene. Default is true. transparency: sets the transparency level of the sticker. 0.0f = opaque (default); 1.0f = transparent. embossHighFreqPass: high-frequency pass during emboss; default 0.0f (no emboss), 1.0f max. shadingCorrection: matches the sticker illumination to the local RGB scene, taking shadows into account. Default is false. colorCorrection: flag to add color correction. Default is false. embossingAmplifier: embossing intensity multiplier; default 1.0f; should be positive. skinDetection: flag to detect skin under the pasted sticker; default is false. PasteType: indicates whether the sticker is pasted on detected planes or on any surface: PLANE and SURFACE. MeasureDistance: measures the distance between 2 points in mm. photo: the photo instance. startPoint, endPoint: the start point and end point of the distance that needs to be measured. Note: Depth data must be available and accurate at the selected start and end points. 
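A short C# sketch of how the MeasureDistance call described above might be invoked from the managed wrapper. The type and member names used here (PXCMEnhancedPhoto.Measurement, MeasureData, its distance field) are assumptions based on the SDK's PXCM naming convention and should be checked against the SDK headers for your release:

```csharp
// Hedged sketch, not a definitive implementation: 'measure' and 'photo' are
// assumed to come from an already-initialized RealSense SDK session.
public static class MeasureExample
{
    public static void PrintDistance(PXCMEnhancedPhoto.Measurement measure, PXCMPhoto photo)
    {
        // Two user-selected pixel coordinates (hypothetical values).
        PXCMPointI32 start = new PXCMPointI32();
        start.x = 120; start.y = 200;
        PXCMPointI32 end = new PXCMPointI32();
        end.x = 480; end.y = 210;

        PXCMEnhancedPhoto.MeasureData data;
        pxcmStatus sts = measure.MeasureDistance(photo, start, end, out data);
        if (sts == pxcmStatus.PXCM_STATUS_NO_ERROR)
        {
            // The distance is reported in millimeters; depth must be valid
            // at both endpoints for the result to be meaningful.
            UnityEngine.Debug.Log("Distance: " + data.distance + " mm");
        }
    }
}
```

The sketch cannot run without the RealSense runtime and camera; treat it as an illustration of the call sequence only.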
MeasureUADistance: (Experimental) measures the distance between 2 points in mm using an experimental algorithm for a User Assisted (UA) measure. photo: the photo instance. startPoint, endPoint: the user-selected start point and end point of the distance that needs to be measured. Returns the MeasureData that has the highest confidence value. Note: Depth data must be available and accurate at the selected start and end points. QueryUADataSize: (Experimental) returns the number of MeasureData possibilities. The number of possibilities varies according to the selected points, depending on whether they lie on a common plane or on independent planes. QueryUAData: (Experimental) returns an array of the MeasureData possibilities. The size of the array is equal to the value returned by QueryUADataSize(). DistanceType: indicates whether the two measured points lie on the same planar surface. This represents a point in 3D world space in millimeter (mm) units. This represents the distance between two world points in millimeters (mm). EnableTracker: creates an object tracker with a specific tracking method and an initial bounding mask as a hint for the object to detect. boundingMask: a hint on what object to detect, with the target pixels set to 255 and the background to 0. method: tracking method, for depth layer tracking or object tracking. QueryTrackedObject: returns the tracked object selected in EnableTracker() after every processed frame. Returns a mask in the form of PXCMImage with detected pixels set to 255 and undetected pixels set to 0. The returned PXCMImage is managed internally; the application should not release it. Enable an alert, so that events are fired when the alert is identified. The label of the alert to be enabled. PXCM_STATUS_NO_ERROR if the alert was enabled successfully; otherwise, returns one of the following errors: PXCM_STATUS_PARAM_UNSUPPORTED - unsupported parameter. PXCM_STATUS_DATA_NOT_INITIALIZED - data failed to initialize. Enable all alert messaging events. 
PXCM_STATUS_NO_ERROR if enabling all alerts was successful; otherwise, returns one of the following errors: PXCM_STATUS_PROCESS_FAILED - module failure during processing. PXCM_STATUS_DATA_NOT_INITIALIZED - data failed to initialize. Check if an alert is enabled. The ID of the event. Returns true if the alert is enabled; otherwise, returns false. Disable alert messaging for a specific event. The ID of the event to be disabled. PXCM_STATUS_NO_ERROR if disabling the alert was successful; otherwise, returns one of the following errors: PXCM_STATUS_PARAM_UNSUPPORTED - unsupported parameter. PXCM_STATUS_DATA_NOT_INITIALIZED - data failed to initialize. Disable alert messaging for all events. PXCM_STATUS_NO_ERROR if disabling all alerts was successful; otherwise, returns one of the following errors: PXCM_STATUS_PROCESS_FAILED - module failure during processing. PXCM_STATUS_DATA_NOT_INITIALIZED - data failed to initialize. Register an event handler object for the alerts. The event handler's OnFiredAlert method will be called each time an alert is identified. A pointer to the event handler. @see AlertHandler::OnFiredAlert PXCM_STATUS_NO_ERROR if registering the event handler was successful; otherwise, returns the following error: PXCM_STATUS_DATA_NOT_INITIALIZED - data failed to initialize. Unsubscribe an event handler object for the alerts. A pointer to the event handler that should be removed. PXCM_STATUS_NO_ERROR if unregistering the event handler was successful, an error otherwise. Apply/commit the configuration changes to the middleware. Restore settings to their default values. Updates data to the latest available configuration. Enables the expression module. Disables the expression module. Enables all available expressions. Disables all available expressions. Enables a specific expression. - a single face expression. PXCM_STATUS_NO_ERROR - success. PXCM_STATUS_PARAM_UNSUPPORTED - the expression is unsupported. Disables a specific expression. - a single face expression. 
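The expression enable/disable calls above can be sketched in C# as follows. This is a minimal example assuming the face-module configuration API of this SDK generation (CreateActiveConfiguration, QueryExpressions, ApplyChanges); verify the names against your SDK version:

```csharp
// Hedged sketch: enabling the face expressions module and one expression.
// Remember that nothing takes effect until ApplyChanges() is called.
public static class ExpressionConfigExample
{
    public static void EnableSmile(PXCMFaceModule faceModule)
    {
        PXCMFaceConfiguration config = faceModule.CreateActiveConfiguration();
        PXCMFaceConfiguration.ExpressionsConfiguration expressions = config.QueryExpressions();

        expressions.Enable();  // turn on the expressions module
        expressions.EnableExpression(
            PXCMFaceData.ExpressionsData.FaceExpression.EXPRESSION_SMILE);

        config.ApplyChanges(); // commit the configuration to the middleware
        config.Dispose();      // release the configuration instance
    }
}
```

As with all configuration objects in this SDK, multiple instances can be created and switched between by calling ApplyChanges on the one you want active.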
Checks if an expression is currently enabled in the configuration. - a single face expression. Returns true if enabled, false if disabled. Sets the active Recognition database. - the name of the database to be loaded by the Recognition module. - a pointer to the Recognition database, or NULL for an existing database. PXCM_STATUS_HANDLE_INVALID - if the module wasn't initialized properly. PXCM_STATUS_DATA_UNAVAILABLE - if the registration failed. PXCM_STATUS_NO_ERROR - if registration was successful. Create a new Recognition database. The name of the new database. The new database description. Loads previously calculated calibration data. Retrieves the calibration data size. The actual dominant eye as entered by the user, overriding the optimal eye suggested by the calibration. An alternative to setting the dominant eye would be to repeat the calibration and QueryCalibDominantEye until the desired result is reached. The dominant eye is a preference of visual input from one eye over the other; this is the eye relied on in the gaze inference algorithm. Enables the pulse module. Disables the pulse module. @brief Get the details of the fired alert at the requested index. @param[in] index the zero-based index of the requested fired alert. @param[out] alertData contains all the information for the fired event. @see AlertData @note the index is between 0 and the result of GetFiredAlertsNumber() @see GetFiredAlertsNumber() @return PXCM_STATUS_NO_ERROR if returning fired alert data was successful; otherwise, return one of the following errors: PXCM_STATUS_PROCESS_FAILED - Module failure during processing.\n PXCM_STATUS_DATA_NOT_INITIALIZED - Data failed to initialize.\n @brief Return whether the specified alert is fired in the current frame, and retrieve its data if it is. @param[in] alertEvent the ID of the event. @param[out] alertData contains all the information for the fired event. @see AlertData @return true if the alert is fired, false otherwise. 
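The fired-alert queries described above are typically used in a per-frame polling loop. A hedged C# sketch, assuming the face-module output API (Update, QueryFiredAlertsNumber, QueryFiredAlertData) and the AlertData fields shown; verify against your SDK version:

```csharp
// Hedged sketch: iterate the alerts fired in the current frame and log them.
// 'faceData' is assumed to come from an initialized face module's CreateOutput().
public static class AlertPollExample
{
    public static void LogFiredAlerts(PXCMFaceData faceData)
    {
        faceData.Update(); // refresh to the latest processed frame

        int count = faceData.QueryFiredAlertsNumber();
        for (int i = 0; i < count; i++)
        {
            PXCMFaceData.AlertData alert;
            if (faceData.QueryFiredAlertData(i, out alert) == pxcmStatus.PXCM_STATUS_NO_ERROR)
            {
                // 'label' and 'timeStamp' are the AlertData fields assumed here.
                UnityEngine.Debug.Log("Alert " + alert.label + " at " + alert.timeStamp);
            }
        }
    }
}
```

The index passed to QueryFiredAlertData must stay between 0 and QueryFiredAlertsNumber() - 1, as the note above states.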
@brief Return whether the specified alert is fired for a specific face in the current frame, and retrieve its data. @param[in] alertEvent the label of the alert event. @param[in] faceID the ID of the face whose alert should be retrieved. @param[out] alertData contains all the information for the fired event. @see AlertData @return true if the alert is fired, false otherwise. Updates data to the latest available output. Returns the detected frame timestamp. Returns the total number of detected faces in the frame. Returns the tracked face corresponding to the algorithm-assigned faceId, or null if no such faceId exists. Returns the tracked face corresponding to the given index, 0 being the first according to the chosen tracking strategy; returns null if no such index exists. Get the number of fired alerts in the current frame. The number of fired alerts. Return the name of the specified alert event. The label of the alert event. A parameter to contain the name of the alert, maximum size - ALERT_NAME_SIZE. @see AlertData PXCM_STATUS_NO_ERROR if returning the alert's name was successful; otherwise, return one of the following errors: PXCM_STATUS_PARAM_UNSUPPORTED - if outAlertName is null. PXCM_STATUS_DATA_UNAVAILABLE - if no alert corresponding to alertEvent was found. Assigns the average depth of the detected face to outFaceAverageDepth; returns true if the data and outFaceAverageDepth exist, false otherwise. Assigns the 2D bounding rectangle of the detected face to outBoundingRect; returns true if the data and outBoundingRect exist, false otherwise. Returns the number of tracked landmarks. Assigns all the points to the outNumPoints array. Returns true if the data and parameters exist, false otherwise. Assigns the point matched to index to outPoint. Returns true if the data and outPoint exist and the index is correct, false otherwise. Returns the number of tracked landmarks in groupFlags. Assigns the points matched to groupFlags to outPoints. 
The user is expected to allocate outPoints to a size bigger than the group size - each point contains the original source (index + name). Returns true if the data and parameters exist, false otherwise. Mapping function: retrieves the index corresponding to the landmark's name. Assigns the pose Euler angles to outPoseEulerAngles. Returns true if the data and parameters exist, false otherwise. Assigns the pose rotation as a quaternion to outPoseQuaternion. Returns true if the data and parameters exist, false otherwise. Assigns the head position to outHeadPosition. Returns true if the data and parameters exist, false otherwise. Assigns the face's 3x3 rotation matrix to outRotationMatrix. Returns true if the data and parameters exist, false otherwise. Returns the position (angle) confidence. Returns the user ID. Returns the user's detection data instance - null if it is not enabled. Returns the user's landmarks data instance - null if it is not enabled. Returns the user's pose data - null if it is not enabled. Returns the user's expressions data - null if it is not enabled. Returns the user's recognition data - null if it is not enabled. Returns the user's gaze data - null if it is not enabled. Returns the user's gaze calibration data - null if it is not enabled. Queries a single expression result. A single expression. The expression result - such as intensity. Returns true if the expression was calculated successfully, false otherwise. Register a user in the Recognition database. The unique user ID assigned to the registered face by the Recognition module. Removes a user from the Recognition database. Checks if a user is registered in the Recognition database. Returns true if the user is in the database, false otherwise. Returns the ID assigned to the current face by the Recognition module. The ID assigned by the Recognition module, or -1 if the face was not recognized. Assigns the gaze result to outGazeResult. Returns true if the data and parameters exist, false otherwise. Return the gaze horizontal angle in degrees. Return the gaze vertical angle in degrees. Assigns the gaze result to outGazeResult. 
Returns true if the data and parameters exist, false otherwise. Assigns the gaze result to outGazeResult. Returns true if the data and parameters exist, false otherwise. Retrieves the calibration data size. Retrieves the calibration data buffer. The optimal eye of the current calibration - the one which yielded the highest accuracy between the two eyes, aiming at hitting the user's dominant eye - unless the user explicitly set the dominant eye. This is the eye relied on in the gaze inference algorithm. AlertType Available events that can be detected by the system (alert types). Create a new copy of the active configuration. Create a placeholder for output. @class PXCMHandConfiguration Handles all the configuration options of the hand module. Use this interface to configure the tracking, alerts, gestures and output options. @note Updated configuration is applied only when ApplyChanges is called. Apply the configuration changes to the module. This method must be called in order to apply the current configuration changes. PXCM_STATUS_NO_ERROR - operation succeeded. PXCM_STATUS_DATA_NOT_INITIALIZED - configuration was not initialized.\n Restore configuration settings to the default values. PXCM_STATUS_NO_ERROR - operation succeeded. PXCM_STATUS_DATA_NOT_INITIALIZED - configuration was not initialized.\n Read current configuration settings from the module into this object. PXCM_STATUS_NO_ERROR - operation succeeded. PXCM_STATUS_DATA_NOT_INITIALIZED - configuration was not read.\n Restart the tracking process and reset all the skeleton information. You might want to call this method, for example, when transitioning from one game level to another, \n in order to discard information that is not relevant to the new stage. @note ResetTracking will be executed only when processing the next frame. PXCM_STATUS_NO_ERROR - operation succeeded. PXCM_STATUS_PROCESS_FAILED - there was a module failure during processing. 
PXCM_STATUS_DATA_NOT_INITIALIZED - the module was not initialized. Specify the name of the current user for personalization. The user name will be used to save and retrieve specific measurements (calibration) for this user. - the name of the current user. PXCM_STATUS_NO_ERROR - operation succeeded. PXCM_STATUS_PARAM_UNSUPPORTED - illegal user name (e.g. an empty string) or tracking mode is set to TRACKING_MODE_EXTREMITIES. Get the name of the current user. A null-terminated string containing the user's name. Activate calculation of the speed of a specific joint, according to the given mode.\n The output speed is a 3-dimensional vector, containing the motion of the requested joint in each direction (x, y and z axis).\n By default, the joint speed calculation is disabled for all joints, in order to conserve CPU and memory resources.\n Typically the feature is only activated for a single fingertip or palm-center joint, as only the overall hand speed is useful.\n - the identifier of the joint. - the speed calculation method. Possible values are:\n JOINT_SPEED_AVERAGE - calculate the average joint speed, over the time period defined in the "time" parameter.\n JOINT_SPEED_ABSOLUTE - calculate the average of the absolute-value joint speed, over the time period defined in the "time" parameter.\n - the period in milliseconds over which the average speed will be calculated (a value of 0 will return the current speed). PXCM_STATUS_NO_ERROR - operation succeeded. PXCM_STATUS_PARAM_UNSUPPORTED - one of the arguments is invalid or tracking mode is set to TRACKING_MODE_EXTREMITIES. @see PXCMHandData.JointType @see PXCMHandData.JointSpeedType Disable calculation of the speed of a specific joint.\n You may want to disable the feature when it is no longer needed, in order to conserve CPU and memory resources.\n - the identifier of the joint. PXCM_STATUS_NO_ERROR - operation succeeded. 
PXCM_STATUS_PARAM_UNSUPPORTED - invalid joint label or tracking mode is set to TRACKING_MODE_EXTREMITIES. @see PXCMHandData.JointType Set the boundaries of the tracking area. The tracking boundaries create a frustum shape in which the hand is tracked.\n (A frustum is a truncated pyramid, with 4 side planes and two rectangular bases.)\n When the tracked hand reaches one of the boundaries (near, far, left, right, top, or bottom), the appropriate alert is fired. - nearest tracking distance (distance of the small frustum base from the sensor). - farthest tracking distance (distance of the large frustum base from the sensor). - width of the small frustum base. - height of the small frustum base. @note The frustum base centers are directly opposite the sensor.\n PXCM_STATUS_NO_ERROR - operation succeeded. PXCM_STATUS_PARAM_UNSUPPORTED - invalid argument. @see PXCMHandData.JointType Load a calibration file for a specific age. People of different ages have different hand sizes, and usually there is a correlation between the two. Knowing the age of the players in advance can improve the tracking in most cases. Call this method to load a calibration file that matches a specific age. Specific hand calibrations are supported for ages 4-14; ages above 14 will use the default calibration. @note The best practice is to let the players perform online calibration, or to load a specific calibration per user. @note If you call SetUserName with an existing user name, it will override SetDefaultAge. @see SetUserName - the expected age of the players. PXCM_STATUS_NO_ERROR - operation succeeded. PXCM_STATUS_PARAM_UNSUPPORTED - illegal age, or tracking mode is set to TRACKING_MODE_EXTREMITIES. Retrieve the current calibration default age value. The current default age value. @see SetDefaultAge Get the values defining the tracking boundaries frustum. - nearest tracking distance (distance of the small frustum base from the sensor). - farthest tracking distance (distance of the large frustum base from the sensor). 
- width of the small frustum base. - height of the small frustum base. PXCM_STATUS_NO_ERROR - operation succeeded. Set the tracking mode, which determines the algorithm that will be applied for tracking hands. - the tracking mode to be set. Possible values are:\n TRACKING_MODE_FULL_HAND - track the entire hand skeleton.\n TRACKING_MODE_EXTREMITIES - track only the mask and the extremities of the hand (the points that confine the tracked hand).\n PXCM_STATUS_NO_ERROR - operation succeeded. PXCM_STATUS_PARAM_UNSUPPORTED - TrackingModeType is invalid. @see PXCMHandData.TrackingModeType Retrieve the current tracking mode, which indicates the algorithm that should be applied for tracking hands. TrackingModeType @see SetTrackingMode @see PXCMHandData.TrackingModeType Set the degree of hand motion smoothing. "Smoothing" is an algorithm that overcomes local problems in tracking and produces smoother, more continuous tracking information. - a float value between 0 (not smoothed) and 1 (maximal smoothing). PXCM_STATUS_NO_ERROR - operation succeeded. PXCM_STATUS_PARAM_UNSUPPORTED - invalid smoothing value, or tracking mode is set to TRACKING_MODE_EXTREMITIES. Retrieve the current smoothing value. The current smoothing value. @see SetSmoothingValue Enable or disable the hand stabilizer feature.\n Enabling this feature produces smoother tracking of the hand motion, ignoring small shifts and "jitters".\n (As a result, in some cases the tracking may be less sensitive to minor movements.) - true to enable the hand stabilizer; false to disable it. PXCM_STATUS_NO_ERROR - operation succeeded. PXCM_STATUS_PARAM_UNSUPPORTED - tracking mode is set to TRACKING_MODE_EXTREMITIES. Return the hand stabilizer activation status. Returns true if the hand stabilizer is enabled, false otherwise. 
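Putting the smoothing and stabilizer options above together, a hedged C# configuration sketch (assuming the hand-module member names SetSmoothingValue, EnableStabilizer, ApplyChanges; verify against your SDK version):

```csharp
// Hedged sketch: tuning hand-tracking smoothing and enabling the stabilizer.
// 'handModule' is assumed to come from an initialized SenseManager session.
public static class HandSmoothingExample
{
    public static void Configure(PXCMHandModule handModule)
    {
        PXCMHandConfiguration config = handModule.CreateActiveConfiguration();

        // Smoothing value is between 0 (not smoothed) and 1 (maximal smoothing).
        if (config.SetSmoothingValue(0.8f) == pxcmStatus.PXCM_STATUS_NO_ERROR)
        {
            config.EnableStabilizer(true); // ignore small shifts and jitter
        }

        config.ApplyChanges(); // nothing takes effect until this call
        config.Dispose();
    }
}
```

Note that both calls are documented above to fail with PXCM_STATUS_PARAM_UNSUPPORTED when the tracking mode is TRACKING_MODE_EXTREMITIES.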
Enable the calculation of a normalized skeleton.\n Calculating the normalized skeleton transforms the tracked hand positions to those of a fixed-size skeleton.\n The positions of the normalized skeleton's joints can be retrieved by calling IHand.QueryNormalizedJoint.\n It is recommended to work with a normalized skeleton so that you can use the same code to identify poses and gestures,\n regardless of the hand size. (E.g. the same code can work for a child's hand and for an adult's hand.) - true if the normalized skeleton should be calculated, otherwise false. PXCM_STATUS_NO_ERROR - operation succeeded. PXCM_STATUS_PARAM_UNSUPPORTED - tracking mode is set to TRACKING_MODE_EXTREMITIES. @see PXCMHandData.IHand.QueryNormalizedJoint Retrieve the normalized joints calculation status. Returns true if normalized joints calculation is enabled, false otherwise. Enable calculation of the hand segmentation image. The hand segmentation image is an image mask of the tracked hand, where the hand pixels are white and all other pixels are black. - true if the segmentation image should be calculated, false otherwise. PXCM_STATUS_NO_ERROR - operation succeeded. Retrieve the hand segmentation image calculation status. Returns true if calculation of the hand segmentation image is enabled, false otherwise. @see EnableSegmentationImage Enable the retrieval of tracked joints information. Enable joint tracking if your application uses specific joint positions; otherwise, disable it in order to conserve CPU/memory resources.\n @note This option doesn't affect the quality of the tracking, only the availability of the joints info. - true to enable joint tracking, false to disable it. PXCM_STATUS_NO_ERROR - operation was successful. PXCM_STATUS_PARAM_UNSUPPORTED - tracking mode is set to TRACKING_MODE_EXTREMITIES. Retrieve the joint tracking status. Returns true if joint tracking is enabled, false otherwise. Enable alert messaging for a specific event. - the ID of the event to be enabled. 
PXCM_STATUS_NO_ERROR - operation succeeded. PXCM_STATUS_PARAM_UNSUPPORTED - invalid alert type. @see PXCMHandData.AlertType Enable all alert messaging events. PXCM_STATUS_NO_ERROR - operation succeeded. Test the activation status of the given alert. - the ID of the event to be tested. Returns true if the alert is enabled, false otherwise. @see PXCMHandData.AlertType Disable alert messaging for a specific event. - the ID of the event to be disabled. PXCM_STATUS_NO_ERROR - operation succeeded. PXCM_STATUS_PARAM_UNSUPPORTED - unsupported parameter. PXCM_STATUS_DATA_NOT_INITIALIZED - data was not initialized. @see PXCMHandData.AlertType Disable messaging for all alerts. PXCM_STATUS_NO_ERROR - operation succeeded. PXCM_STATUS_DATA_NOT_INITIALIZED - data was not initialized. Register an event handler object for the alerts. The event handler's OnFiredAlert method is called each time an alert fires. - a pointer to the event handler. PXCM_STATUS_NO_ERROR - operation succeeded. PXCM_STATUS_PARAM_UNSUPPORTED - null alertHandler pointer. Unsubscribe an alert handler object. - a pointer to the event handler to unsubscribe. PXCM_STATUS_NO_ERROR - operation succeeded. PXCM_STATUS_PARAM_UNSUPPORTED - illegal alertHandler (null pointer). Load a set of gestures from a specified path. A gesture pack is a collection of pre-trained gestures.\n After this call, the gestures that are contained in the pack are available for identification.\n - the full path of the gesture pack location. PXCM_STATUS_NO_ERROR - operation succeeded. PXCM_STATUS_PARAM_UNSUPPORTED - empty path or empty list of gestures. @note This method should be used only for external gesture packs, and not for the default gesture pack, which is loaded automatically. Unload the set of gestures contained in the specified path. - the full path of the gesture pack location. PXCM_STATUS_NO_ERROR - operation succeeded. 
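A hedged C# sketch combining the gesture-pack loading above with gesture enumeration and enabling. The member names (LoadGesturePack, QueryGesturesTotalNumber, QueryGestureNameByIndex, EnableGesture) follow the hand-module C# API but should be verified against your SDK version:

```csharp
// Hedged sketch: load an external gesture pack, then enable every gesture
// it made available, by name. 'config' is assumed to be a live
// PXCMHandConfiguration from CreateActiveConfiguration().
public static class GestureEnableExample
{
    public static void LoadAndEnableAll(PXCMHandConfiguration config, string packPath)
    {
        if (config.LoadGesturePack(packPath) != pxcmStatus.PXCM_STATUS_NO_ERROR)
            return; // empty path or empty gesture list

        int total = config.QueryGesturesTotalNumber();
        for (int i = 0; i < total; i++)
        {
            string name;
            if (config.QueryGestureNameByIndex(i, out name) == pxcmStatus.PXCM_STATUS_NO_ERROR)
            {
                // 'true' requests "in progress" events on every active frame,
                // not just the gesture's start and end states.
                config.EnableGesture(name, true);
            }
        }

        config.ApplyChanges(); // commit the new gesture set
    }
}
```

Loading only the packs relevant to the current application stage, as the surrounding text suggests, keeps recognition accuracy up and resource use down.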
Unload all the currently loaded sets of gestures. If you are using multiple gesture packs, you may want to load only the packs that are relevant to a particular stage in your application and unload all others. This can boost the accuracy of gesture recognition and conserve system resources. PXCM_STATUS_NO_ERROR - operation succeeded. Retrieve the total number of available gestures that were loaded from all gesture packs. The total number of loaded gestures. Retrieve the gesture name that matches the given index. - the index of the gesture whose name you want to retrieve. - the size of the preallocated gestureName buffer. - preallocated buffer to be filled with the gesture name. PXCM_STATUS_NO_ERROR - operation succeeded. PXCM_STATUS_ITEM_UNAVAILABLE - no gesture for the given index value. Enable a gesture, so that events are fired when the gesture is identified. - the name of the gesture to be enabled. - set to "true" to get an "in progress" event at every frame for which the gesture is active, or "false" to get only the "start" and "end" states of the gesture. PXCM_STATUS_NO_ERROR - operation succeeded. PXCM_STATUS_PARAM_UNSUPPORTED - invalid parameter. Enable a gesture, so that events are fired when the gesture is identified. - the name of the gesture to be enabled. PXCM_STATUS_NO_ERROR - operation succeeded. PXCM_STATUS_PARAM_UNSUPPORTED - invalid parameter. Enable all gestures, so that events are fired for every gesture identified. - set to "true" to get an "in progress" event at every frame for which a gesture is active, or "false" to get only the "start" and "end" states of the gesture. PXCM_STATUS_NO_ERROR - operation succeeded. Enable all gestures, so that events are fired for every gesture identified. PXCM_STATUS_NO_ERROR - operation succeeded. Check whether a gesture is enabled. - the name of the gesture to be tested. true if the gesture is enabled, false otherwise. Deactivate identification of a gesture.
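A sketch of the gesture enable/subscribe pattern above, again assuming config is a PXCMHandConfiguration; "thumb_up" is one of the default-pack gesture names listed later in this section:

```csharp
// Sketch only: fire events for a single gesture from the default pack.
config.DisableAllGestures();
config.EnableGesture("thumb_up", false); // false: "start"/"end" events only
config.SubscribeGesture((PXCMHandData.GestureData data) =>
{
    // Called each time the enabled gesture is identified.
    Console.WriteLine(data.name + " state=" + data.state);
});
config.ApplyChanges();
```

Passing true as the second EnableGesture argument would instead deliver an "in progress" event on every frame in which the gesture is active.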
Events will no longer be fired for this gesture. - the name of the gesture to deactivate. PXCM_STATUS_NO_ERROR - operation succeeded. PXCM_STATUS_PARAM_UNSUPPORTED - invalid gesture name. Deactivate identification of all gestures. Events will no longer be fired for any gesture. PXCM_STATUS_NO_ERROR - operation succeeded. Register an event handler object to be called on gesture events. The event handler's OnFiredGesture method will be called each time a gesture is identified. - a pointer to the gesture handler. PXCM_STATUS_NO_ERROR - operation succeeded. PXCM_STATUS_PARAM_UNSUPPORTED - null gesture handler. Unsubscribe a gesture event handler object. After this call no callback events will be sent to the given gestureHandler. - a pointer to the event handler to unsubscribe. PXCM_STATUS_NO_ERROR - operation succeeded. PXCM_STATUS_PARAM_UNSUPPORTED - null gesture handler. @class PXCMHandCursorModule The main interface to the hand cursor module's classes. Use this interface to access the hand cursor module's configuration and output data. @brief Create a new instance of the hand cursor module's active configuration. Multiple configuration instances can be created in order to define different configurations for different stages of the application. You can switch between the configurations by calling the ApplyChanges method of the required configuration instance. A pointer to the configuration instance. @see PXCMCursorConfiguration Create a new instance of the hand cursor module's current output data. Multiple instances of the output can be created in order to store previous tracking states. A pointer to the output data instance. @see PXCMCursorData @class PXCMHandData This class holds all the output of the hand tracking process. Each instance of this class holds the information of a specific frame. Updates the hand data to the most current output. Return the number of fired alerts in the current frame. Get the details of the fired alert with the given index.
- the zero-based index of the requested fired alert. - the information for the fired event. @note The index is between 0 and [QueryFiredAlertsNumber() - 1]. PXCM_STATUS_NO_ERROR - operation succeeded. PXCM_STATUS_PARAM_UNSUPPORTED - invalid input parameter. @see AlertData @see QueryFiredAlertsNumber Return whether the specified alert is fired in the current frame, and retrieve its data if it is. - the ID of the fired event. - the information for the fired event. true if the alert is fired, false otherwise. @see AlertType @see AlertData Return whether the specified alert is fired for a specific hand in the current frame, and retrieve its data. - the alert type. - the ID of the hand whose alert should be retrieved. - the information for the fired event. true if the alert is fired, false otherwise. @see AlertType @see AlertData Return the number of gestures fired in the current frame. Get the details of the fired gesture with the given index. - the zero-based index of the requested fired gesture. - the information for the fired gesture. @note The gesture index must be between 0 and [QueryFiredGesturesNumber() - 1]. PXCM_STATUS_NO_ERROR - operation succeeded. PXCM_STATUS_PARAM_UNSUPPORTED - invalid input parameter. @see GestureData @see QueryFiredGesturesNumber Check whether a gesture was fired and if so return its details. - the name of the gesture to be checked. - the information for the fired gesture. true if the gesture was fired, false otherwise. @see GestureData Return whether the specified gesture is fired for a specific hand in the current frame, and if so retrieve its data. - the name of the gesture to be checked. - the ID of the hand whose gesture should be retrieved. - the information for the fired gesture. true if the gesture was fired, false otherwise. @see GestureData Return the number of hands detected in the current frame. Retrieve the given hand's uniqueId. - the order in which the hands are enumerated (accessed).
- the index of the hand to be retrieved, based on the given AccessOrder. - the hand's uniqueId. PXCM_STATUS_NO_ERROR - operation succeeded. PXCM_STATUS_PARAM_UNSUPPORTED - invalid parameter. @see AccessOrderType Retrieve the hand object data using a specific AccessOrder and related index. - the order in which the hands are enumerated (accessed). - the index of the hand to be retrieved, based on the given AccessOrder. - the information for the hand. PXCM_STATUS_NO_ERROR - operation succeeded. PXCM_STATUS_PARAM_UNSUPPORTED - index >= MAX_NUM_HANDS. PXCM_STATUS_DATA_UNAVAILABLE - index >= number of detected hands. @see AccessOrder @see IHand Retrieve the hand object data by its unique Id. - the unique ID of the requested hand. - the information for the hand. PXCM_STATUS_NO_ERROR - operation succeeded. PXCM_STATUS_DATA_UNAVAILABLE - there is no output hand data. PXCM_STATUS_PARAM_UNSUPPORTED - there is no hand data for the given hand ID. @see IHand @class IHand Contains all the properties of the hand that were calculated by the tracking algorithm. Return the hand's unique identifier. Return the identifier of the user whose hand is represented. Return the time-stamp at which the collection of the hand data was completed. Return true if there is a valid hand calibration, otherwise false. A valid calibration results in more accurate tracking data that is better fitted to the user's hand. After identifying a new hand, the hand module calculates its calibration. When calibration is complete, an alert is issued. Tracking is more robust for a calibrated hand. Return the side of the body to which the hand belongs (when known). @note This information is available only in full-hand tracking mode (TRACKING_MODE_FULL_HAND). @see PXCMHandConfiguration.SetTrackingMode Return the location and dimensions of the tracked hand, represented by a 2D bounding box (defined in pixels).  The location and dimensions of the 2D bounding box.
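The access-order enumeration above can be sketched as a per-frame loop, assuming handData is a PXCMHandData instance that has been updated for the current frame:

```csharp
// Sketch only: enumerate the hands detected in the current frame,
// nearest to the camera first.
int nHands = handData.QueryNumberOfHands();
for (int i = 0; i < nHands; i++)
{
    PXCMHandData.IHand hand;
    if (handData.QueryHandData(
            PXCMHandData.AccessOrderType.ACCESS_ORDER_NEAR_TO_FAR,
            i, out hand) == pxcmStatus.PXCM_STATUS_NO_ERROR)
    {
        Console.WriteLine("hand id=" + hand.QueryUniqueId()
                          + " side=" + hand.QueryBodySide());
    }
}
```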
Return the 2D center of mass of the hand in image space (in pixels). Return the 3D center of mass of the hand in world space (in meters). A quaternion representing the global 3D orientation of the palm. @note This information is available only in full-hand tracking mode (TRACKING_MODE_FULL_HAND). @see PXCMHandConfiguration.SetTrackingMode Return the degree of openness of the hand. The possible degree values range from 0 (all fingers completely folded) to 100 (all fingers fully spread). @note This information is available only in full-hand tracking mode (TRACKING_MODE_FULL_HAND). @see PXCMHandConfiguration.SetTrackingMode Return the palm radius in image space (number of pixels). The palm radius is the radius of the minimal circle that contains the hand's palm. Return the palm radius in world space (meters). The palm radius is the radius of the minimal circle that contains the hand's palm. Return the tracking status (a bit-mask of one or more TrackingStatusType enum values). @see TrackingStatusType Return the data of a specific extremity point. - the ID of the requested extremity point. - the location data of the requested extremity point. PXCM_STATUS_NO_ERROR - operation succeeded. @see ExtremityType @see ExtremityData Return the data of the requested finger. @note This information is available only in full-hand tracking mode (TRACKING_MODE_FULL_HAND). @see PXCMHandConfiguration.SetTrackingMode - the ID of the requested finger. - the tracking data of the requested finger. PXCM_STATUS_NO_ERROR - operation succeeded. @see FingerType @see FingerData Return the tracking data of a single hand joint. @note This information is available only in full-hand tracking mode (TRACKING_MODE_FULL_HAND), when tracked joints are enabled. @see PXCMHandConfiguration.SetTrackingMode @see PXCMHandConfiguration.EnableTrackedJoints - the ID of the requested joint. - the tracking data of the requested hand joint. PXCM_STATUS_NO_ERROR - operation succeeded.
@see JointType @see JointData Return the tracking data of a single normalized-hand joint. @note This information is available only in full-hand tracking mode, when the normalized skeleton is enabled. @see PXCMHandConfiguration.SetTrackingMode @see PXCMHandConfiguration.EnableNormalizedJoints - the ID of the requested joint. - the tracking data of the requested normalized-hand joint. PXCM_STATUS_NO_ERROR - operation succeeded. @see JointType @see JointData Retrieve the 2D image mask of the tracked hand. In the image mask, each pixel occupied by the hand is white (value of 255) and all other pixels are black (value of 0). @note This information is available only when the segmentation image is enabled. @see PXCMHandConfiguration.EnableSegmentationImage - the 2D image mask. PXCM_STATUS_NO_ERROR - operation succeeded. PXCM_STATUS_DATA_UNAVAILABLE - image mask is not available. Return true if tracked-joints data exists, false otherwise. @note This information is available only when full-hand tracking mode is enabled. @see PXCMHandConfiguration.SetTrackingMode @see PXCMHandConfiguration.EnableTrackedJoints Return true if normalized-joint data exists, false otherwise. @note This information is available only in full-hand tracking mode, when the normalized skeleton is enabled. @see PXCMHandConfiguration.SetTrackingMode @see PXCMHandConfiguration.EnableNormalizedJoints Return true if a hand segmentation image exists, false otherwise. @see PXCMHandConfiguration.EnableSegmentationImage Get the number of contour lines extracted (both external and internal). The number of contour lines extracted. Retrieve an IContour object using an index (that relates to the given order). - the zero-based index of the requested contour (between 0 and QueryNumberOfContours()-1). - contains the extracted contour line data. PXCM_STATUS_NO_ERROR - successful operation. PXCM_STATUS_DATA_UNAVAILABLE - index >= number of detected contours. @see IContour @class IContour An interface that provides access to the contour line data.
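Reading a single joint per the description above can be sketched as follows, assuming hand is an IHand obtained from QueryHandData and that full-hand mode with tracked joints is active:

```csharp
// Sketch only: read the wrist joint of a tracked hand.
if (hand.HasTrackedJoints())
{
    PXCMHandData.JointData joint;
    if (hand.QueryTrackedJoint(PXCMHandData.JointType.JOINT_WRIST,
            out joint) == pxcmStatus.PXCM_STATUS_NO_ERROR)
    {
        // positionWorld is in meters; positionImage is in pixels
        // (with depth in millimeters in its Z coordinate).
        Console.WriteLine("wrist: " + joint.positionWorld.x + ", "
            + joint.positionWorld.y + ", " + joint.positionWorld.z);
    }
}
```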
Get the point array representing a contour line. - the size of the array allocated for the contour points. - the contour points stored in the user-allocated array. PXCM_STATUS_NO_ERROR - successful operation. Return true for the blob's outer contour; false for inner contours. - the zero-based index of the requested contour. true for the blob's outer contour; false for inner contours. Get the contour size (number of points in the contour line). This is the size of the points array that you should allocate. - the zero-based index of the requested contour line. The contour size (number of points in the contour line). JointType Identifiers of joints that can be tracked by the hand module. ExtremityType Identifiers of extremity points of the tracked hand. The closest point to the camera in the tracked hand. FingerType Finger identifiers. Thumb finger. BodySideType The side of the body to which a hand belongs. @note Body sides are reported from the player's point-of-view, not the sensor's. The side was not determined. AlertType Identifiers for the events that can be detected and fired by the hand module. GestureStateType Enumerates the possible states of a gesture (start/in progress/end). @note Depending on the configuration, you can either get "start" and "end" events when the gesture starts/ends, or get the "in_progress" event for every frame in which the gesture is detected. See the "continuousGesture" flag in PXCMHandConfiguration.EnableGesture for more details. @see PXCMHandConfiguration.EnableGesture Gesture started - fired at the first frame where the gesture is identified. TrackingModeType Defines the possible tracking modes. TRACKING_MODE_FULL_HAND - enables full tracking of the hand skeleton, including all the joints' information. TRACKING_MODE_EXTREMITIES - tracks only the hand's mask and its extremity points. Track the full skeleton (22 joints). JointSpeedType Modes for calculating the joints' speed.
Average of signed speed values (which are positive or negative depending on direction) across time. TrackingStatusType Status values of hand tracking. In case of problematic tracking conditions, this value indicates the problem type. Optimal tracking conditions. AccessOrderType Orders in which the hands can be accessed. By unique ID of the hand. JointData A structure containing information about the position and rotation of a joint in the hand's skeleton. See the Hand Module Developer Guide for more details. RESERVED: for future confidence score feature. The geometric position in 3D world coordinates, in meters. The geometric position in 2D image coordinates, in pixels. (Note: the Z coordinate is the point's depth in millimeters.) A quaternion representing the local 3D orientation of the joint, relative to its parent joint. A quaternion representing the global 3D orientation, relative to the "world" y-axis. ExtremityData Defines the positions of an extremity point. 3D world coordinates of the extremity point. FingerData Defines the properties of a finger. The degree of "foldedness" of the tracked finger, ranging from 0 (least folded / straight) to 100 (most folded). AlertData Defines the properties of an alert event. The type of alert. The ID of the hand that triggered the alert, if relevant and known. The time-stamp at which the event occurred. GestureData Defines the properties of a gesture. The gestures in the default gesture package (installed with the hand module by default) are: Gestures that are available for TRACKING_MODE_FULL_HAND: "spreadfingers" - hand open facing the camera. "thumb_up" - hand closed with thumb pointing up. "thumb_down" - hand closed with thumb pointing down. "two_fingers_pinch_open" - hand open with thumb and index finger touching each other. "v_sign" - hand closed with index finger and middle finger pointing up. "fist" - all fingers folded into a fist.
The fist can be in different orientations as long as the palm is in the general direction of the camera. "full_pinch" - all fingers extended and touching the thumb. The pinched fingers can be anywhere between pointing directly to the screen or in profile. "tap" - a hand in a natural relaxed pose is moved forward as if pressing a button. "wave" - an open hand facing the screen. The wave gesture's length can be any number of repetitions. "click" - with the hand facing the camera, palm either open or closed, move the index finger quickly toward the palm center as if clicking a mouse. "swipe_down" - hold the hand towards the camera, move it down, and then return it toward the starting position. "swipe_up" - hold the hand towards the camera, move it up, and then return it toward the starting position. "swipe_right" - hold the hand towards the camera, move it right, and then return it toward the starting position. "swipe_left" - hold the hand towards the camera, move it left, and then return it toward the starting position. The time-stamp at which the gesture occurred. The ID of the hand that made the gesture, if relevant and known. The state of the gesture (start, in progress, end). The number of the frame in which the gesture occurred (relevant for recorded sequences). The gesture name. @class PXCMHandModule The main interface to the hand module's classes. Use this interface to access the hand module's configuration and output data. Create a new instance of the hand module's active configuration. Multiple configuration instances can be created in order to define different configurations for different stages of the application. You can switch between the configurations by calling the ApplyChanges method of the required configuration instance. A pointer to the configuration instance. @see PXCMHandConfiguration Create a new instance of the hand module's current output data. Multiple instances of the output can be created in order to store previous tracking states.
A pointer to the output data instance. @see PXCMHandData Enable and query the NativeTexturePlugin instance. Texture2Ds must be created with the correct PXCMImage.info.width, PXCMImage.info.height and TextureFormat.BGRA32. Only DX9, DX11 and OpenGL2 are supported. Color and depth are supported. NativeTexturePlugin instance. Texture2Ds passed need to be of the correct PXCMImage.info.width, PXCMImage.info.height and TextureFormat.BGRA32. Supports DX9, DX11 and OpenGL2. Color or depth PXCMImage. Retrieve using Texture2D.GetNativeTexturePtr(). @brief Return the image sample information. @return the image sample information in the ImageInfo structure. @brief Lock to access the internal storage of a specified format. The function will perform format conversion if unmatched. @param[in] access The access mode. @param[in] format The requested sample format. @param[in] options The option flags. @param[out] data The sample data storage, to be returned. @return PXCM_STATUS_NO_ERROR Successful execution. @brief Unlock the previously acquired buffer. @param[in] data The sample data storage previously acquired. @return PXCM_STATUS_NO_ERROR Successful execution. @brief Query rotation data. Convert a pixel format to its string representation. The pixel format. The string representation. Return the image sample time stamp. the time stamp, in 100ns. Return the image stream type. the stream type. Return the image option flags. the option flags. Set the sample time stamp. The time stamp value, in 100ns. Set the sample stream type. The sample stream type. Set the sample options. This function overrides any previously set options. The image options. Copy image data from another image sample. The image sample to copy data from. PXCM_STATUS_NO_ERROR Successful execution. Copy image data to the specified external buffer. The ImageData structure that describes the image buffer. Reserved. PXCM_STATUS_NO_ERROR Successful execution. Copy image data from the specified external buffer.
The ImageData structure that describes the image buffer. Reserved. PXCM_STATUS_NO_ERROR Successful execution. Lock to access the internal storage of a specified format. The access mode. The sample data storage, to be returned. PXCM_STATUS_NO_ERROR Successful execution. Lock to access the internal storage of a specified format. The function will perform format conversion if unmatched. The access mode. The requested sample format. The sample data storage, to be returned. PXCM_STATUS_NO_ERROR Successful execution. Lock to access the internal storage of a specified format. The function will perform format conversion if unmatched. The access mode. The requested sample format. The image rotation. The sample data storage, to be returned. PXCM_STATUS_NO_ERROR Successful execution. Increase a reference count of the sample. Query rotation data. A helper function to access the PXCMMetadata instance. ImageData Describes the image storage details. PixelFormat Describes the image sample pixel format. Rotation Image rotation options. ImageInfo Describes the image sample detailed information. Access Describes the image access mode. Option Describes the image options. Metadata types for feedback information attached to the current session. @brief Return the SDK version. @return the SDK version. @brief Search a module implementation. @param[in] templat The template for the module search. Zero field values match any. @param[in] idx The zero-based index to retrieve multiple matches. @param[out] desc The matched module descriptor, to be returned. @return PXCM_STATUS_NO_ERROR Successful execution. @return PXCM_STATUS_ITEM_UNAVAILABLE No matched module implementation. @brief Return the module descriptor. @param[in] module The module instance. @param[out] desc The module descriptor, to be returned. @return PXCM_STATUS_NO_ERROR Successful execution. @return PXCM_STATUS_ITEM_UNAVAILABLE Failed to identify the module instance. Create an instance of the specified module. The module descriptor.
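The image Lock/Unlock pairing described above (AcquireAccess/ReleaseAccess in the C# wrapper) can be sketched as follows, assuming image is a valid PXCMImage such as a color sample:

```csharp
// Sketch only: lock the image for reading, optionally converting to
// RGB32, then release the buffer.
PXCMImage.ImageData data;
pxcmStatus sts = image.AcquireAccess(
    PXCMImage.Access.ACCESS_READ,
    PXCMImage.PixelFormat.PIXEL_FORMAT_RGB32, // converted if unmatched
    out data);
if (sts == pxcmStatus.PXCM_STATUS_NO_ERROR)
{
    byte[] pixels = data.ToByteArray(0, data.pitches[0] * image.info.height);
    // ... use pixels ...
    image.ReleaseAccess(data); // always release the acquired buffer
}
```

Failing to call ReleaseAccess leaves the internal storage locked, so the acquire/release calls should always be paired.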
Optional module implementation identifier. Optional interface identifier. The created instance, to be returned. PXCM_STATUS_NO_ERROR Successful execution. PXCM_STATUS_ITEM_UNAVAILABLE No matched module implementation. Create an instance of the specified module. The module descriptor. Optional interface identifier. The created instance, to be returned. PXCM_STATUS_NO_ERROR Successful execution. PXCM_STATUS_ITEM_UNAVAILABLE No matched module implementation. Create an instance of the specified module. Optional module implementation identifier. Optional interface identifier. The created instance, to be returned. PXCM_STATUS_NO_ERROR Successful execution. PXCM_STATUS_ITEM_UNAVAILABLE No matched module implementation. Create an instance of the specified module. Optional interface identifier. The created instance, to be returned. PXCM_STATUS_NO_ERROR Successful execution. PXCM_STATUS_ITEM_UNAVAILABLE No matched module implementation. Create an instance of the specified module. The module descriptor. Optional module implementation identifier. The created instance, to be returned. PXCM_STATUS_NO_ERROR Successful execution. PXCM_STATUS_ITEM_UNAVAILABLE No matched module implementation. Create an instance of the specified module. The module descriptor. The created instance, to be returned. PXCM_STATUS_NO_ERROR Successful execution. PXCM_STATUS_ITEM_UNAVAILABLE No matched module implementation. Create an instance of the specified module. Optional module implementation identifier. The created instance, to be returned. PXCM_STATUS_NO_ERROR Successful execution. PXCM_STATUS_ITEM_UNAVAILABLE No matched module implementation. Create an instance of the specified module. The created instance, to be returned. PXCM_STATUS_NO_ERROR Successful execution. PXCM_STATUS_ITEM_UNAVAILABLE No matched module implementation. Create an instance of the PXCMSenseManager interface. The PXCMSenseManager instance. Create an instance of the PXCMCaptureManager interface. The PXCMCaptureManager instance. 
Create an instance of the PXCMAudioSource interface. The PXCMAudioSource instance. Create an instance of the PXCMImage interface with data. The application must maintain the life cycle of the image data for the PXCMImage instance. The format and resolution of the image. Optional image data. The PXCMImage instance. Create an instance of the PXCMImage interface. The format and resolution of the image. The PXCMImage instance. Create an instance of the PXCMPhoto interface. The PXCMPhoto instance. Create an instance of the PXCMAudio interface with data. The application must maintain the life cycle of the audio data for the PXCMAudio instance. The audio channel information. Optional audio data. The PXCMAudio instance. Create an instance of the PXCMAudio interface. The audio channel information. The PXCMAudio instance. Create an instance of the power manager. A PXCMPowerState instance. Load the module from a file. The module file name. PXCM_STATUS_NO_ERROR Successful execution. Unload the specified module. The module file name. PXCM_STATUS_NO_ERROR Successful execution. Set the camera coordinate system. The coordinate system. PXCM_STATUS_NO_ERROR Successful execution. Return the current camera coordinate system (bit-mask of coordinate systems for front and rear cameras). The coordinate system. PXCM_STATUS_NO_ERROR Successful execution. Create an instance of the PXCMSession interface. The PXCMSession instance, to be returned. PXCM_STATUS_NO_ERROR Successful execution. A helper function to access the PXCMMetadata instance. ImplVersion Describes the video streams requested by a module implementation. ImplGroup The SDK groups I/O and algorithm modules into groups and subgroups. This is the enumerator for algorithm groups. ImplSubgroup The SDK groups I/O and algorithm modules into groups and subgroups. This is the enumerator for algorithm subgroups. CoordinateSystem The SDK supports several 3D coordinate systems for front and rear facing cameras.
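The session factory methods above can be sketched as follows; the 640x480 RGB32 values are illustrative assumptions, not requirements:

```csharp
// Sketch only: create a session and let the SDK allocate an image
// buffer for the given format and resolution.
PXCMSession session = PXCMSession.CreateInstance();
if (session != null)
{
    PXCMImage.ImageInfo info = new PXCMImage.ImageInfo
    {
        width = 640,   // illustrative resolution
        height = 480,
        format = PXCMImage.PixelFormat.PIXEL_FORMAT_RGB32,
    };
    PXCMImage image = session.CreateImage(info); // SDK-managed buffer
    // ... use image ...
    image.Dispose();
    session.Dispose();
}
```

The overload that also takes image data leaves buffer ownership with the application, per the life-cycle note above.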
ImplDesc The module descriptor lists details about the module implementation. The function returns a unique identifier for the metadata storage. the unique identifier. The function retrieves the identifiers of all available metadata. The zero-based index to retrieve all identifiers. the metadata identifier, or zero if not available. The function detaches the specified metadata. The metadata identifier. PXCM_STATUS_NO_ERROR Successful execution. PXCM_STATUS_ITEM_UNAVAILABLE The metadata is not found. The function attaches the specified metadata. The metadata identifier. The metadata buffer. The metadata buffer size, in bytes. PXCM_STATUS_NO_ERROR Successful execution. The function returns the specified metadata buffer size. The metadata identifier. the metadata buffer size, or zero if the metadata is not available. The function retrieves the specified metadata. The metadata identifier. The buffer to retrieve the metadata. PXCM_STATUS_NO_ERROR Successful execution. The function attaches an instance of a serializable interface to the metadata storage. The metadata identifier. The serializable instance. PXCM_STATUS_NO_ERROR Successful execution. The function creates an instance of a serializable interface from the metadata storage. The metadata identifier. The interface identifier. The serializable instance, to be returned. PXCM_STATUS_NO_ERROR Successful execution. The function creates an instance of a serializable interface from the metadata storage. The metadata identifier. The serializable instance, to be returned. PXCM_STATUS_NO_ERROR Successful execution. @class PXCMObjectRecognitionConfiguration @brief Retrieve the current configuration of the ObjectRecognition module and set new configuration values. @note Changes to PXCMObjectRecognitionConfiguration are applied only when ApplyChanges() is called. @brief Apply the configuration changes to the module. This method must be called in order for any configuration changes to apply.
@return PXCM_STATUS_NO_ERROR - successful operation. @return PXCM_STATUS_DATA_NOT_INITIALIZED - the configuration was not initialized. @brief Restore configuration settings to the default values. @return PXCM_STATUS_NO_ERROR - successful operation. @return PXCM_STATUS_DATA_NOT_INITIALIZED - the configuration was not initialized. @brief Return the current active recognition configuration. @return RecognitionConfig - the current configuration. @brief Return the number of classes supported by the recognition configuration. @brief Set the active recognition configuration. @param[in] rConfig - struct with the desired confidence and mode of recognition. @return PXCM_STATUS_NO_ERROR - operation succeeded. @see PXCRecognitionConfig @brief Return the active classifier model. @return the active classifier name. @brief Set the active classifier model. The number of outputs must correspond to the number of outputs supported by the specified model. @param[in] configFilePath - relative path to the classifier model file from the RSSDK data path. @return PXCM_STATUS_NO_ERROR - operation succeeded. @brief Enable or disable the segmented image feature. @param[in] enable - boolean value to enable/disable the segmentation. @return PXCM_STATUS_NO_ERROR - operation succeeded. @brief Get the current state of the segmented image feature. @brief Get the current state of the absolute ROI feature. @brief Add an ROI to the classification. This ROI is absolute and the image will be cropped to it. @param[in] roi - ROI rectangle to be added. @return PXCM_STATUS_NO_ERROR - operation succeeded. @brief Add an ROI to the classification. This ROI is absolute and the image will be cropped to it. @param[in] roi - ROI rectangle to be added. @return PXCM_STATUS_NO_ERROR - operation succeeded. @brief Return the ROI. @return the current ROI. @brief Return the absolute ROI. @return the current absolute ROI. @brief Query the current localization mechanism. @return the current localization mechanism.
@brief Set the active localization mechanism. The localization mechanism is activated only if the LOCALIZATION, LOCALIZATION_AND_TRACKING or PROPOSAL_ONLY mode has been selected. @param[in] lm - the selected localization method. @see LocalizationMechanism enum. @return PXCM_STATUS_NO_ERROR - operation succeeded. @brief Remove all ROIs from the classification and set the ROI to the whole image. @return PXCM_STATUS_NO_ERROR - operation succeeded. @brief Set or unset the absolute ROI feature. @param[in] enable - boolean value to enable/disable the absolute ROI. @brief Set the initial ROIs to be tracked by the tracking module. Enabled only when tracking is on and all other modes are off. The number of rectangles will be between 1 and 5. @param[in] rois - pointer to the rectangles tracked by the tracker module. @param[in] nRois - the number of rectangles tracked by the tracker module. @return the status of the operation. @class PXCMObjectRecognitionData A class that defines a standard data interface for object recognition algorithms. Updates the object recognition data to the most current output. PXCM_STATUS_NO_ERROR - successful operation. PXCM_STATUS_DATA_NOT_INITIALIZED - the ObjectRecognitionData is not available. Get the number of recognized objects in the current frame. @note The number of recognized objects is between 0 and the number of classes multiplied by the number of ROIs. The function returns the sum of all objects at all ROIs up to the threshold specified by the configuration class. the number of successfully recognized objects. Return a PXCMImage instance of the segmented image of the entire frame. - the rectangle representing the ROI of the specified object index. - the segmented image, as output. PXCM_STATUS_NO_ERROR - operation succeeded. @see PXCMImage Return the object data of the specified index. @note the index is between 0 and QueryNumberOfRecognizedObjects() in the current frame.
@see PXCMObjectRecognitionData.QueryNumberOfRecognizedObjects() - the index of recognition. - the index of probability for the recognized object. - data structure filled with the data of the recognized object. PXCM_STATUS_NO_ERROR - operation succeeded. PXCM_STATUS_PARAM_UNSUPPORTED on error. Return the object data of the specified index with the highest probability. @note the index is between 0 and QueryNumberOfRecognizedObjects() in the current frame. @see PXCMObjectRecognitionData.QueryNumberOfRecognizedObjects() - the index of recognition. - data structure filled with the data of the recognized object. PXCM_STATUS_NO_ERROR - operation succeeded. PXCM_STATUS_PARAM_UNSUPPORTED on error. Return the name of the object by its recognized label. - the label of the recognized object. the corresponding name of the label. @class PXCMObjectRecognition Defines the PXCMObjectRecognition interface, which programs may use to process snapshots of captured frames to recognize pre-trained objects. Create a new instance of the OR module's active configuration. Multiple configuration instances can be created in order to define different configurations for different stages of the application. The configurations can be switched by calling the ApplyChanges method of the required configuration instance. an object of the configuration instance. @see PXCMObjectRecognitionConfiguration Create a new instance of the object recognition module's current output data. Multiple instances of the output can be created in order to store previous tracking states. an object of the output data instance. @see PXCMObjectRecognitionData @brief Sets the range of user angles to be tracked. @brief Enable alert messaging for a specific event. @brief Enable all alert messaging events. @brief Test the activation status of the given alert. @brief Disable alert messaging for a specific event. @brief Disable messaging for all alerts. @brief Register an event handler object for the alerts. @brief Unsubscribe an alert handler object.
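The per-frame recognition query above can be sketched roughly as follows; the method and type names (QueryRecognizedObjectData, RecognizedObjectData, QueryObjectNameByID) are assumptions inferred from the descriptions in this section and should be checked against the actual wrapper:

```csharp
// Sketch only, assuming orData is a PXCMObjectRecognitionData instance:
// list the recognized objects and their class names for this frame.
if (orData.Update() == pxcmStatus.PXCM_STATUS_NO_ERROR)
{
    int n = orData.QueryNumberOfRecognizedObjects();
    for (int i = 0; i < n; i++)
    {
        PXCMObjectRecognitionData.RecognizedObjectData obj;
        if (orData.QueryRecognizedObjectData(i, out obj)
                == pxcmStatus.PXCM_STATUS_NO_ERROR)
        {
            // Map the recognized label to its human-readable name.
            Console.WriteLine(orData.QueryObjectNameByID(obj.label));
        }
    }
}
```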
Return the number of persons detected in the current frame. Retrieve the person object data using a specific AccessOrder and related index. Retrieve the person object data by its unique Id. Adds the person to the list of tracked people starting from the next frame. Removes the person from tracking. Retrieve the current tracking state of the Person Tracking module. Returns the Person Recognition module interface. Return the number of fired alerts in the current frame. Get the details of the fired alert with the given index. Return whether the specified alert is fired in the current frame, and retrieve its data if it is. Get the person's position data describes the person's position is the algorithm's confidence in the determined position Get the direction a person is leaning towards describes where the person is leaning to in terms of yaw, pitch, and roll is the algorithm's confidence in the determined orientation Position Describes the position of the person Register a user in the Recognition database. The unique user ID assigned to the registered recognition by the Recognition module. Removes a user from the Recognition database. Checks if a user is registered in the Recognition database. true - if the user is in the database, false otherwise. Returns the ID assigned to the current recognition by the Recognition module. The ID assigned by the Recognition module, or -1 if the person was not recognized. Retrieves the size of the recognition database so that the user can allocate a database buffer of the correct size. The size of the database in bytes. Copies the recognition database buffer to the user, allowing the user to store it for later usage. A user-allocated buffer to copy the database into. The user must make sure the buffer is large enough (can be determined by calling QueryDatabaseSize()). true if the database has been successfully copied to db, false otherwise. 
Unregisters a user from the database by user ID ID of the user to unregister Returns the number of tracked joints Retrieves all joints the joints' locations are copied into this array. The application is expected to allocate this array (size retrieved from QueryNumJoints()) Returns true if data and parameters exist, false otherwise. Returns the number of tracked bones Retrieves all bones the bones' locations are copied into this array. The application is expected to allocate this array (size retrieved from QueryNumBones()) Returns true if data and parameters exist, false otherwise. Position Describes the position of the person Return the person's unique identifier. Return the location and dimensions of the tracked person, represented by a 2D bounding box (defined in pixels). Retrieve the 2D image mask of the tracked person. Retrieves the center mass of the tracked person The center mass of the tracked person in world coordinates The confidence of the calculated center mass location Return the location and dimensions of the tracked person's head, represented by a 2D bounding box (defined in pixels). Return the location and dimensions of the tracked person, represented by a 3D bounding box. Return the speed of the person in 3D world coordinates the direction of the movement the magnitude of the movement in meters/second Get the number of pixels in the blob Retrieves the 3D blob of the tracked person The array of 3D points to which the blob will be copied. Must be allocated by the application Get the contour size (number of points in the contour) Get the data of the contour line Returns the Person Detection interface Returns the Person Recognition interface Returns the Person Joints interface Returns the Person Pose interface AlertType Identifiers for the events that can be detected and fired by the person module. AccessOrderType Orders in which the person can be accessed. 
By unique ID of the person From oldest to newest person in the scene From nearest to farthest person in the scene TrackingState The current state of the module, either tracking specific people or performing full detection The type of alert The ID of the person that triggered the alert, if relevant and known The time stamp at which the event occurred Create a new copy of the active configuration. Create a placeholder for output. Import the preview sample content into the photo instance. The PXCMCapture.Sample instance from the SenseManager QuerySample(). PXCM_STATUS_NO_ERROR Successful execution. Check if a file is an XDM file or not. The file name. true if the file is XDM and false otherwise. Import the photo content from the XDM File Format v2.0. The file name. subsampling rate. PXCM_STATUS_NO_ERROR Successful execution. Export the photo content to the XDM File Format v2.0. The file name. removeOriginalImage Flag to indicate whether to remove the original image from the XDM photo if the container image is processed. This will reduce the XDM photo size. True = Removes the original image from the XDM photo. False (Default) = Keeps the original image if the container image is processed. PXCM_STATUS_NO_ERROR Successful execution. Copy the content from the source photo The source photo. PXCM_STATUS_NO_ERROR Successful execution. Get the reference image of the photo. The reference image is usually the processed color image. The PXCMImage instance. Copy the camera[0] color image to the container image of the photo. Get the color image in camera[camIdx] of the photo. The unedited image is usually the unprocessed color image in camera[0]. The PXCMImage instance. Get the original image of the photo. The original image is usually the unprocessed color image. The PXCMImage instance. Get the raw depth image of the photo. This would be the unprocessed depth captured from the camera or loaded from a file if it existed. The PXCMImage instance. Get the depth map in camera[camIdx] of the photo. 
The depth map in camera[0] is the hole-filled depth. The PXCMImage instance. Get the depth image of the photo. This would be the processed depth if it undergoes processing. The PXCMImage instance. Get the device revision. The revision of the XDM spec, e.g. "1.0". Changes to a minor version number ("1.1", "1.2", etc.) do not break compatibility within that major version. See the section on Versioning, under Schema, for more on this. Note that this field is informational; actual compatibility depends on namespace versions. nchars input size of the requested buffer for overrun safety The Revision string. Get the device vendor info. The VendorInfo struct. Get the camera[camIdx] vendor info. The VendorInfo struct. Get the number of cameras in the device. The number of cameras. Get the camera[camIdx] pose = Translation and rotation. The translation (x,y,z): for the first camera, this is 0. For additional cameras, this is relative to the first camera. For the rotation (x,y,z,w): writers of this format should make an effort to normalize [x,y,z], but readers should not expect the rotation axis to be normalized. trans.x = x position in meters. trans.y = y position in meters. trans.z = z position in meters. rot.x = x component of the rotation vector in the axis-angle representation. rot.y = y component of the rotation vector in the axis-angle representation. rot.z = z component of the rotation vector in the axis-angle representation. rot.w = w rotation angle in radians in the axis-angle representation. Get the PerspectiveCameraModel of camera[camIdx]. The PerspectiveCameraModel This checks that the signature of the container image did not change. Uses Adler-32 for the signature. Returns false if the signature changed and true otherwise. constructors and misc Increase a reference count of the sample. SubSample: Subsampling rate to load High Res images faster NO_SUBSAMPLING = 0 SUBSAMPLE_2 = 2 SUBSAMPLE_4 = 4 SUBSAMPLE_8 = 8 VendorInfo PerspectiveModel. 
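The Adler-32 signature check described above can be sketched independently of the SDK. A minimal Python illustration, assuming the checksum is taken over the raw image bytes (the helper names here are invented, not SDK API):

```python
import zlib

def compute_signature(pixel_bytes: bytes) -> int:
    """Compute an Adler-32 checksum over an image buffer."""
    return zlib.adler32(pixel_bytes) & 0xFFFFFFFF

def signature_unchanged(pixel_bytes: bytes, stored_signature: int) -> bool:
    """Return True if the buffer still matches its stored signature."""
    return compute_signature(pixel_bytes) == stored_signature

original = bytes(range(256))
sig = compute_signature(original)
assert signature_unchanged(original, sig)                      # untouched buffer passes
assert not signature_unchanged(original[:-1] + b"\x00", sig)   # edited buffer fails
```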
@class PXCMPointConverter A utility for converting 2D/3D data points from a defined source to a defined target. The class provides setters for both source and target rectangle/box. The class provides getters for 2D/3D converted points. 2D Rectangle - A data structure that represents a 2D rectangle 3D Box - A data structure representing a "box" in 3D space (a 3D cube). Example: convert from hand-joint image coordinates to user-defined screen area coordinates. Example: convert from face-landmark 3D position to a 3D game coordinate system. Set 2D image rectangle source containing desired source rectangle dimensions PXCM_STATUS_VALUE_OUT_OF_RANGE if rectangle2D w, h params less than or equal to 0 PXCM_STATUS_NO_ERROR if rectangle2D w,h params > 0 Set 2D target rectangle containing desired target rectangle dimensions PXCM_STATUS_VALUE_OUT_OF_RANGE if rectangle2D w,h params less than or equal to 0 PXCM_STATUS_NO_ERROR if rectangle2D w,h params > 0 Get converted 2D point from source to target converted 2D point @note Call handData.Update() before calling this function in order to get the converted point based on updated frame data. Converted 2D Point Set 3D point to be converted @note only relevant when using CreateCustomPointConverter PXCM_STATUS_ITEM_UNAVAILABLE if not using CustomPointConverter inverting x,y axis Use if you wish to invert the converted point axis. 
@example Use Invert2DAxis(false, true) if the user app's y axis is facing up invert x axis invert y axis PXCM_STATUS_NO_ERROR Set 3D world box source containing desired source world box dimensions PXCM_STATUS_VALUE_OUT_OF_RANGE if any of box3D dimension params less than or equal to 0 PXCM_STATUS_NO_ERROR if box3D dimension params > 0 Set 3D target box containing desired target box dimensions PXCM_STATUS_VALUE_OUT_OF_RANGE if any of box3D dimension params less than or equal to 0 PXCM_STATUS_NO_ERROR if box3D dimension params > 0 Set 2D point to be converted @note only relevant when using CreateCustomPointConverter PXCM_STATUS_ITEM_UNAVAILABLE if not using CustomPointConverter Get converted 3D point from source to target @note Call handData.Update() before calling this function in order to get the converted point based on updated frame data. Converted 3D Point inverting x,y,z axis Use if you wish to invert the converted point axis. @example Use Invert3DAxis(false, true, false) if the user app's y axis is facing down invert x axis invert y axis invert z axis PXCM_STATUS_NO_ERROR @class PXCMPointConverterFactory Factory class for creating module-based point converters Create hand joint data PointConverter for PXCMHandModule The converter will convert the joint position to the target rectangle/3D box based on the requested hand. @note Make sure the handData is constantly updated throughout the session. @example pointConverter.CreateHandJointConverter(handData,PXCMHandData.ACCESS_ORDER_BY_TIME,0,PXCMHandData.JOINT_WRIST); a pointer to PXCMHandData The desired hand access order hand index desired joint type to be converted an object of the created PointConverter, or null in case of illegal arguments Create hand Extremity data PointConverter for PXCMHandModule The converter will convert extremity data to the target rectangle/3D box based on the requested hand. @note Make sure the handData is constantly updated throughout the session. 
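The source-to-target rectangle conversion and the axis inversion described above amount to normalizing the point within the source rectangle and rescaling it into the target. A minimal Python sketch of that math (not SDK code; the function and parameter names are invented):

```python
def convert_2d(point, src, dst, invert_x=False, invert_y=False):
    """Map (x, y) from a source rect to a target rect; rects are (x, y, w, h)."""
    sx, sy, sw, sh = src
    dx, dy, dw, dh = dst
    u = (point[0] - sx) / sw          # normalized 0..1 within the source
    v = (point[1] - sy) / sh
    if invert_x:
        u = 1.0 - u                   # flip horizontally, as Invert2DAxis would
    if invert_y:
        v = 1.0 - v                   # flip vertically
    return (dx + u * dw, dy + v * dh)

# center of a 640x480 source maps to the center of a 1920x1080 target
assert convert_2d((320, 240), (0, 0, 640, 480), (0, 0, 1920, 1080)) == (960.0, 540.0)
```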
@example pointConverter.CreateHandExtremityConverter(handData,PXCMHandData.ACCESS_ORDER_BY_TIME,0,PXCMHandData.EXTREMITY_CENTER); a pointer to PXCMHandData The desired hand access order hand index desired extremity type to be converted an object of the created PointConverter, or null in case of illegal arguments Create blob data PointConverter for PXCMBlobModule The converter will convert extremity data to the target rectangle/3D box based on the requested access order and index. @note Make sure the blobData is constantly updated throughout the session. @example pointConverter.CreateBlobPointConverter(blobData,PXCMBlobData.ACCESS_ORDER_BY_TIME,0,PXCMBlobData.EXTREMITY_CENTER); a pointer to PXCMBlobData The desired blob access order blob index desired extremity point to be converted an object of the created PointConverter, or null in case of illegal arguments Create custom PointConverter The converter will convert any data point to the target rectangle/3D box @note make sure to call Set2DPoint or Set3DPoint @example pointConverter.CreateCustomPointConverter(); PXCMPointF32 point = {22.f,40.f}; pointConverter.Set2DPoint(point); pointConverter.GetConverted2DPoint(); pointer to the created PointConverter Query the current power state of the device, returns the maximal used state Try to set the power state of all used devices, all streams; the application should call QueryStream to check if the desired state was set Sets the inactivity interval Returns the inactivity interval constructors and misc Map depth coordinates to color coordinates for a few pixels. The array of depth coordinates + depth value in the PXCMPoint3DF32 structure. The array of color coordinates, to be returned. PXCM_STATUS_NO_ERROR Successful execution. Map color coordinates to depth coordinates for a few pixels. The depthmap image. The array of color coordinates. The array of depth coordinates, to be returned. PXCM_STATUS_NO_ERROR Successful execution. Map depth coordinates to world coordinates for a few pixels. 
The array of depth coordinates + depth value in the PXCMPoint3DF32 structure. The array of world coordinates, in mm, to be returned. PXCM_STATUS_NO_ERROR Successful execution. Map color pixel coordinates to camera coordinates for a few pixels. The array of color coordinates + depth value in the PXCMPoint3DF32 structure. The array of camera coordinates, in mm, to be returned. PXCM_STATUS_NO_ERROR Successful execution. Map camera coordinates to depth coordinates for a few pixels. The array of world coordinates, in mm. The array of depth coordinates, to be returned. PXCM_STATUS_NO_ERROR Successful execution. Map camera coordinates to color coordinates for a few pixels. The array of world coordinates, in mm. The array of color coordinates, to be returned. PXCM_STATUS_NO_ERROR Successful execution. Retrieve the UV map for the specific depth image. The UVMap is a PXCMPointF32 array of depth size width*height. The depth image instance. The UV map, to be returned. PXCM_STATUS_NO_ERROR Successful execution. Retrieve the inverse UV map for the specific depth image. The inverse UV map maps color coordinates back to the depth coordinates. The inverse UVMap is a PXCMPointF32 array of color size width*height. The depth image instance. The inverse UV map, to be returned. PXCM_STATUS_NO_ERROR Successful execution. Retrieve the vertices for the specific depth image. The vertices buffer is a PXCMPoint3DF32 array of depth size width*height. The world coordinate units are in mm. The depth image instance. The vertices, to be returned. PXCM_STATUS_NO_ERROR Successful execution. Get the color pixel for every depth pixel using the UV map, and output a color image, aligned in space and resolution to the depth image. The depth image instance. The color image instance. The output image in the depth image resolution. Map every depth pixel to the color image resolution using the UV map, and output an incomplete depth image (with holes), aligned in space and resolution to the color image. 
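The UV map lookup described above (one normalized (u, v) pair per depth pixel, scaled by the color image size) can be sketched as plain indexing code. This is an illustration, not SDK code; treating negative UV values as "no mapping" is an assumption:

```python
def map_depth_to_color(i, j, depth_w, uvmap, color_w, color_h):
    """Look up the color pixel for depth pixel (i, j) using a normalized UV map.

    uvmap is a flat list of (u, v) pairs, one per depth pixel, row-major,
    with u and v in [0, 1] relative to the color image size.
    """
    u, v = uvmap[j * depth_w + i]       # one (u, v) pair per depth pixel
    if u < 0 or v < 0:                  # assumed marker for an invalid mapping
        return None
    return (int(u * color_w), int(v * color_h))

uvmap = [(0.5, 0.5)] * (4 * 4)          # toy 4x4 depth image; every pixel maps to center
assert map_depth_to_color(1, 2, 4, uvmap, 1920, 1080) == (960, 540)
```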
The depth image instance. The color image instance. The output image in the color image resolution. get rotation in Euler angles representation. Euler angles are a 3D point that represents rotation in 3D. Each variable is the angle of rotation around a certain axis (x/y/z). the order in which we get the Euler angles (ROLL_PITCH_YAW as default) 3D point containing Euler angles (RADIANS) in the given order. get rotation in quaternion representation. Quaternion is a 4D point that represents rotation in 3D. 4D point containing a quaternion representation (w,x,y,z). get rotation matrix representation. @param rotation matrix - 3x3 float array, containing the rotation matrix get rotation in angle-axis representation. angle-axis represents rotation (angle in RADIANS) around an axis AngleAxis struct containing an axis and angle of rotation around this axis. get roll - angle of rotation around the z axis using ROLL_PITCH_YAW eulerOrder. roll - angle of rotation around the z axis get pitch - angle of rotation around the x axis using ROLL_PITCH_YAW eulerOrder. pitch - angle of rotation around the x axis get yaw - angle of rotation around the y axis using ROLL_PITCH_YAW eulerOrder. yaw - angle of rotation around the y axis Set rotation as a concatenation of the current rotation and the given Rotation. - the given rotation Get the rotated vector according to the current rotation. - the vector we want to rotate rotated vector according to the current rotation. Set rotation based on a quaternion. Quaternion is a 4D point that represents rotation in 3D. rotation in quaternion representation. Set rotation based on Euler angles representation. Euler angles are a 3D point that represents rotation in 3D. Each variable is the angle of rotation around a certain axis (x/y/z). the rotation in Euler angles representation. the order in which we set the rotation (ROLL_PITCH_YAW as default). Set rotation based on a 3x3 rotation matrix. 
Note that only rotation (not scale or translation) is taken into account from the rotation matrix. That is, two matrices with the same rotation will yield the same Rotation instance regardless of their inequality. rotation in rotation matrix representation. Set rotation based on a rotation angle (RADIANS) around an axis. angle-axis represents rotation (angle in RADIANS) around an axis rotation angle (RADIANS). rotation around this axis. Set rotation from spherical linear interpolation between two rotations. - start rotation - end rotation - interpolation factor constructors and misc @class AngleAxis Rotation in Angle-Axis representation. Based on a rotation angle (RADIANS) around an axis EulerOrder EulerOrder indicates the order in which to get the Euler angles. This order matters. (ROLL_PITCH_YAW != ROLL_YAW_PITCH) Roll, Pitch and Yaw are the angles of rotation around the x, y and z axis accordingly. Instance of this interface class can be created using PXCMScenePerception.CreatePXCSurfaceVoxelsData(...). The ExportSurfaceVoxels function fills the data buffer. It is the client's responsibility to explicitly release the memory by calling Dispose() on PXCMSurfaceVoxelsData. Returns the number of surface voxels present in the buffer. This function is expected to be used after a successful call to ExportSurfaceVoxels(). Returns an array of centers of the surface voxels extracted by ExportSurfaceVoxels. This function is expected to be used after a successful call to ExportSurfaceVoxels(). Valid range is [0, 3*QueryNumberOfSurfaceVoxels()). Returns an array of colors with length 3*QueryNumberOfSurfaceVoxels(). Three color channels (RGB) per voxel. This function will return null, if PXCMSurfaceVoxelsData was created using PXCMScenePerception.CreatePXCMSurfaceVoxelsData with bUseColor set to false. Sets the number of surface voxels to 0; however, it doesn't release memory. 
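The rotation representations above (axis-angle, quaternion, rotating a vector) are standard math and can be illustrated without the SDK. A minimal Python sketch under that assumption (the names here are not SDK API):

```python
import math

def axis_angle_to_quaternion(axis, angle):
    """Unit quaternion (w, x, y, z) for a rotation of `angle` radians about `axis`."""
    n = math.sqrt(sum(a * a for a in axis))
    x, y, z = (a / n for a in axis)          # normalize the axis
    s = math.sin(angle / 2.0)
    return (math.cos(angle / 2.0), x * s, y * s, z * s)

def rotate_vector(q, v):
    """Rotate v by unit quaternion q: v' = v + w*t + q_xyz x t, with t = 2*q_xyz x v."""
    w, x, y, z = q
    tx, ty, tz = 2 * (y * v[2] - z * v[1]), 2 * (z * v[0] - x * v[2]), 2 * (x * v[1] - y * v[0])
    return (v[0] + w * tx + (y * tz - z * ty),
            v[1] + w * ty + (z * tx - x * tz),
            v[2] + w * tz + (x * ty - y * tx))

# a 90-degree rotation about the z axis maps +x to +y
q = axis_angle_to_quaternion((0, 0, 1), math.pi / 2)
rx, ry, rz = rotate_vector(q, (1.0, 0.0, 0.0))
assert abs(rx) < 1e-9 and abs(ry - 1.0) < 1e-9 and abs(rz) < 1e-9
```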
It should be used when you reset scene perception using PXCMScenePerception.Reset(); the client should Reset PXCMSurfaceVoxelsData whenever scene perception is Reset, to stay in sync with scene perception. An instance of this interface can be created using the PXCMScenePerception.CreatePXCMBlockMeshingData method. The DoMeshingUpdate function fills all the buffers with data. It is the client's responsibility to explicitly release the memory by calling Dispose() on PXCMBlockMeshingData. Returns the number of PXCMBlockMesh objects present inside the buffer returned by QueryBlockMeshes(). This function is expected to be used after a successful call to DoMeshingUpdate(...). Returns the number of vertices present in the buffer returned by QueryVertices(). This function is expected to be used after a successful call to DoMeshingUpdate(...). Returns the number of faces in the buffer returned by QueryFaces(). This function is expected to be used after a successful call to DoMeshingUpdate(...). Returns the maximum number of PXCMBlockMesh that can be returned by DoMeshingUpdate. This value remains the same throughout the lifetime of the instance. Returns the maximum number of vertices that can be returned by PXCMBlockMeshingData. This value remains the same throughout the lifetime of the instance. Returns the maximum number of faces that can be returned by PXCMBlockMeshingData. This value remains the same throughout the lifetime of the instance. Returns an array of PXCMBlockMesh objects with length same as QueryNumberOfBlockMeshes(). Returns an array of PXCMBlockMesh objects with length same as QueryNumberOfBlockMeshes(). Returns an array of float points with length 4*QueryNumberOfVertices() Each vertex consists of 4 float points: (x, y, z) coordinates in meter unit + a confidence value. The confidence value is in the range [0, 1] indicating how confident scene perception is about the presence of the vertex. 
Returns an array of float points with length 4*QueryNumberOfVertices() Each vertex consists of 4 float points: (x, y, z) coordinates in meter unit + a confidence value. The confidence value is in the range [0, 1] indicating how confident scene perception is about the presence of the vertex. Returns an array of colors with length 3*QueryNumberOfVertices(). Three color channels (RGB) per vertex. This function will return null, if PXCMBlockMeshingData was created using PXCMScenePerception.CreatePXCMBlockMeshingData(...) with bUseColor set to false. Returns an array of colors with length 3*QueryNumberOfVertices(). Three color channels (RGB) per vertex. This function will return null, if PXCMBlockMeshingData was created using PXCMScenePerception.CreatePXCMBlockMeshingData(...) with bUseColor set to false. Returns an array of faces forming the mesh (3 Int32 indices per triangle); the valid range is [0, 3*QueryNumberOfFaces()). Returns an array of faces forming the mesh (3 Int32 indices per triangle); the valid range is [0, 3*QueryNumberOfFaces()). Sets the number of BlockMeshes, vertices and faces to 0; however, it doesn't release memory. It should be used when you reset scene perception using PXCMScenePerception.Reset(). The client should Reset PXCMBlockMeshingData whenever scene perception is reset, to stay in sync with scene perception. Describes each BlockMesh present inside the list returned by QueryBlockMeshes(). SetVoxelResolution sets the volume resolution for scene perception. The VoxelResolution is locked when PXCMSenseManager.Init() is called. Afterwards the value of VoxelResolution remains the same throughout the lifetime of PXCMSenseManager. The default value of voxel resolution is LOW_RESOLUTION. Resolution of the three dimensional reconstruction. 
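The buffer layouts described above (4 floats per vertex, the fourth being a confidence value, and 3 Int32 indices per triangle) can be sketched as plain unpacking code. An illustrative Python sketch, not SDK API:

```python
def unpack_vertices(vertex_buffer):
    """Split a flat [x, y, z, confidence, ...] buffer into per-vertex tuples."""
    assert len(vertex_buffer) % 4 == 0, "4 floats per vertex"
    return [tuple(vertex_buffer[i:i + 4]) for i in range(0, len(vertex_buffer), 4)]

def triangles(face_buffer):
    """Group a flat index buffer into (i0, i1, i2) triangles."""
    assert len(face_buffer) % 3 == 0, "3 indices per triangle"
    return [tuple(face_buffer[i:i + 3]) for i in range(0, len(face_buffer), 3)]

# three vertices (meters + confidence) and one triangle over them
verts = unpack_vertices([0.0, 0.0, 0.0, 1.0,  1.0, 0.0, 0.0, 0.9,  0.0, 1.0, 0.0, 0.5])
assert len(verts) == 3 and verts[1] == (1.0, 0.0, 0.0, 0.9)
assert triangles([0, 1, 2]) == [(0, 1, 2)]
```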
Possible values are: LOW_RESOLUTION: For a room-sized scenario (4/256m) MED_RESOLUTION: For a table-top-sized scenario (2/256m) HIGH_RESOLUTION: For an object-sized scenario (1/256m) Choosing HIGH_RESOLUTION in a room-size environment may degrade the tracking robustness and quality. Choosing LOW_RESOLUTION in an object-sized scenario may result in a reconstructed model missing the fine details. PXCM_STATUS_NO_ERROR if it succeeds; returns PXCM_STATUS_ITEM_UNAVAILABLE if called after making a call to PXCMSenseManager.Init(). To get the voxel resolution used by the scene perception module. Please refer to SetVoxelResolution(...) for more details. Returns the current value of VoxelResolution used by the scene perception module. Allows the user to enable/disable integration of the upcoming camera stream into the 3D volume. If disabled, the volume will not be updated; however scene perception will still keep tracking the camera. This is a control parameter which can be updated before passing every frame to the module. Enable/Disable flag for integrating depth data into the 3D volumetric representation. PXCM_STATUS_NO_ERROR if it succeeds, otherwise returns the error code. Allows the user to check whether integration of the upcoming camera stream into the 3D volume is enabled or disabled. True, if integrating depth data into the 3D volumetric representation is enabled. Allows the user to set the initial camera pose. This function is only available before the first frame is passed to the module. Once the first frame is passed, the initial camera pose is locked and this function will be unavailable. If this function is not used, the module uses the default pose as the initial pose for tracking on devices with no platform IMU; on devices with a platform IMU, the initial tracking pose will be computed using the gravity vector to align the 3D volume with gravity when the first frame is passed to the module. Array of 12 pxcF32 that stores the initial camera pose the user wishes to set, in row-major order. 
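The resolution values quoted above imply a fixed voxel edge length per setting. A small Python table of those sizes, derived only from the numbers in the text:

```python
# Voxel edge length in meters for each resolution, per the values quoted above.
VOXEL_SIZE_M = {
    "LOW_RESOLUTION":  4.0 / 256,   # room-sized scenes
    "MED_RESOLUTION":  2.0 / 256,   # table-top scenes
    "HIGH_RESOLUTION": 1.0 / 256,   # object-sized scenes
}

assert abs(VOXEL_SIZE_M["HIGH_RESOLUTION"] - 0.00390625) < 1e-12  # ~3.9 mm voxels
```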
Camera pose is specified in a 3 by 4 matrix [R | T] = [Rotation Matrix | Translation Vector] where R = [ r11 r12 r13 ] [ r21 r22 r23 ] [ r31 r32 r33 ] T = [ tx ty tz ] Pose Array Layout = [r11 r12 r13 tx r21 r22 r23 ty r31 r32 r33 tz] Translation vector is in meters. If successful it returns PXCM_STATUS_NO_ERROR; otherwise returns an error code if an invalid pose is passed or the function is called after passing the first frame. Allows the user to get the tracking accuracy of the last frame processed by the module. We expect users to call this function after a successful PXCMSenseManager.AcquireFrame(...) call and before calling PXCMSenseManager.ReleaseFrame(). If tracking accuracy is FAILED, the volume data and camera pose are not updated. TrackingAccuracy which can be HIGH, LOW, MED or FAILED. Allows the user to access the camera's latest pose. The correctness of the pose depends on the value obtained from QueryTrackingAccuracy(). Array of 12 Singles to store the camera pose in row-major order. Camera pose is specified in a 3 by 4 matrix [R | T] = [Rotation Matrix | Translation Vector] where R = [ r11 r12 r13 ] [ r21 r22 r23 ] [ r31 r32 r33 ] T = [ tx ty tz ] Pose Array Layout = [r11 r12 r13 tx r21 r22 r23 ty r31 r32 r33 tz] Translation vector is in meters. PXCM_STATUS_NO_ERROR, if the function succeeds. Otherwise an error code will be returned. Allows the user to check whether the 3D volume was updated since the last call to DoMeshingUpdate(...). This function is useful for determining when to call DoMeshingUpdate. flag indicating that the reconstruction was updated. Allows the user to access a 2D projection image of the reconstructed volume from a given camera pose by ray-casting. This function is optimized for real-time performance. It is also useful for visualizing progress of the scene reconstruction. User should explicitly call Dispose() on PXCMImage after copying the data, or before making a subsequent call to QueryVolumePreview(...). Array of 12 Singles that stores the camera pose in row-major order. 
Camera pose is specified in a 3 by 4 matrix [R | T] = [Rotation Matrix | Translation Vector] where R = [ r11 r12 r13 ] [ r21 r22 r23 ] [ r31 r32 r33 ] T = [ tx ty tz ] Pose Array Layout = [r11 r12 r13 tx r21 r22 r23 ty r31 r32 r33 tz] Translation vector is in meters. Instance of PXCMImage whose content can be used for volume rendering. Returns null if there is an internal state error, when rendering fails, or when an invalid pose matrix is passed. Reset removes all reconstructed model (volume) information and the module will reinitialize the model when the next stream is passed to the module. It also resets the camera pose to the one provided. If the pose is not provided, the module will use the default pose if there is no platform IMU on the device; on a device with a platform IMU the pose will be computed using the gravity vector to align the 3D volume with gravity when the next frame is passed to the module. However it doesn't Reset the instance of PXCMBlockMeshingData created using PXCMScenePerception.CreatePXCMBlockMeshingData(...). User should explicitly call PXCMBlockMeshingData.Reset() to stay in sync with the reconstruction model inside scene perception. Array of 12 Singles that stores the initial camera pose the user wishes to set, in row-major order. Camera pose is specified in a 3 by 4 matrix [R | T] = [Rotation Matrix | Translation Vector] where R = [ r11 r12 r13 ] [ r21 r22 r23 ] [ r31 r32 r33 ] T = [ tx ty tz ] Pose Array Layout = [r11 r12 r13 tx r21 r22 r23 ty r31 r32 r33 tz] Translation vector is in meters. On success returns PXCM_STATUS_NO_ERROR. Otherwise returns an error code, for example when an invalid pose argument is passed. Reset removes all reconstructed model (volume) information and the module will reinitialize the model when the next stream is passed to the module. It also resets the camera pose to the one provided, or otherwise uses the default initial pose. 
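The row-major [R | T] pose layout described above can be illustrated with a small helper that applies such a pose to a 3D point. A Python sketch of the layout only, not SDK API; the identity pose shown is just an example value:

```python
def apply_pose(pose, point):
    """Transform a 3D point by a row-major 12-element [R | T] pose array."""
    assert len(pose) == 12, "3x4 pose: [r11 r12 r13 tx r21 r22 r23 ty r31 r32 r33 tz]"
    r11, r12, r13, tx, r21, r22, r23, ty, r31, r32, r33, tz = pose
    x, y, z = point
    return (r11 * x + r12 * y + r13 * z + tx,
            r21 * x + r22 * y + r23 * z + ty,
            r31 * x + r32 * y + r33 * z + tz)

# identity rotation, zero translation (translation is in meters)
IDENTITY_POSE = [1, 0, 0, 0,  0, 1, 0, 0,  0, 0, 1, 0]
assert apply_pose(IDENTITY_POSE, (0.1, 0.2, 0.3)) == (0.1, 0.2, 0.3)
```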
However, if a platform IMU is detected, the rotation matrix set by Reset will be modified using the gravity vector to align the 3D volume with gravity when the next frame is passed to the module, and the translation vector will be retained. If Reset is called without a pose on a platform with an IMU, the module will use the default translation, and the rotation will be obtained from the value of the gravity vector when the next frame is passed. However it doesn't Reset the instance of PXCMBlockMeshingData created using PXCMScenePerception.CreatePXCMBlockMeshingData. User should explicitly call PXCMBlockMeshingData.Reset to stay in sync with the reconstruction model inside scene perception. Array of 12 Singles that stores the initial camera pose the user wishes to set, in row-major order. Camera pose is specified in a 3 by 4 matrix [R | T] = [Rotation Matrix | Translation Vector] where R = [ r11 r12 r13 ] [ r21 r22 r23 ] [ r31 r32 r33 ] T = [ tx ty tz ] Pose Array Layout = [r11 r12 r13 tx r21 r22 r23 ty r31 r32 r33 tz] Translation vector is in meters. On success returns PXCM_STATUS_NO_ERROR. Otherwise returns an error code, for example when an invalid pose argument is passed. This is an optional function meant for expert users. It allows users to set meshing thresholds for DoMeshingUpdate(...). The values set by this function will be used by succeeding calls to DoMeshingUpdate(...). Sets the thresholds indicating the magnitude of changes occurring in any block that would be considered significant for re-meshing. If the maximum change in a block exceeds this value, then the block will be re-meshed. Setting the value to zero will retrieve all blocks. If the average change in a block exceeds this value, then the block will be re-meshed. Setting the value to zero will retrieve all blocks. PXCM_STATUS_NO_ERROR on success, otherwise returns an error code. Allows the user to allocate PXCMBlockMeshingData which can be passed to DoMeshingUpdate. 
It is the user's responsibility to explicitly release the memory by calling Dispose(). Maximum number of mesh blocks the client can handle in one update from DoMeshingUpdate(...). If a non-positive value is passed then it uses the default value. Use PXCMBlockMeshingData::QueryMaxNumberOfBlockMeshes() to check the value. Maximum number of faces that the client can handle in one update from DoMeshingUpdate(...). If a non-positive value is passed then it uses the default value. Use PXCMBlockMeshingData::QueryMaxNumberOfFaces() to check the value. Maximum number of vertices that the client can handle in one update from DoMeshingUpdate(...). If a non-positive value is passed then it uses the default value. Use PXCMBlockMeshingData::QueryMaxNumberOfVertices() to check the value. Flag indicating whether the user wants scene perception to return color per vertex in the mesh update. If set, the color buffer will be created in PXCMBlockMeshingData; otherwise the color buffer will not be created and any calls made to PXCMBlockMeshingData::QueryVerticesColor() will return null. on success returns a valid handle to the instance, otherwise returns null. Allows the user to allocate PXCMBlockMeshingData with color enabled, which can be passed to DoMeshingUpdate. It is the user's responsibility to explicitly release the memory by calling Dispose(). Maximum number of mesh blocks the client can handle in one update from DoMeshingUpdate(...). If a non-positive value is passed then it uses the default value. Use PXCMBlockMeshingData::QueryMaxNumberOfBlockMeshes() to check the value. Maximum number of faces that the client can handle in one update from DoMeshingUpdate(...). If a non-positive value is passed then it uses the default value. Use PXCMBlockMeshingData::QueryMaxNumberOfFaces() to check the value. Maximum number of vertices that the client can handle in one update from DoMeshingUpdate(...). If a non-positive value is passed then it uses the default value. Use PXCMBlockMeshingData::QueryMaxNumberOfVertices() to check the value. 
On success returns a valid handle to the instance; otherwise returns null. Allows the user to allocate PXCMBlockMeshingData with the default number of mesh blocks, vertices, and faces, and with color enabled, which can be passed to DoMeshingUpdate(...). It is the user's responsibility to explicitly release the memory by calling Dispose(). On success returns a valid handle to the instance; otherwise returns null. Performs meshing, and hole filling if requested. This function can be slow if there is a lot of data to be meshed, so for efficiency reasons we recommend running it on a separate thread. This call is designed to be thread-safe if called in parallel with ProcessImageAsync. Instance of pre-allocated PXCMBlockMeshingData. Refer to PXCMScenePerception::CreatePXCMBlockMeshingData(...) for how to allocate PXCMBlockMeshingData. Argument to indicate whether to fill holes in mesh blocks. If set, missing details in each mesh block that is visible from the scene perception camera's current pose and completely surrounded by a closed surface (holes) will be filled by smooth linear interpolation of adjacent mesh data. 
Argument to indicate which mesh data you wish to use: -countOfBlockMeshesRequired: if set, on a successful call this function will set the number of block meshes available for meshing, which can be retrieved using QueryNumberOfBlockMeshes(). -blockMeshesRequired: can only be set to true if countOfBlockMeshesRequired is set to true, otherwise the value is ignored; if set, on a successful call this function will update the block meshes array in pBlockMeshingUpdateInfo, which can be retrieved using QueryBlockMeshes(). -countOfVeticesRequired: if set, on a successful call this function will set the number of vertices available for meshing, which can be retrieved using QueryNumberOfVertices(). -verticesRequired: can only be set if countOfVeticesRequired is set to true, otherwise the value is ignored; if set, on a successful call this function will update the vertices array in pBlockMeshingUpdateInfo, which can be retrieved using QueryVertices(). -countOfFacesRequired: if set, on a successful call this function will set the number of faces available for meshing, which can be retrieved using QueryNumberOfFaces(). -facesRequired: can only be set if countOfFacesRequired is set to true, otherwise the value is ignored; if set, on a successful call this function will update the faces array in pBlockMeshingUpdateInfo, which can be retrieved using QueryFaces(). -colorsRequired: if set, and PXCMBlockMeshingData was created with color, on success the function will fill in the colors array, which can be accessed using QueryVerticesColor(). +NOTE: set the meshing threshold to (0, 0) prior to calling DoMeshingUpdate with hole filling enabled to fill mesh regions that have not changed. On success PXCM_STATUS_NO_ERROR; otherwise an error code will be returned. Performs DoMeshingUpdate(...) with hole filling disabled and requests all mesh data (vertices, faces, block meshes and color). Instance of pre-allocated PXCMBlockMeshingData. Refer to PXCMScenePerception::CreatePXCMBlockMeshingData for how to allocate PXCMBlockMeshingData. 
On success PXCM_STATUS_NO_ERROR; otherwise an error code will be returned. Allows users to save the mesh in an ASCII obj file in MeshResolution::HIGH_RESOLUTION_MESH. The path of the file to use for saving the mesh. Indicates whether to fill holes in the mesh before saving it. On success PXCM_STATUS_NO_ERROR; otherwise an error code is returned on failure. Allows the user to check whether the input stream is suitable for starting, resetting/restarting or tracking scene perception. Input stream sample required by the scene perception module. Returns positive values between 0.0 and 1.0 to indicate how suitable the scene is for starting, tracking or resetting scene perception. 1.0 represents an ideal scene for starting scene perception; 0.0 represents an unsuitable scene. Returns negative values to indicate the potential reason for tracking failure: -1.0 represents a scene without enough structure/geometry; -2.0 represents a scene without enough depth pixels (too far from or too close to the target scene, or outside the range of the depth camera). Also, the value 0.0 is returned when an invalid argument is passed or if the function is called before calling PXCSenseManager::Init(). Fills holes in the supplied depth image. Instance of the depth image to be filled. Pixels with a depth value equal to zero will be linearly interpolated with adjacent depth pixels. The image resolution should be 320x240. On success PXCM_STATUS_NO_ERROR; otherwise an error code will be returned on failure. Allows the user to access normals of the surface that are within view from the camera's current pose. Array of pre-allocated PXCMPoint3DF32 to store normal vectors. Each normal vector has three components, namely x, y and z. The size in pixels must be QVGA and hence the array size in bytes should be: (PXCMPoint3DF32's byte size) x (320 x 240). On success PXCM_STATUS_NO_ERROR; otherwise an error code will be returned on failure. Allows the user to access the surface's vertices that are within view from the camera's current pose. 
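The documented hole-filling behavior (zero-depth pixels linearly interpolated from adjacent pixels) can be illustrated with a simplified one-dimensional sketch. FillDepthImage itself operates on a 320x240 image and its exact algorithm is not specified; this is only an assumption-laden illustration of the stated idea:

```python
def fill_holes_1d(row):
    """Replace runs of zeros with values linearly interpolated from the
    nearest nonzero neighbors on each side. Runs touching an edge are left
    as-is (a hedged simplification; the SDK's 2D behavior may differ)."""
    out = list(row)
    n = len(out)
    i = 0
    while i < n:
        if out[i] == 0:
            j = i
            while j < n and out[j] == 0:      # find the end of the zero run
                j += 1
            if i > 0 and j < n:               # both neighbors known: interpolate
                left, right = out[i - 1], out[j]
                span = j - i + 1
                for k in range(i, j):
                    t = (k - i + 1) / span
                    out[k] = left + t * (right - left)
            i = j
        else:
            i += 1
    return out

print(fill_holes_1d([10, 0, 0, 40]))  # [10, 20.0, 30.0, 40]
```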
Array of pre-allocated PXCMPoint3DF32 to store vertices. Each element is a vector of x, y and z components. The image size in pixels must be QVGA and hence the array size in bytes should be: (PXCMPoint3DF32's byte size) x (320 x 240). On success PXCM_STATUS_NO_ERROR; otherwise an error code will be returned on failure. Allows the user to save the current scene perception state to a file and later supply the file to LoadState() to restore scene perception to the saved state. The path of the file to use for saving the scene perception state. On success PXCM_STATUS_NO_ERROR; otherwise an error code will be returned on failure. Allows the user to load the scene perception state from a file that has been created using SaveCurrentState. This function is only available before calling PXCMSenseManager::Init(). The path of the file to load the scene perception state from. On success PXCM_STATUS_NO_ERROR; otherwise an error code will be returned on failure. Allows the user to allocate PXCMSurfaceVoxelsData, using CreatePXCMSurfaceVoxelsData, which can be passed to ExportSurfaceVoxels. It is the user's responsibility to explicitly release the instance by calling Dispose(). Maximum number of voxels the client is expecting in each call to ExportSurfaceVoxels(...). Flag indicating whether the user wants scene perception to return voxel color when ExportSurfaceVoxels(...) is called. If set, the color buffer will be allocated in PXCMSurfaceVoxelsData; otherwise the color buffer will not be created, and any calls made to PXCMSurfaceVoxelsData.QuerySurfaceVoxelsColor() will return null. On success returns a valid handle to the instance; otherwise returns null. Allows the user to allocate PXCMSurfaceVoxelsData without color, with a default estimate of the number of voxels, which can be passed to ExportSurfaceVoxels. It is the user's responsibility to explicitly release the instance by calling Dispose(). Allows the user to export the voxels intersected by the scanned surface. Optionally allows specifying a region of interest for the surface voxels to be exported. 
The voxels will be exported in parts over multiple calls to this function. The client is expected to check the return code to determine whether all voxels have been exported successfully. Pre-allocated instance of PXCMSurfaceVoxelsData created using CreatePXCMSurfaceVoxelsData(...). On success the function will fill in the center of each surface voxel in an array, which can be obtained using QueryCenterOfSurfaceVoxels, and the number of voxels, which can be retrieved using QueryNumberOfSurfaceVoxels(). Optional; a PXCMPoint3DF32 representing the lower left corner of the front face of the bounding box which specifies the region of interest for exporting surface voxels. Optional; a PXCMPoint3DF32 representing the upper right corner of the rear face of the bounding box which specifies the region of interest for exporting surface voxels. If the scene perception module is able to export all the surface voxels it has acquired, it will return PXCM_STATUS_NO_ERROR, and after that any calls made to ExportSurfaceVoxels(...) will restart exporting all the voxels again. If all voxels cannot fit into the specified surfaceVoxelsData, it will return the warning code PXCM_STATUS_DATA_PENDING, indicating that the client should make additional calls to ExportSurfaceVoxels to get the remaining voxels, until PXCM_STATUS_NO_ERROR is returned. Allows exporting the surface voxels present in the entire volume. Allows the user to set the meshing resolution for DoMeshingUpdate(...). The mesh resolution the user wishes to set. On success PXCM_STATUS_NO_ERROR; otherwise an error code will be returned on failure. Allows the user to get the meshing resolution used by DoMeshingUpdate(...). The MeshResolution used by DoMeshingUpdate(...). Allows the user to get the meshing thresholds used by scene perception. Retrieves the maximum distance change threshold. Retrieves the average distance change threshold. On success PXCM_STATUS_NO_ERROR; otherwise an error code will be returned on failure. Allows the user to set a region of interest for meshing. If used, the DoMeshingUpdate(...) function will only mesh these specified regions. 
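The PXCM_STATUS_DATA_PENDING contract described above implies a call-until-done loop. The sketch below illustrates that control flow against a mock object; only the status-code semantics are taken from the documentation, and the mock's names and batching are hypothetical:

```python
# Schematic sketch of the ExportSurfaceVoxels call pattern. The status
# names mirror the documented codes; MockScenePerception is NOT SDK code.
PXCM_STATUS_NO_ERROR = 0
PXCM_STATUS_DATA_PENDING = 1

class MockScenePerception:
    """Stands in for the module: hands out voxels in fixed-size batches."""
    def __init__(self, voxels, batch):
        self.voxels, self.batch, self.pos = voxels, batch, 0
    def export_surface_voxels(self, out):
        out[:] = self.voxels[self.pos:self.pos + self.batch]
        self.pos += self.batch
        if self.pos >= len(self.voxels):
            self.pos = 0              # documented: the next call restarts the export
            return PXCM_STATUS_NO_ERROR
        return PXCM_STATUS_DATA_PENDING

sp = MockScenePerception(list(range(10)), batch=4)
collected, buf = [], []
while True:                           # keep calling until NO_ERROR is returned
    status = sp.export_surface_voxels(buf)
    collected += buf
    if status == PXCM_STATUS_NO_ERROR:
        break
print(len(collected))  # 10
```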
Use ClearMeshingRegion() to clear the meshing region set by this function. Pre-allocated PXCMPoint3DF32 which specifies the lower left corner of the front face of the bounding box. Pre-allocated PXCMPoint3DF32 which specifies the upper right corner of the rear face of the bounding box. On success PXCM_STATUS_NO_ERROR; otherwise an error code will be returned on failure. Allows the user to clear the meshing region set by SetMeshingRegion(...). On success PXCM_STATUS_NO_ERROR; otherwise an error code will be returned on failure. Allows the user to enforce the supplied pose as the camera pose. The module will track the camera from this pose when the next frame is passed. This function can be called any time after the module finishes processing the first frame, or any time after the module successfully processes the first frame following a call to Reset scene perception. Array of 12 pxcF32 that stores the camera pose the user wishes to set, in row-major order. The camera pose is specified in a 3 by 4 matrix [R | T] = [Rotation Matrix | Translation Vector] where R = [ r11 r12 r13 ] [ r21 r22 r23 ] [ r31 r32 r33 ] T = [ tx ty tz ] Pose Array Layout = [r11 r12 r13 tx r21 r22 r23 ty r31 r32 r33 tz] The translation vector is in meters. PXCM_STATUS_NO_ERROR if the function succeeds; otherwise an error code will be returned. Allows the user to get the length of a side of the voxel cube, in meters. Returns the length of a side of the voxel cube in meters. Allows the user to get the intrinsics of the internal scene perception camera. These intrinsics should be used with output images obtained from the module, such as QueryVolumePreview(...), GetVertices(...) and GetNormals(...). This function should only be used after calling PXCMSenseManager.Init(); otherwise it will return an error code. Handle to a pre-allocated instance of ScenePerceptionIntrinsics. On success this instance will be filled with the appropriate values. PXCM_STATUS_NO_ERROR if the function succeeds; otherwise an error code will be returned. 
Allows the user to integrate the specified stream, from the supplied pose, into the reconstructed volume. Input stream sample required by the scene perception module, obtained using PXCMSenseManager.QueryScenePerceptionSample(). Estimated pose for the supplied input stream. Array of 12 pxcF32 that stores the camera pose the user wishes to set, in row-major order. The camera pose is specified in a 3 by 4 matrix [R | T] = [Rotation Matrix | Translation Vector] where R = [ r11 r12 r13 ] [ r21 r22 r23 ] [ r31 r32 r33 ] T = [ tx ty tz ] Pose Array Layout = [r11 r12 r13 tx r21 r22 r23 ty r31 r32 r33 tz] The translation vector is in meters. PXCM_STATUS_NO_ERROR if the function succeeds; otherwise an error code will be returned. Allows the user to enable/disable the re-localization feature of scene perception's camera tracking. By default re-localization is enabled. This functionality is only available after PXCMSenseManager.Init() is called. Flag specifying whether to enable or disable re-localization. PXCM_STATUS_NO_ERROR if the function succeeds; otherwise an error code will be returned. Allows the user to transform plane equations obtained from ExtractPlanes(...) to the world coordinate system using the provided pose, and returns the number of planes found in the supplied plane equations. Number of rows of the equation array pPlaneEq. Pre-allocated float array of plane equations obtained from ExtractPlanes(...). On success the plane equations will be transformed into the world coordinate system using the supplied camera pose. Array of 12 pxcF32 that stores the camera pose of the capture sample that was supplied to ExtractPlanes(...), stored in row-major order. The camera pose is specified in a 3 by 4 matrix [R | T] = [Rotation Matrix | Translation Vector] where R = [ r11 r12 r13 ] [ r21 r22 r23 ] [ r31 r32 r33 ] T = [ tx ty tz ] Pose Array Layout = [r11 r12 r13 tx r21 r22 r23 ty r31 r32 r33 tz] The translation vector is in meters. +NOTE: Use the pose obtained from GetCameraPose(...) to transform plane equations. 
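Assuming the pose convention above maps camera to world coordinates (x_world = R·x_cam + T), a plane n·x + d = 0 extracted in camera coordinates transforms to world coordinates as n_w = R·n and d_w = d − n_w·T. The following is a small self-contained sketch of that math, not the SDK call itself:

```python
# Hedged sketch of the transform TransformPlaneEquations is documented to
# perform. plane = [a, b, c, d] for the plane a*x + b*y + c*z + d = 0 in
# camera coordinates; R, T come from the camera pose [R | T].
def transform_plane(plane, R, T):
    a, b, c, d = plane
    n = [a, b, c]
    n_w = [sum(R[i][j] * n[j] for j in range(3)) for i in range(3)]  # R·n
    d_w = d - sum(n_w[i] * T[i] for i in range(3))                   # d - n_w·T
    return n_w + [d_w]

# 90-degree rotation about z, translation of 2 m along x:
Rz = [[0.0, -1.0, 0.0], [1.0, 0.0, 0.0], [0.0, 0.0, 1.0]]
plane = transform_plane([1.0, 0.0, 0.0, 0.0], Rz, [2.0, 0.0, 0.0])
print(plane)  # [0.0, 1.0, 0.0, 0.0]
```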
On success, returns a positive number indicating the number of planes found in pPlaneEq; a negative number indicates an error, such as an invalid argument. Allows users to save different configurations of the mesh in an ASCII obj file. The path of the file to use for saving the mesh. Argument to indicate the mesh configuration you wish to save: -fillMeshHoles: flag indicating whether to fill holes in the saved mesh. -saveMeshColor: flag indicating whether to save the mesh with color. -meshResolution: indicates the resolution of the mesh to be saved. On success PXCM_STATUS_NO_ERROR; otherwise an error code is returned on failure. Allows the user to enable or disable inertial sensor support for scene perception; by default it is disabled. This function is only available before calling PXCSenseManager::Init(). On success PXCM_STATUS_NO_ERROR; otherwise an error code is returned on failure. Allows the user to enable or disable gravity sensor support for scene perception; by default it is enabled. This function is only available before calling PXCSenseManager::Init(). On success PXCM_STATUS_NO_ERROR; otherwise an error code is returned on failure. Allows the user to get the status (enabled/disabled) of gravity sensor support for scene perception. On success PXCM_STATUS_NO_ERROR; otherwise an error code is returned on failure. Allows the user to get the status (enabled/disabled) of inertial sensor support for scene perception. On success PXCM_STATUS_NO_ERROR; otherwise an error code is returned on failure. @brief This function closes the execution pipeline. Initialize the SenseManager pipeline for streaming with callbacks. The application must enable raw streams or algorithm modules before calling this function. Optional callback instance. PXC_STATUS_NO_ERROR Successful execution. Stream frames from the capture module to the algorithm modules. The application must initialize the pipeline before calling this function. AcquireFrame/ReleaseFrame are not compatible with StreamFrames. 
Run the SenseManager in the pulling mode with AcquireFrame/ReleaseFrame, or the callback mode with StreamFrames. True: the function blocks until the streaming stops (upon any capture device error, or if any callback function returns an error). False: the function returns immediately while streaming runs in a thread. PXCM_STATUS_NO_ERROR Successful execution. Return the captured sample for the specified module or explicitly or implicitly requested streams. For modules, use mid=module interface identifier. For explicitly requested streams via multiple calls to EnableStream(s), use mid=PXCCapture::CUID+0,1,2... The captured sample is managed internally by the SenseManager. Do not release the instance. The module identifier. Usually this is the interface identifier, or PXCCapture::CUID+n for raw video streams. The sample instance, or null if the captured sample is not available. Return the PXCSession instance. Internally managed. Do not release the instance. The session instance is managed internally by the SenseManager. Do not release the session instance. The PXCMSession instance. Return the PXCMCaptureManager instance. Internally managed. Do not release the instance. The instance is managed internally by the SenseManager. Do not release the instance. The PXCMCaptureManager instance. Return the captured sample for the specified module or explicitly requested streams. For modules, use mid=module interface identifier. For explicitly requested streams via multiple calls to EnableStream(s), use mid=PXCMCapture.CUID+0,1,2... The captured sample is managed internally by the SenseManager. Do not release the instance. The module identifier. Usually this is the interface identifier, or PXCMCapture.CUID+n for raw video streams. The sample instance, or null if the captured sample is not available. Return the captured sample for the user segmentation module. The captured sample is managed internally by the SenseManager. Do not release the sample. 
The sample instance, or NULL if the captured sample is not available. Return the captured sample for the scene perception module. The captured sample is managed internally by the SenseManager. Do not release the sample. The sample instance, or NULL if the captured sample is not available. Return the captured sample for the Enhanced Videography module. The captured sample is managed internally by the SenseManager. Do not release the sample. The sample instance, or NULL if the captured sample is not available. Return the captured sample for the object tracking module. The captured sample is managed internally by the SenseManager. Do not release the sample. The sample instance, or NULL if the captured sample is not available. Return the captured sample for the face module. The captured sample is managed internally by the SenseManager. Do not release the sample. The sample instance, or NULL if the captured sample is not available. Return the captured sample for the PersonTracking module. The captured sample is managed internally by the SenseManager. Do not release the sample. The sample instance, or NULL if the captured sample is not available. Return the captured sample for the hand module. The captured sample is managed internally by the SenseManager. Do not release the sample. The sample instance, or NULL if the captured sample is not available. Return the captured sample for the hand module. The captured sample is managed internally by the SenseManager. Do not release the sample. The sample instance, or NULL if the captured sample is not available. Return the captured sample for the blob module. The captured sample is managed internally by the SenseManager. Do not release the sample. The sample instance, or NULL if the captured sample is not available. Return the captured sample for the ObjectRecognition module. The captured sample is managed internally by the SenseManager. Do not release the sample. 
The sample instance, or null if the captured sample is not available. Return the module instance. Between AcquireFrame/ReleaseFrame, the function returns NULL if the specified module hasn't completed processing the current frame of image data. The instance is managed internally by the SenseManager. Do not release the instance. The module identifier. Usually this is the interface identifier. The module instance. Return the user segmentation module instance. Between AcquireFrame/ReleaseFrame, the function returns NULL if the specified module hasn't completed processing the current frame of image data. The instance is managed internally by the SenseManager. Do not release the instance. The module instance. Return the 3D scanning module instance. Between AcquireFrame/ReleaseFrame, the function returns NULL if the specified module hasn't completed processing the current frame of image data. The instance is managed internally by the SenseManager. Do not release the instance. The module instance. Return the Scene Perception module instance. Between AcquireFrame/ReleaseFrame, the function returns null if the specified module hasn't completed processing the current frame of image data. The instance is managed internally by the SenseManager. Do not release the instance. The module instance. Return the object tracking module instance. Between AcquireFrame/ReleaseFrame, the function returns NULL if the specified module hasn't completed processing the current frame of image data. The instance is managed internally by the SenseManager. Do not release the instance. The module instance. Return the Face module instance. Between AcquireFrame/ReleaseFrame, the function returns NULL if the specified module hasn't completed processing the current frame of image data. The instance is managed internally by the SenseManager. Do not release the instance. The module instance. Return the Touchless module instance. 
Between AcquireFrame/ReleaseFrame, the function returns NULL if the specified module hasn't completed processing the current frame of image data. The instance is managed internally by the SenseManager. Do not release the instance. The module instance. Return the hand module instance. Between AcquireFrame/ReleaseFrame, the function returns NULL if the specified module hasn't completed processing the current frame of image data. The instance is managed internally by the SenseManager. Do not release the instance. The module instance. Return the hand module instance. Between AcquireFrame/ReleaseFrame, the function returns NULL if the specified module hasn't completed processing the current frame of image data. The instance is managed internally by the SenseManager. Do not release the instance. The module instance. Return the ObjectRecognition module instance. Between AcquireFrame/ReleaseFrame, the function returns NULL if the specified module hasn't completed processing the current frame of image data. The instance is managed internally by the SenseManager. Do not release the instance. The module instance. Return the Blob module instance. Between AcquireFrame/ReleaseFrame, the function returns NULL if the specified module hasn't completed processing the current frame of image data. The instance is managed internally by the SenseManager. Do not release the instance. The module instance. Return the Enhanced Videography module instance. Between AcquireFrame/ReleaseFrame, the function returns NULL if the specified module hasn't completed processing the current frame of image data. The instance is managed internally by the SenseManager. Do not release the instance. The module instance. Return the Enhanced Videography module instance. Between AcquireFrame/ReleaseFrame, the function returns NULL if the specified module hasn't completed processing the current frame of image data. The instance is managed internally by the SenseManager. Do not release the instance. 
The module instance. Initialize the SenseManager pipeline for streaming. The application must enable raw streams or algorithm modules before calling this function. PXCM_STATUS_NO_ERROR Successful execution. Stream frames from the capture module to the algorithm modules. The application must initialize the pipeline before calling this function. If blocking, the function blocks until the streaming stops (upon any capture device error, or if any callback function returns an error). If non-blocking, the function returns immediately while streaming runs in a thread. The blocking status. PXCM_STATUS_NO_ERROR Successful execution. This function starts streaming and waits until certain events occur. If ifall=true, the function blocks until all samples are ready and the modules have completed processing the samples. If ifall=false, the function blocks until any of the above is ready. The SenseManager pipeline pauses at this point for the application to retrieve the processed module data, until the application calls ReleaseFrame. AcquireFrame/ReleaseFrame are not compatible with StreamFrames. Run the SenseManager in the pulling mode with AcquireFrame/ReleaseFrame, or the callback mode with StreamFrames. If true, wait for all modules to complete processing the data. The time out value in milliseconds. PXCM_STATUS_NO_ERROR Successful execution. This function starts streaming and waits until certain events occur. If ifall=true, the function blocks until all samples are ready and the modules have completed processing the samples. If ifall=false, the function blocks until any of the above is ready. The SenseManager pipeline pauses at this point for the application to retrieve the processed module data, until the application calls ReleaseFrame. AcquireFrame/ReleaseFrame are not compatible with StreamFrames. Run the SenseManager in the pulling mode with AcquireFrame/ReleaseFrame, or the callback mode with StreamFrames. If true, wait for all modules to complete processing the data. 
PXCM_STATUS_NO_ERROR Successful execution. This function starts streaming and waits until certain events occur. If ifall=true, the function blocks until all samples are ready and the modules have completed processing the samples. If ifall=false, the function blocks until any of the above is ready. The SenseManager pipeline pauses at this point for the application to retrieve the processed module data, until the application calls ReleaseFrame. AcquireFrame/ReleaseFrame are not compatible with StreamFrames. Run the SenseManager in the pulling mode with AcquireFrame/ReleaseFrame, or the callback mode with StreamFrames. PXCM_STATUS_NO_ERROR Successful execution. This function resumes streaming after AcquireFrame. AcquireFrame/ReleaseFrame are not compatible with StreamFrames. Run the SenseManager in the pulling mode with AcquireFrame/ReleaseFrame, or the callback mode with StreamFrames. Explicitly request to stream the specified raw stream. If more than one stream is specified, the SenseManager will synchronize these streams. If called multiple times, the function treats each stream request as independent (unaligned). The stream identifier is PXCMCapture.CUID+n. The stream type. Stream width. Stream height. PXCM_STATUS_NO_ERROR Successful execution. Explicitly request to stream the specified raw stream. If more than one stream is specified, the SenseManager will synchronize these streams. If called multiple times, the function treats each stream request as independent (unaligned). The stream identifier is PXCMCapture.CUID+n. The stream type. Stream width. Stream height. Stream frame rate. PXCM_STATUS_NO_ERROR Successful execution. Explicitly request to stream the specified raw stream. If more than one stream is specified, the SenseManager will synchronize these streams. If called multiple times, the function treats each stream request as independent (unaligned). The stream identifier is PXCMCapture.CUID+n. The stream type. Stream width. Stream height. Stream frame rate. Stream flags. 
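The AcquireFrame/ReleaseFrame pulling mode described above follows a fixed pattern: acquire, query the sample, process, release. The mock below illustrates only that control flow; it is not the PXCMSenseManager API, and all names in it are hypothetical stand-ins:

```python
# Schematic mock of the pulling-mode loop. Only the control flow (acquire,
# query, process, release, stop on error) mirrors the documentation.
PXCM_STATUS_NO_ERROR = 0

class MockSenseManager:
    def __init__(self, frames):
        self.frames = iter(frames)
        self.current = None
    def acquire_frame(self, ifall=True):       # blocks until samples are ready
        try:
            self.current = next(self.frames)
            return PXCM_STATUS_NO_ERROR
        except StopIteration:
            return -1                          # e.g. device error / end of data
    def query_sample(self):
        return self.current                    # managed internally: do not release
    def release_frame(self):                   # resume streaming
        self.current = None

sm = MockSenseManager(["frame0", "frame1"])
seen = []
while sm.acquire_frame(ifall=True) == PXCM_STATUS_NO_ERROR:
    seen.append(sm.query_sample())             # retrieve processed module data here
    sm.release_frame()
print(seen)  # ['frame0', 'frame1']
```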
PXCM_STATUS_NO_ERROR Successful execution. Enable a module in the pipeline. The module identifier. This is usually the interface identifier. The module descriptor. PXCM_STATUS_NO_ERROR Successful execution. Enable the face module in the pipeline. The optional module name. The module identifier. This is usually the interface identifier. PXCM_STATUS_NO_ERROR Successful execution. Enable the user segmentation module in the pipeline. The module name. PXCM_STATUS_NO_ERROR Successful execution. Enable the user segmentation module in the pipeline. The module name. PXCM_STATUS_NO_ERROR Successful execution. Enable the 3D scanning module in the pipeline. The module name. PXCM_STATUS_NO_ERROR Successful execution. Enable the Scene Perception module in the pipeline. The module name. PXCM_STATUS_NO_ERROR Successful execution. Enable the Scene Perception module in the pipeline. PXCM_STATUS_NO_ERROR Successful execution. Enable the Enhanced Videography module in the pipeline. The module name. PXCM_STATUS_NO_ERROR Successful execution. Enable the Enhanced Videography module in the pipeline. PXCM_STATUS_NO_ERROR Successful execution. Enable the 3D scanning module in the pipeline. The module name. PXCM_STATUS_NO_ERROR Successful execution. Enable the object tracking module in the pipeline. The module name. PXCM_STATUS_NO_ERROR Successful execution. Enable the object tracking module in the pipeline. The module name. PXCM_STATUS_NO_ERROR Successful execution. Enable the face module in the pipeline. The module name. PXCM_STATUS_NO_ERROR Successful execution. Enable the face module in the pipeline. PXCM_STATUS_NO_ERROR Successful execution. Enable the touchless controller module in the pipeline. The module name. PXCM_STATUS_NO_ERROR Successful execution. Enable the touchless controller module in the pipeline. PXCM_STATUS_NO_ERROR Successful execution. Enable the hand module in the pipeline. The module name. PXCM_STATUS_NO_ERROR Successful execution. Enable the hand module in the pipeline. 
PXCM_STATUS_NO_ERROR Successful execution. Enable the hand module in the pipeline. The module name. PXCM_STATUS_NO_ERROR Successful execution. Enable the hand module in the pipeline. PXCM_STATUS_NO_ERROR Successful execution. Enable the Blob module in the pipeline. The module name. PXCM_STATUS_NO_ERROR Successful execution. Enable the Blob module in the pipeline. PXCM_STATUS_NO_ERROR Successful execution. Enable the ObjectRecognition module in the pipeline. The module name. PXCM_STATUS_NO_ERROR Successful execution. Enable the ObjectRecognition module in the pipeline. PXCM_STATUS_NO_ERROR Successful execution. Enable the PersonTracking module in the pipeline. The module name. PXCM_STATUS_NO_ERROR Successful execution. Enable the PersonTracking module in the pipeline. PXCM_STATUS_NO_ERROR Successful execution. Pause/Resume the execution of the specified module. The module identifier. This is usually the interface identifier. If true, pause the module. Otherwise, resume the module. Pause/Resume the execution of the user segmentation module. If true, pause the module. Otherwise, resume the module. Pause/Resume the execution of the Scene Perception module. If true, pause the module. Otherwise, resume the module. Pause/Resume the execution of the object tracking module. If true, pause the module. Otherwise, resume the module. Pause/Resume the execution of the face module. If true, pause the module. Otherwise, resume the module. Pause/Resume the execution of the Enhanced Videography module. If true, pause the module. Otherwise, resume the module. Pause/Resume the execution of the touchless controller module. If true, pause the module. Otherwise, resume the module. Pause/Resume the execution of the hand module. If true, pause the module. Otherwise, resume the module. Pause/Resume the execution of the hand module. If true, pause the module. Otherwise, resume the module. Pause/Resume the execution of the Blob module. If true, pause the module. Otherwise, resume the module. 
Pause/Resume the execution of the ObjectRecognition module. If true, pause the module. Otherwise, resume the module. Pause/Resume the execution of the PersonTracking module. If true, pause the module. Otherwise, resume the module. Create an instance of the PXCMSenseManager. The PXCMSenseManager instance. The SenseManager calls back this function when there is a device connection or disconnection. During initialization, the SenseManager calls back this function when opening or closing any capture devices. The video device instance. The device connection status. The return status is ignored during the PXCMSenseManager initialization. During streaming, the SenseManager aborts the execution pipeline if the status is an error. The SenseManager calls back this function during initialization after each device configuration is set. The module identifier. Usually this is the interface identifier, or PXCMCapture.CUID+n for raw video streams. The module instance, or NULL for raw video streams. The SenseManager aborts the execution pipeline if the status is an error. The SenseManager calls back this function after a module has completed processing the frame data. The module identifier. Usually this is the interface identifier. The module instance. The SenseManager aborts the execution pipeline if the status is an error. The SenseManager calls back this function when raw video streams (explicitly requested) are available. The module identifier. Usually this is the interface identifier. The sample from the capture device. The SenseManager aborts the execution pipeline if the status is an error. The SenseManager calls back this function when the streaming loop in the StreamFrames() function terminates. The error code. @class PXCMSmoother A utility that allows smoothing data of different types, using a variety of algorithms Stabilizer Smoother – The stabilizer smoother keeps the smoothed data point stable as long as it has not moved more than a given threshold. 
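The stabilizer idea described above (hold the smoothed point until the input moves past a threshold) can be sketched in a few lines. This is an illustrative one-dimensional model, not the SDK's implementation, and it ignores the strength parameter:

```python
# Hedged sketch of a stabilizer smoother: the output stays put until the
# input moves beyond the configured radius, then follows the input.
class StabilizerSmoother1D:
    def __init__(self, radius):
        self.radius = radius
        self.stable = None
    def smooth(self, value):
        if self.stable is None or abs(value - self.stable) > self.radius:
            self.stable = value       # moved past the threshold: follow the input
        return self.stable            # otherwise hold the stable point

s = StabilizerSmoother1D(radius=0.1)
print(s.smooth(1.00))  # 1.0
print(s.smooth(1.05))  # 1.0  (within radius: held stable)
print(s.smooth(1.30))  # 1.3  (beyond radius: updated)
```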
Weighted Smoother – The weighted smoother applies a (possibly) different weight to each of the previous data samples. Quadratic Smoother – The quadratic smoother is a time-based smoother ideal for UI (User Interface) purposes. Spring Smoother – The spring smoother is a time-based smoother ideal for gaming purposes. Create a Stabilizer smoother instance for single floats. The stabilizer keeps the smoothed data point stable as long as it has not moved more than a given threshold. The stabilizer smoother strength, default value is 0.5f. The stabilizer smoother radius in correlation to the input unit value. An object of the created Smoother, or null in case of illegal arguments. Create the Weighted algorithm for single floats. The Weighted algorithm applies a (possibly) different weight to each of the previous data samples. If the weights vector is not assigned (NULL), all the weights will be equal (1/numWeights). The Weighted smoother weight values. An object of the created Smoother, or null in case of illegal arguments. Create the Quadratic algorithm for single floats. The Quadratic smoother smooth strength, default value is 0.5f. An object of the created Smoother, or null in case of illegal arguments. Create the Spring algorithm for single floats. The Spring smoother smooth strength, default value is 0.5f. An object of the created Smoother, or null in case of illegal arguments. Create a Stabilizer smoother instance for 2-dimensional points. The stabilizer keeps the smoothed data point stable as long as it has not moved more than a given threshold. The stabilizer smoother strength, default value is 0.5f. The stabilizer smoother radius in correlation to the input unit value. An object of the created Smoother, or null in case of illegal arguments. Create the Weighted algorithm for 2-dimensional points. The Weighted algorithm applies a (possibly) different weight to each of the previous data samples. If the weights vector is not assigned (null), all the weights will be equal (1/numWeights). The 
Weighted smoother weight values. An object of the created Smoother, or null in case of illegal arguments. Create the Quadratic algorithm for 2-dimensional points. The Quadratic smoother smooth strength, default value is 0.5f. An object of the created Smoother, or null in case of illegal arguments. Create the Quadratic algorithm for 2-dimensional points. An object of the created Smoother. Create the Spring algorithm for 2-dimensional points. The Spring smoother smooth strength. An object of the created Smoother, or null in case of illegal arguments. Create the Spring algorithm for 2-dimensional points. An object of the created Smoother. Create a Stabilizer smoother instance for 3-dimensional points. The stabilizer keeps the smoothed data point stable as long as it has not moved more than a given threshold. The stabilizer smoother strength, default value is 0.5f. The stabilizer smoother radius in correlation to the input unit value. An object of the created Smoother, or null in case of illegal arguments. Create a Stabilizer smoother instance for 3-dimensional points. The stabilizer keeps the smoothed data point stable as long as it has not moved more than a given threshold. The stabilizer smoother radius in correlation to the input unit value. An object of the created Smoother. Create the Weighted algorithm for 3-dimensional points. The Weighted algorithm applies a (possibly) different weight to each of the previous data samples. If the weights vector is not assigned (null), all the weights will be equal (1/numWeights). The Weighted smoother weight values. An object of the created Smoother, or NULL in case of illegal arguments. Create the Weighted algorithm for 3-dimensional points. The Weighted algorithm applies a (possibly) different weight to each of the previous data samples. If the weights vector is not assigned (NULL), all the weights will be equal (1/numWeights). The Weighted smoother number of weights. An object of the created Smoother, or NULL in case of illegal 
arguments. Create the Quadratic algorithm for 3-dimensional points. The Quadratic smoother smooth strength, default value is 0.5f. An object of the created Smoother, or null in case of illegal arguments. Create the Quadratic algorithm for 3-dimensional points. An object of the created Smoother, or NULL in case of illegal arguments. Create the Spring algorithm for 3-dimensional points. The Spring smoother smooth strength, default value is 0.5f. An object of the created Smoother, or NULL in case of illegal arguments. Create the Spring algorithm for 3-dimensional points. An object of the created Smoother. @class Smoother1D Handles the smoothing of a stream of floats, using a specific smoothing algorithm. Add a new data sample to the smoothing algorithm. The latest data sample. Smoothed value of Single type. Reset smoother algorithm data. Add a new data sample to the smoothing algorithm. The latest data sample. Smoothed value in PXCMPointF32 format. Reset smoother algorithm data. Add a new data sample to the smoothing algorithm. The latest data sample. Smoothed value in PXCMPoint3DF32 format. Reset smoother algorithm data. The function returns the available algorithm configurations. The zero-based index to retrieve all algorithm configurations. The algorithm configuration, to be returned. PXCM_STATUS_NO_ERROR Successful execution. PXCM_STATUS_ITEM_UNAVAILABLE There are no more configurations. The function returns the working algorithm configurations. The algorithm configuration, to be returned. PXCM_STATUS_NO_ERROR Successful execution. The function sets the working algorithm configurations. The algorithm configuration. PXCM_STATUS_NO_ERROR Successful execution. The function builds the recognition grammar from the list of strings. The grammar identifier. Can be any non-zero number. The string list. Optional list of labels. If not provided, the labels are 1...ncmds. The number of strings in the string list. PXCM_STATUS_NO_ERROR Successful execution. 
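The command-list grammar described above pairs each command string with a numeric label, and when no labels are supplied they default to 1..ncmds. A minimal sketch of that pairing rule (illustrative only; this is not the SDK call, which builds the grammar inside the recognition engine):

```python
def build_grammar(cmds, labels=None):
    """Pair each command string with a label, mirroring the described
    rule: if labels is not provided, they default to 1..ncmds."""
    if labels is None:
        labels = list(range(1, len(cmds) + 1))
    if len(labels) != len(cmds):
        raise ValueError("labels must match the number of commands")
    # The recognizer would later report the label of the matched command.
    return dict(zip(labels, cmds))
```

When recognition succeeds on a command-list grammar, it is this label (not the string) that comes back in the NBest data, so choosing stable non-zero labels makes result handling simpler.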
The function deletes the specified grammar and releases any resources allocated. The grammar identifier. PXCM_STATUS_NO_ERROR Successful execution. PXCM_STATUS_ITEM_UNAVAILABLE The grammar is not found. The function sets the active grammar for recognition. The grammar identifier. PXCM_STATUS_NO_ERROR Successful execution. PXCM_STATUS_ITEM_UNAVAILABLE The grammar is not found. The function sets the dictation recognition mode. The function may take some time to initialize. PXCM_STATUS_NO_ERROR Successful execution. The function starts voice recognition. The audio source. The callback handler instance. PXCM_STATUS_NO_ERROR Successful execution. The function stops voice recognition. The function creates a grammar from a file. The grammar identifier. Can be any non-zero number. The file type from the GrammarFileType structure. The full path to the file. PXC_STATUS_NO_ERROR Successful execution. PXC_STATUS_EXEC_ABORTED Incorrect file extension. The function creates a grammar from memory. The grammar identifier. Can be any non-zero number. The file type from the GrammarFileType structure. The grammar specification. The size of the grammar specification. PXC_STATUS_NO_ERROR Successful execution. PXC_STATUS_EXEC_ABORTED Incorrect file type. PXC_STATUS_HANDLE_INVALID Incorrect memSize, or grammarMemory equals NULL. The function gets an array with the error. The grammar identifier. Can be any non-zero number. pxcCHAR * NULL-terminated array with the error, or NULL in case of an internal error. The function adds a file with a vocabulary. The vocabulary file type. The full path to the file. PXC_STATUS_NO_ERROR Successful execution. NBest The NBest data structure describes the NBest data returned from the recognition engine. The label that refers to the recognized speech (command list grammars only). The confidence score of the recognition: 0-100. The recognized sentence text data. The (grammar) tags of the recognized utterance. RecognitionData The data structure describes the recognized speech data. 
The time stamp of the recognition, in 100ns. The grammar identifier for command and control, or zero for dictation. The duration of the speech, in ms. The top-N recognition results. AlertType Enumerate all supported alert events. The volume is too high. The volume is too low. Too much noise. There is some speech available but it is not recognizable. The beginning of a speech. The end of a speech. The recognition is aborted due to a device being lost, an engine error, etc. The recognition is completed. The audio source no longer provides data. AlertData Describe the alert parameters. The time stamp of when the alert occurs, in 100ns. The alert event label. LanguageType Enumerate all supported languages. Describe the algorithm configuration parameters. The optional speaker name for adaptation. The supported language. The length of end-of-sentence silence in ms. The recognition confidence threshold: 0-100. GrammarFileType Enumerate all supported grammar file types. Unspecified type, use filename extension. Text file, list of commands. Java Speech Grammar Format. Previously compiled format (vendor specific). VocabFileType Enumerate all supported vocabulary file types. Unspecified type, use filename extension. Text file. The function is invoked when there is some speech recognized. The data structure to describe the recognized speech. The function is triggered by any alert event. The data structure to describe the alert. The function retrieves the synthesized speech as a Unity AudioClip object. The audio clip name. The sentence identifier. The AudioClip instance, or null if there is any error. The function returns the available algorithm configuration parameters. The zero-based index to retrieve all configuration parameters. The configuration parameters, to be returned. PXCM_STATUS_NO_ERROR Successful execution. PXCM_STATUS_ITEM_UNAVAILABLE No more configurations. The function returns the current working algorithm configuration parameters. The configuration parameters, to be returned. 
PXCM_STATUS_NO_ERROR Successful execution. The function sets the current working algorithm configuration parameters. The configuration parameters. PXCM_STATUS_NO_ERROR Successful execution. The function synthesizes the sentence for later use. The function may take some time to generate the fully synthesized speech. The sentence identifier. Can be any non-zero unique number. The sentence string. PXC_STATUS_NO_ERROR Successful execution. The function retrieves the PXCAudio buffer for the specified sentence. There could be more than one PXCAudio buffer. The application should keep retrieving with an increased index, until the function returns NULL. The audio buffer is internally managed. Do not release the instance. The sentence identifier. The zero-based index to retrieve multiple samples. The Audio buffer, or NULL if there are no more. The function returns the number of PXCAudio buffers used for the specified synthesized sentence. The sentence identifier. The number of PXCAudio buffers, or 0 if the sentence is not found. The function returns the number of audio samples for the specified synthesized sentence. Each audio sample consists of multiple channels according to the format definition. The sentence identifier. The sample number, or 0 if the sentence is not found. The function releases any resources allocated for the sentence identifier. The sentence identifier. LanguageType Enumerate all supported languages. VoiceType Enumerate all supported voices. ProfileInfo Describe the algorithm configuration parameters. The synthesized audio format. Adjust bufferSize for the required latency. The supported language. The voice. The speaking speed. The default is 100. Smaller is slower and bigger is faster. The speaking volume from 0 to 100 (loudest). The default pitch is 100; range [50 to 200]. End-of-sentence wait duration; range [0 to 9] multiplied by 200 msec. The function synchronizes a single SP with a timeout. The timeout value in ms. PXCM_STATUS_NO_ERROR Successful execution. 
PXCM_STATUS_EXEC_TIMEOUT The timeout value is reached. The function synchronizes a single SP infinitely. PXCM_STATUS_NO_ERROR Successful execution. The function synchronizes multiple SPs as well as OS events. Zero SPs or OS events are skipped automatically. If the idx argument is NULL, the function waits until all events are signaled. If the idx argument is not NULL, the function waits until any of the events is signaled and returns the index of the signaled event. The number of SPs to be synchronized. The SP array. The event index, to be returned. The timeout value in ms. PXCM_STATUS_NO_ERROR Successful execution. PXCM_STATUS_EXEC_TIMEOUT The timeout value is reached. The function synchronizes multiple SPs. Zero SPs are skipped automatically. If the idx argument is NULL, the function waits until all events are signaled. If the idx argument is not NULL, the function waits until any of the events is signaled and returns the index of the signaled event. The SP array. PXCM_STATUS_NO_ERROR Successful execution. PXCM_STATUS_EXEC_TIMEOUT The timeout value is reached. The function synchronizes multiple SPs. Zero SPs are skipped automatically. If the idx argument is NULL, the function waits until all events are signaled. If the idx argument is not NULL, the function waits until any of the events is signaled and returns the index of the signaled event. The SP array. The event index, to be returned. PXCM_STATUS_NO_ERROR Successful execution. PXCM_STATUS_EXEC_TIMEOUT The timeout value is reached. The function synchronizes multiple SPs. Zero SPs are skipped automatically. If the idx argument is NULL, the function waits until all events are signaled. If the idx argument is not NULL, the function waits until any of the events is signaled and returns the index of the signaled event. The SP array. PXCM_STATUS_NO_ERROR Successful execution. PXCM_STATUS_EXEC_TIMEOUT The timeout value is reached. 
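The wait-all versus wait-any behavior described above can be sketched with plain threading events. This is a conceptual model of the documented semantics, not the SDK's SP implementation; the polling loop in the wait-any branch is an assumption made for simplicity.

```python
import threading
import time

def synchronize(events, idx=None, timeout=None):
    """Sketch of the described semantics: if idx is None, wait until
    all events are signaled; otherwise wait until any one event is
    signaled and store its index in idx[0]. Returns True on success,
    False on timeout (the analogue of PXCM_STATUS_EXEC_TIMEOUT)."""
    deadline = None if timeout is None else time.monotonic() + timeout
    if idx is None:
        # Wait-all: each event must be signaled before the deadline.
        for e in events:
            remaining = None if deadline is None else deadline - time.monotonic()
            if not e.wait(remaining):
                return False
        return True
    # Wait-any: report the index of the first signaled event.
    while True:
        for i, e in enumerate(events):
            if e.is_set():
                idx[0] = i
                return True
        if deadline is not None and time.monotonic() >= deadline:
            return False
        time.sleep(0.001)
```

The key contract mirrored here is that passing an index output switches the call from "all events" to "any event" semantics, with the timeout applying in both modes.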
A convenient function to release an array of objects. @brief Return the configuration parameters of the SDK's TouchlessController application. @param[out] pinfo the profile info structure of the configuration parameters. @return PXC_STATUS_NO_ERROR if the parameters were returned successfully; otherwise, return one of the following errors: PXC_STATUS_ITEM_UNAVAILABLE - Item not found/not available.\n PXC_STATUS_DATA_NOT_INITIALIZED - Data failed to initialize.\n Set configuration parameters of the SDK TouchlessController application. The profile info structure of the configuration parameters. PXC_STATUS_NO_ERROR if the parameters were set correctly; otherwise, return one of the following errors: PXC_STATUS_INIT_FAILED - Module failure during initialization.\n PXC_STATUS_DATA_NOT_INITIALIZED - Data failed to initialize.\n Adds a gesture action mapping. Name of the gesture. The action to perform when the gesture is recognized. The pxcmStatus. Register an event handler for UX Events. The event handler will be called each time a UX event is identified. A delegate event handler. PXC_STATUS_NO_ERROR if registering the event handler was successful; otherwise, return the following error: PXC_STATUS_DATA_NOT_INITIALIZED - Data failed to initialize. Unsubscribe an event handler for UX events. A delegate event handler that should be removed. PXC_STATUS_NO_ERROR if unregistering the event handler was successful, an error otherwise. Register an event handler for alerts. The event handler will be called each time an alert is identified. A delegate event handler. PXC_STATUS_NO_ERROR if registering the event handler was successful; otherwise, return the following error: PXC_STATUS_DATA_NOT_INITIALIZED - Data failed to initialize. Unsubscribe an event handler for alerts. A delegate event handler that should be removed. PXC_STATUS_NO_ERROR if unregistering the event handler was successful, an error otherwise. Adds a gesture action mapping. Name of the gesture. 
The action to perform when the gesture is recognized. actionHandler will be called when the action is performed. The pxcmStatus. Clear all previous Gesture to Action mappings. ProfileInfo Containing the parameters that define a TouchlessController session. An OR'ed value of configuration options. Configuration An OR'ed value of UX options relevant to a specific application. No option is selected - use default behavior. Should zoom be allowed. Use draw mode - should be used for applications that need continuous interaction (touch + movement), like drawing. Enable horizontal scrolling. Enable vertical scrolling. On a "V" gesture, enables the *meta* UXEvents. Causes the OS to simulate the gesture with the appropriate injection. Enable horizontal scrolling. Enable vertical scrolling. Enable the Back Gesture. Enable the Selection Gesture. If enabled, TouchlessController will stop tracking the hand while the mouse moves. Describe a UXEvent. UXEventType Values that represent UXEventType. An alert data structure, containing data describing an alert. Values that represent AlertType. Values that represent Action. These are actions the module will inject into the OS. Values that represent the Sensitivity level for the pointer movement. Returns TRUE if the current state is actively tracking (valid pose information is available). Set the camera parameters, which can be the result of camera calibration from the toolbox. Add a 2D reference image for tracking an object. Path to the image file. Coordinate system ID of the added target. Image width in mm (optional). Image height in mm (optional). Minimal similarity measure [0..1] that has to be fulfilled for the image or one of its sub-patches. Use features from the environment to improve tracking. Add a 2D reference image for tracking an object. Target image data. Coordinate system ID of the added target. Image width in mm (optional). Image height in mm (optional). Minimal similarity measure [0..1] that has to be fulfilled for the image or one of its sub-patches. 
Use features from the environment to improve tracking. Add a 3D tracking configuration for a target. This file can be generated with the Toolbox. The full path to the configuration file (*.slam, *.xml). Coordinate system ID of the first added target. Coordinate system ID of the last added target (may be the same as firstCosID). Use features from the environment to improve tracking. Remove a previously returned cosID from tracking. Remove all previously created cosIDs from tracking. Enable instant 3D tracking (SLAM). This form of tracking does not require an object model to be previously created and loaded. Specify the coordinate system origin and orientation of the tracked object. true uses the first image captured from the camera; false (default) uses the "main plane" of the scene, which is determined heuristically. Instant tracking may fail to initialize correctly if the camera image has not stabilized or is not pointing at the desired object when the first frames are processed. This parameter skips the initial frames, which may have automatic adjustments such as contrast occurring. This parameter may be 0 if instant 3D tracking should initialize from the next frame. Get the number of targets currently tracking. See QueryTrackingValues, QueryAllTrackingValues. The number of active tracking targets. Get information for all of the active tracking targets. Pointer to store the tracking results at. The passed-in block must be at least QueryNumberTrackingValues() elements long. Return information for a particular coordinate system ID. This value can be returned from Set2DTrackFromFile(), Set2DTrackFromImage(), or Set3DTrack(). Coordinate system IDs for Set3DInstantTrack() are generated dynamically as targets are determined in the scene. The coordinate system ID to return the status for. The returned tracking values. The user needs to manage the mapping between the cosIDs and the loaded targets. The tracking states of a target. 
The state of a target usually starts with ETS_NOT_TRACKING. When it is found in the current camera image, the state changes to ETS_FOUND for one image; the following images, where the location of the target is successfully determined, will have the state ETS_TRACKING. Once the tracking is lost, there will be one single frame ETS_LOST, then the state will be ETS_NOT_TRACKING again. In case extrapolation of the pose is requested, the transition may be from ETS_TRACKING to ETS_EXTRAPOLATED. To sum up, these are the state transitions to be expected: ETS_NOT_TRACKING -> ETS_FOUND, ETS_FOUND -> ETS_TRACKING, ETS_TRACKING -> ETS_LOST, ETS_LOST -> ETS_NOT_TRACKING. With additional extrapolation, these transitions can occur as well: ETS_TRACKING -> ETS_EXTRAPOLATED, ETS_EXTRAPOLATED -> ETS_LOST. "Event-States" do not necessarily correspond to a complete frame but can be used to flag individual tracking events or replace tracking states to clarify their context: ETS_NOT_TRACKING -> ETS_REGISTERED -> ETS_FOUND for edge-based initialization. Quality of the tracking values. Value between 0 and 1 defining the tracking quality. A higher value means better tracking results. More specifically: - 1 means the system is tracking perfectly. - 0 means that we are not tracking at all. Time elapsed (in ms) since the last state change of the tracking system. Time (in milliseconds) used for tracking the respective frame. The ID of the target object. The name of the target object. Extra space for information provided by a sensor that cannot be expressed with translation and rotation properly. The sensor that provided the values. The translation component of the pose projected onto the color image, in pixels. Interactive version of map creation, similar to the toolbox functionality. Depth is automatically used if supported in the current camera profile. Map creation is stopped either explicitly with \c Cancel3DMapCreation or by pausing the tracking module using pSenseManager->PauseTracker(TRUE). 
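The tracking-state transitions enumerated earlier can be captured in a small transition table. The ETS_* names come from the text; the validator itself is an illustrative sketch (self-transitions such as staying in ETS_TRACKING across frames are implicit and not listed as transitions here).

```python
# Allowed tracking-state transitions, as enumerated in the text,
# including the optional extrapolation path and the edge-based
# initialization event-state ETS_REGISTERED.
TRANSITIONS = {
    "ETS_NOT_TRACKING": {"ETS_FOUND", "ETS_REGISTERED"},
    "ETS_REGISTERED":   {"ETS_FOUND"},
    "ETS_FOUND":        {"ETS_TRACKING"},
    "ETS_TRACKING":     {"ETS_LOST", "ETS_EXTRAPOLATED"},
    "ETS_EXTRAPOLATED": {"ETS_LOST"},
    "ETS_LOST":         {"ETS_NOT_TRACKING"},
}

def is_valid_transition(old, new):
    """Return True if the state machine described above permits
    the transition old -> new."""
    return new in TRANSITIONS.get(old, set())
```

A client consuming per-frame states can use such a table to detect events: entering ETS_FOUND means the target was just acquired, and the single ETS_LOST frame is the moment to clear any pose-dependent UI.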
The map file may be saved at any time with Save3DMap. See also Cancel3DMapCreation, Save3DMap, QueryNumberFeaturePoints, QueryFeaturePoints. Relative size of the object to create a map for; an accurate value helps improve the initialization time. Cancel map creation without saving a file, resetting the internal state. See also Start3DMapCreation. Begins extending a previously created 3D Map with additional feature points. Map extension is an interactive process only. The extended map may be saved using \c Save3DMap at any time. See also Load3DMap. Cancel map extension without saving a file, and reset the internal state. See also Start3DMapExtension. Load a 3D Map from disk in preparation for map extension or alignment operations. Name of the file to be loaded. Save a 3D Map. Maps must be saved to disk for further usage; it is not possible to generate a map in memory and use it for tracking or extension later. Name of the file to be saved. Returns the number of detected feature points during map creation. See also QueryFeaturePoints. Retrieve the detected feature points for map creation. Active points are ones which have been detected in the current frame; inactive points were detected previously but are not detected in the current frame. See also QueryNumberFeaturePoints, Start3DMapCreation, Start3DMapExtension. Array where the feature points will be stored. Return the active (currently tracked) features in the array. Return the inactive (not currently tracked) features in the array. The number of feature points copied into points. Aligns a loaded 3D map to the specified marker. Alignment defines the initial pose of the model relative to the axes printed on the marker (+Z points up out of the page). By default, the coordinate system pose (origin and rotation) is in an undefined position with respect to the object. The placement of the marker specifies the (0,0,0) origin as well as the alignment of the coordinate axes (initial rotation). 
Alignment also enhances the returned pose coordinates to be in units of millimeters, instead of an undefined unit. See also Stop3DMapAlignment, Is3DMapAlignmentComplete. Integer identifier for the marker (from the marker PDF). Size of the marker in millimeters. Cancel the current 3D Map alignment operation before it is complete. Any in-progress state will be lost. See also Start3DMapAlignment, Is3DMapAlignmentComplete. Returns TRUE if alignment is complete. At that point the file may be saved with the new alignment values. See also Start3DMapAlignment. Start the camera calibration process. Calibration can improve the tracking results by compensating for camera distortion and other intrinsic camera values. A successful calibration requires several frames, with the marker in different orientations and rotations relative to the camera. See also QueryCalibrationProgress, SaveCameraParameters. Integer identifier for the marker (from the marker PDF file). Size of the printed marker in millimeters. Stop the camera calibration process before it is complete. No new calibration parameters may be saved. See also StartCameraCalibration, QueryCalibrationProgress. Return the calibration progress as a percentage (0 - 100%). Calibration requires several different views of the marker to produce an accurate result; this function returns the relative progress. A calibration file may be saved before this function returns 100%, but the quality will be degraded. If calibration has not been started, this function returns a negative value. See also StartCameraCalibration. Save the current camera intrinsic parameters to an XML file. See also SetCameraParameters, StartCameraCalibration. Filename to save the XML camera parameters in. The relative size of a target object. Specifying the appropriate size helps improve the training initialization process. Cup sized. Desktop sized. Room sized. Special-purpose cosIDs which may be passed in to QueryTrackingValues. 
These values may be used to get the current tracking position of the map creation operations. See also PXCMTracker.QueryTrackingValues. Pose of the detected calibration marker. New pose of the tracked object based on the alignment marker. @brief Return the available module input descriptors. @param[in] pidx The zero-based index used to retrieve all configurations. @param[out] inputs The module input descriptor, to be returned. @return PXCM_STATUS_NO_ERROR Successful execution. @return PXCM_STATUS_ITEM_UNAVAILABLE The specified input descriptor is not available. @brief Pass the projection object for mappings between the color and depth coordinate systems. @param[in] projection The projection object. @brief Enables GPU processing controls. @param[in] enable is a bool that enables a specific taskId on the GPU. @param[in] taskId provides more fine-grained control on which task would be enabled on the GPU. The default is -1, meaning that all tasks are enabled on the GPU. Return the active input descriptor that the module works on. The module input descriptor, to be returned. PXCM_STATUS_NO_ERROR Successful execution. Set the active input descriptor with the device information from the capture device. The input descriptor with the device information. PXCM_STATUS_NO_ERROR Successful execution. Feed captured samples to the module for processing. If the samples are not available immediately, the function will register to run the module processing when the samples are ready. This is an asynchronous function. The application must synchronize the returned SP before retrieving any module data, which is not available during processing. The samples from the capture device. The SP, to be returned. PXCM_STATUS_NO_ERROR Successful execution. @brief Pass the projection object for mappings between the color and depth coordinate systems. @param[in] projection The projection object. @brief Enables GPU processing controls. @param[in] enable is a bool that enables a specific taskId on the GPU. 
@param[in] taskId provides more fine-grained control on which task would be enabled on the GPU. The default is -1, meaning that all tasks are enabled on the GPU. struct Describes a pair of a device property and its value. Use the inline functions to access specific device properties. struct Describes the streams requested by a module implementation. StreamDescSet A set of stream descriptors accessed by StreamType. Access the stream descriptor by the stream type. The stream type. The stream descriptor instance. struct Data descriptor to describe the module input needs. A type representing a 3D box with pxcF32 values. This enumeration defines various return codes that SDK interfaces use. Negative values indicate errors, a zero value indicates success, and positive values indicate warnings. Unsupported feature. Unsupported parameter(s). Item not found/not available. Invalid session, algorithm instance, or pointer. Memory allocation failure. Acceleration device failed/lost. Acceleration device lost. Acceleration device busy. Execution aborted due to errors in upstream components. Asynchronous operation is in execution. Operation timed out. Failure to open a file in WRITE mode. Failure to open a file in READ mode. Failure to close a file handle. Data not available for MW model or processing. Data failed to initialize. Module failure during initialization. Configuration for the stream has changed. Parameter cannot be changed, since the configuration for capturing has already been set. Mismatched coordinate system between modules. Calibration values not matching. Acceleration unsupported or unavailable. Time gap in time stamps. The same parameters are already defined. Data not changed (no new data available). Module failure during processing. Data value(s) out of range. Not all data was copied; more data is available for fetching.
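The sign convention stated above (negative values are errors, zero is success, positive values are warnings) lends itself to a simple classifier. Only the sign convention comes from the text; the function itself is an illustrative sketch, and no specific numeric code values are assumed.

```python
def classify_status(code):
    """Classify an SDK status code by the documented sign convention:
    negative values are errors, zero is success, and positive values
    are warnings."""
    if code < 0:
        return "error"
    if code == 0:
        return "success"
    return "warning"
```

This convention means a caller can treat `status >= 0` as "safe to continue" while still logging positive codes, such as the data-not-changed or partial-copy warnings, for diagnostics.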