Intel’s Pittsburgh Research Lab opened its doors this week for a tour of the fascinating exploratory research it is doing on future technologies, including a natural gesture interface for games built as a novel application of SLIPstream parallelization techniques.
The Pittsburgh lab demonstrated this interface with a head-to-head Tetris-style game, where the players use whole body gestures to control the motion of their pieces.
Unlike typical approaches to gesture detection that employ props, special clothing or markers (as in motion-capture systems), or a controlled environment (such as a blue screen), the Intel approach is designed to work in everyday environments and does not require users to be segmented from the background. Although the technique is computationally expensive, the researchers have achieved interactive speeds by parallelizing the vision algorithm across a cluster of machines in a manner that minimizes latency.
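The low-latency clustering idea can be illustrated with a small sketch: rather than pipelining whole frames (which adds a full frame of delay per stage), each frame is split into tiles that are processed concurrently, so per-frame latency shrinks as workers are added. This is a hypothetical illustration of the general technique, not Intel's actual SLIPstream implementation; `score_tile` stands in for whatever per-pixel feature computation the real vision algorithm performs.

```python
# Hypothetical sketch of intra-frame data parallelism for a vision task.
# Not Intel's SLIPstream code: the tile computation and frame layout
# here are invented stand-ins for illustration.
from concurrent.futures import ProcessPoolExecutor


def score_tile(tile):
    # Stand-in for an expensive per-pixel computation (e.g. feature
    # extraction for gesture recognition).
    return sum(pixel * pixel for row in tile for pixel in row)


def split_into_tiles(frame, n_tiles):
    # Divide the frame's rows into contiguous horizontal bands.
    rows_per_tile = max(1, len(frame) // n_tiles)
    return [frame[i:i + rows_per_tile]
            for i in range(0, len(frame), rows_per_tile)]


def process_frame(frame, n_workers=4):
    tiles = split_into_tiles(frame, n_workers)
    with ProcessPoolExecutor(max_workers=n_workers) as pool:
        # All tiles of THIS frame run in parallel, and results are merged
        # before the next frame begins: latency drops with worker count,
        # unlike frame-level pipelining, which improves only throughput.
        return sum(pool.map(score_tile, tiles))


if __name__ == "__main__":
    # A toy 32x64 "frame" of small integer pixel values.
    frame = [[x % 7 for x in range(64)] for _ in range(32)]
    print(process_frame(frame))
```

The design point being illustrated is the latency/throughput trade-off: distributing work within a frame keeps the interface responsive, which matters for a real-time game in a way that raw throughput alone does not.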