Blog posts from category: Observational Research
News from the development department:
Mangold VideoSyncPro now supports video recordings at 120 frames per second
Suppose you have multiple coders who collected their observations in different data files, or you collected your observations in a separate data file per video. Analyzing such data becomes much easier if all observations belonging to the same study are merged into one big file. That way you can visualize all data at once and get comparative statistics within a single table. Read more
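The idea of merging per-coder logs can be sketched in a few lines of plain Python. The record layout below (coder, video, time, code) is an assumption for illustration only, not INTERACT's actual file format:

```python
# Hypothetical example: merge per-coder observation logs into one table.
# In practice each list would be read from a separate exported data file.
coder_a = [
    {"coder": "A", "video": "v1.mp4", "time_s": 2.0, "code": "smile"},
    {"coder": "A", "video": "v1.mp4", "time_s": 5.5, "code": "gaze"},
]
coder_b = [
    {"coder": "B", "video": "v1.mp4", "time_s": 3.1, "code": "smile"},
]

# Concatenate all logs and sort by time, so the merged file reads
# like one chronological observation stream for the whole study.
merged = sorted(coder_a + coder_b, key=lambda e: e["time_s"])
for event in merged:
    print(event["time_s"], event["coder"], event["code"])
```

With all events in one sorted structure, comparative statistics across coders become a simple grouping step.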
Data collection is often a laborious task, but exploring the data to gain further insights is where the fun starts. The first step is to look at descriptive statistics such as frequency, duration, percentage over time, and latency. With INTERACT, these results are available within a few mouse clicks. Read more
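To make those four statistics concrete, here is a minimal sketch in plain Python. The event format (a code with start and end times in seconds) and the session length are assumptions for illustration, not the software's data model:

```python
# Hypothetical sketch: descriptive statistics for coded behavior events.
# Assumed format: each event has a code plus start/end times in seconds.
events = [
    {"code": "smile", "start": 1.0, "end": 3.0},
    {"code": "smile", "start": 10.0, "end": 11.5},
    {"code": "gaze",  "start": 4.0, "end": 9.0},
]
session_length = 20.0  # total observed time in seconds (assumed)

stats = {}
for e in events:
    s = stats.setdefault(e["code"], {"frequency": 0, "duration": 0.0})
    s["frequency"] += 1                      # how often the code occurred
    s["duration"] += e["end"] - e["start"]   # total time the code was active

for code, s in stats.items():
    # Share of the whole session spent in this behavior.
    s["percent_of_time"] = 100.0 * s["duration"] / session_length
    # Latency: time from session start to the first occurrence of the code.
    s["latency"] = min(e["start"] for e in events if e["code"] == code)

print(stats)
```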
The software Mangold INTERACT is used in numerous scientific studies for the qualitative and quantitative analysis of observation videos. The following list shows how diverse the application areas of video analysis are.
These documents were collected via Google in May 2017.
Integrate high-end EEG recordings from BRAIN PRODUCTS equipment with video-based observations in a single study, and gain deeper insights by combining behavioral observation with EEG measurement. Read more
A Dynamic Systems Analysis of Children with and without Autism Spectrum Disorder - a study by Yuqing Guo, Dana Rose Garfin, Agnes Ly, and Wendy A. Goldberg, published in the Journal of Abnormal Child Psychology.
A Dynamic Systems Analysis of Children with and without Autism Spectrum Disorder - the study was also presented at the poster session of SRCD 2017.
We are very proud that the researchers used Mangold INTERACT for the emotion engagement coding scheme.
This study by Lauren B. Adamson, Roger Bakeman, Katharine Suma, and Diana L. Robins from Georgia State University provides an unprecedented view of how toddlers react to and share speech, music, and environmental sounds.
We proudly publish the poster of the intensive frame-by-frame analysis by Yonat Rum, Ditza A. Zachor, and Esther Dromi on sibling interactions, presented at SRCD 2017 in Austin.
We show you how to use Mangold INTERACT to compare automatically detected emotions with manually collected observations, and how to integrate full natural-language transcriptions during observation.
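One simple way to compare an automatic emotion stream with manual codings is percent agreement over a common time grid. The one-label-per-second sampling and the labels below are assumptions for illustration, not INTERACT's actual comparison method:

```python
# Hypothetical sketch: percent agreement between an automatic emotion
# detector and a human coder, sampled on a shared 1-second grid.
automatic = ["happy", "happy", "neutral", "sad", "sad"]  # one label per second
manual    = ["happy", "neutral", "neutral", "sad", "sad"]

matches = sum(a == m for a, m in zip(automatic, manual))
agreement = 100.0 * matches / len(manual)
print(f"Agreement: {agreement:.1f}%")
```

For real studies, a chance-corrected measure such as Cohen's kappa is usually preferred over raw agreement.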