This episode features Peter Hartzbech of iMotions, who discusses the company’s biosensor software platform for studying human behavior. Key points:
- iMotions provides a software platform that uses biosensors like eye tracking, galvanic skin response, EEG, voice analysis, and facial expression analysis to study human behavior in both academic and commercial research.
- The company serves diverse sectors including academia, consumer insights, and human factors, offering tools for understanding human-to-human and human-to-machine interactions.
- iMotions helps brands with early-stage R&D, packaging design, and shopper insights using techniques that go beyond traditional surveys to measure emotional responses.
- By integrating Affectiva’s technology, iMotions can now offer scalable solutions for market research, including facial expression analysis, head pose, and attention metrics.
- iMotions is working towards a SaaS model where clients can easily conduct online studies with features like webcam-based eye tracking and respiration measurement.
- The company focuses on providing high-quality raw data and proven methodologies, allowing clients to integrate various behavioral science concepts into their research.
- iMotions has a strong global foothold in academia, working with more than 70 of the world’s top 100 universities.
Peter Hartzbech – Key Moments
00:00 Behavioral Analysis in Various Fields
05:14 Driving Behavior and Monitoring Systems
06:58 “Multi-Modal Sensory Analysis”
12:36 Neuromarketing and Sensory Research
15:22 Smart Eye Merges for Enhanced In-Cabin Tech
19:41 Webcam-Based Eye Tracking Advances
22:57 Multimodal Insights via Webcam Analysis
25:40 Gen AI: Quick Adoption vs. Informed Decisions
27:50 Open, Transparent Tech Integration
Peter Hartzbech Quotes
- On emotion vs. rational thought: “…a lot of the decisions are based on the emotional response and the emotional impact and not on the rational thought process…” [00:07:50]
- On the importance of research methodology: “I think that you still need to use the right research methodology. You need to know what your research question is before you can answer it.” [00:09:06]
- On the value of identifying what not to do: “…I think it’s also a lot about picking the losers, so to say, which ones should you not put into the market?” [00:12:36]
- About iMotions’ approach to data: “Our approach has always been that we want to have high-quality raw data… So we want to have very scientifically proven data that’s properly synchronized and precise.” [00:27:06]
Peter Hartzbech
Peter Hartzbech is the Founder & CEO of iMotions, a company based in Copenhagen that specializes in biosensor software designed for the analysis of complex human behaviors. Driven by a desire to use technology to improve the world, Hartzbech established iMotions at the age of 25. The iMotions platform is used by over 1,300 organizations to study human behavior and cognition, diagnose neurological diseases, improve customer experiences and learning environments, and enhance overall human well-being. In October 2021, iMotions joined Smart Eye, a leader in Human Insight AI. Hartzbech is also a Business Angel Investor in start-ups and a founding member of the ByFounders Fund.
Peter Hartzbech Resources
Twitter/X: @Hartzbech
LinkedIn: https://www.linkedin.com/in/peter-hartzbech-the-entrepreneurial-gladiator/
Website: https://imotions.com
If you like Brainfluence…
- Never miss an episode by subscribing via iTunes, Stitcher, or RSS
- Help improve the show by leaving a Rating & Review in iTunes (Here’s How)
- Join the discussion for this episode in the comments section below
Full Transcript:
Full Episode Transcript PDF: Click HERE
Roger Dooley [00:00:00]:
So today I’m with Peter Hartzbech of iMotions. Peter, can you please explain what iMotions is and does, for those parts of our audience that may not be familiar with the company?
Peter Hartzbech [00:00:15]:
Thank you, Roger. So basically, at iMotions we work with biosensors. We started out as an eye tracking company; most of you of course know Tobii and the other eye tracking companies out there. And then we moved into what we call multimodal research, so we work with different sensors and different signals: galvanic skin response, EEG, voice analysis and facial expression analysis. Basically, we are a software platform for researchers, both academic researchers but also commercial researchers. We have approximately 1,500 clients in 90 countries across the academic and commercial fields that use our platform.
Roger Dooley [00:00:58]:
And how do these different customer groups use your technology? I mean, you’ve got academics, you’ve got brands, you’ve got healthcare. Very briefly, how would these different, actually very different, groups use your technologies?
Peter Hartzbech [00:01:14]:
Yeah, good question. It is really, really broad, right? Because basically you can do whatever kind of human behavior research you want to do, all human-to-human or human-to-machine. We kind of have three fundamental pillars, we say in the company. So we have the academic pillar I just mentioned, that’s university researchers. It could be, for example, negotiation research at the Sloan School of Management at MIT.
Peter Hartzbech [00:01:40]:
They use it to look at facial expressions while you negotiate, to see if there’s, for example, a brow furrow; you can see that this person is a bit irritated in the negotiation, or something like that. So business schools can use it for that, and also for marketing testing, right? All kinds of images, videos, websites, all kinds of marketing collateral. Of course that’s pretty normal to test in both marketing and business schools. We have engineering schools that use it for testing a new car design, for example, or how the door handle should sound; you know, in some cars, when you open the door, it should have a specific sound. So a lot of design in the engineering space. And then we have psychology and psychiatry, and I think in the last podcast we did, we talked about the original founding idea: we wanted to diagnose neurological diseases. Alzheimer’s, Parkinson’s, ADHD, autism in kids.
Peter Hartzbech [00:02:42]:
And that’s also a very big area for us. All the medical centers, like MGH and the Mayo Clinic, those kinds of clients are using it for fundamental research on measuring and diagnosing neurological diseases. So on the academic side, which is pillar one, there are tons of use cases, as you mentioned yourself. Then we have what we call Consumer Insights, which is also what’s relevant to the discussion we’re having today about Affectiva. On the iMotions side we basically help large brands, you know, the P&Gs, the Unilevers, those kinds of companies, that go in and test very early-stage R&D. It could be fragrances: how do people react to a fragrance or scent? So for example, if you produced a strawberry ice cream, then you would have, say, 10 different scents of strawberry with different intensities. And what people did before iMotions was just run a survey.
Peter Hartzbech [00:03:40]:
So it’s like, okay, do you like this scent or not, on a scale from 0 to 10, you know, the good old market research way. But what we do here is go in and use facial expressions, EEG and so on to measure the reactions and get a fuller understanding of the insights, so you can make better decisions about which scent to choose to optimize for your particular audience in the market. So that’s the consumer insights part. And it’s both R&D but also packaging, right? From the start of the package, how it should look and feel, up to shopper insights, where you can use eye tracking glasses with sensors to walk around the store and see how the package pops on the shelf against the competition, for example.
Peter Hartzbech [00:04:30]:
And then of course there’s the marketing side of it, all the collateral, all the websites and so on. And then the advertising part, which is what’s interesting about this new integration with Affectiva, which is basically the world-leading measurement of advertising with the facial coding they do on it. But we can come back to that. So that’s the consumer insights pillar. And then we have what we call the human factors pillar, which is complex driving simulator setups, a lot in the automotive space. It’s training simulation, where you have a dangerous situation on a drilling platform, for example. We can have people doing research where you can follow where people look.
Peter Hartzbech [00:05:14]:
Basically, if there’s a blowout or something, how do people react? You can train people in 3D environments, for example, where you also measure the sensors to see if they become stressed and so on. And then there are car simulators; it’s a very large area for us. Since we talked last time, we have grown a lot in the automotive space due to our acquisition by Smart Eye. So we help a lot in the R&D stage of building new cars, but also with driver monitoring systems. By law in the EU, all cars by 2026 actually need to have a driver monitoring system, so that’s scaling up right now, and we have more than 1.5 million cars driving on the road with our system. But what iMotions does for them is answer: how do you measure whether this system actually works or not? If you have to look left, if you look out the window, then it has to, for example, react after a while.
Peter Hartzbech [00:06:06]:
So that’s also used to validate that the driver monitoring system works. So yeah, that’s what we call the human factors pillar. To summarize shortly: we have academia, we have the consumer insights space, and then human factors.
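As a toy illustration of the validation Peter describes, checking that a driver monitoring system reacts when gaze stays off the road too long, here is a hedged sketch. The gaze-sample format and the two-second threshold are hypothetical; real validation compares the system’s alerts against ground-truth eye tracking.

```python
# Toy sketch of a driver-monitoring check: flag when gaze has been off the
# road for longer than a threshold. Field names and the 2-second threshold
# are hypothetical, not any real DMS specification.
from dataclasses import dataclass

@dataclass
class GazeSample:
    t: float          # timestamp in seconds
    on_road: bool     # did gaze fall inside the forward road region?

def should_alert(samples: list[GazeSample], max_off_road_s: float = 2.0) -> bool:
    off_since = None
    for s in samples:
        if s.on_road:
            off_since = None          # gaze returned to the road, reset timer
        elif off_since is None:
            off_since = s.t           # start of an off-road glance
        elif s.t - off_since >= max_off_road_s:
            return True               # continuous off-road glance too long
    return False

# Driver glances out the side window from t=1.0 s to t=3.5 s
samples = [GazeSample(t / 10, not (10 <= t <= 35)) for t in range(50)]
print(should_alert(samples))  # True
```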
Roger Dooley [00:06:25]:
I’m curious. It used to be that if we wanted to measure how something smelled, or maybe how a car door sounded when it shut, we just asked people: do you like that or don’t you like it? Or rate it on a scale of 1 to 5, or something like that. Are there consistent inputs or consistent measurements, say from an EEG, that can tell you, if somebody smells five different scents, which ones they prefer more?
Peter Hartzbech [00:06:58]:
Yeah, that’s a good question. Of course we also combine it with traditional methods, right? So when we do a study, we expose people to these different scents and then we can see if they react with galvanic skin response: do they have peaks when they are exposed to the smell, for example? And then we have validated metrics that we look at within each of the sensors. But I think the main point here is the multimodality, looking at as many signals as possible. And these signals are something you cannot control as a human being. You do get these facial expressions, and you also have skin conductivity, how much you sweat in your fingers, and EEG, the brain waves. It’s not something you can really control. So the combination of that, together with the traditional methods, is what makes it very powerful, because people cannot necessarily articulate their opinion. They will rationalize it and come up with an answer. But a lot of the decisions are based on the emotional response and the emotional impact, not on the rational thought process. So you want to make sure that car design is neat and that it attracts you. But of course, you also need to rationally be able to afford the car.
Peter Hartzbech [00:08:06]:
Right? I mean, everybody gets emotional arousal looking at a nice new Porsche. But then again, the question is whether you have the money to actually buy it. So those things also have to be consolidated into the research, so you also ask people the questions, right?
Roger Dooley [00:08:22]:
So I guess, Peter, what I’m asking is, and I think you’re saying the answer is yes: when you combine these various inputs from skin conductance and EEG and whatever other biometric measurements you happen to be taking, are you at a point now where you can reliably say, okay, people like this particular thing or they don’t like this particular thing? Because that’s always been the thing that’s bedeviled some of this work, even going all the way back to fMRI research. It’s like, okay, we can see the reactions, but what do those reactions mean? Wow, this produced a bigger reaction, but can we reliably predict that that’s a positive reaction every time?
Peter Hartzbech [00:09:06]:
Yeah, I think we are much further along than where we were 20 years ago, of course. But I do think that you still need to use the right research methodology. You need to know what your research question is before you can answer it. So I don’t believe it’s just something where you test everything in front of people and then you can say yes or no. I think it’s about making the most informed decisions, and when you make your decisions based on only a survey, you’re missing a lot of the pieces of the puzzle.
Peter Hartzbech [00:09:35]:
Right? So I think that’s the point: we get much better metrics to really decide which of these is the best option. But in my experience, it’s also often: how do you take, let’s say, five or ten designs of something, or five or ten scents, and funnel them down to maybe two or three to choose from? So I think it’s also a lot about picking the losers, so to say: which ones should you not put into the market? Because, I don’t know if you saw the case we had with Tropicana, everybody knows their sales dropped more than 20%, it was like billions of dollars that they lost because of a wrong package. They could maybe have averted this if they had done some more research on it and been able to make sure they didn’t pick that particular package but one of the others. Let’s say you had designs with arbitrary scores of 8, 8 and 2. The point here is you want to make sure you find the one that has the 2, right?
Peter Hartzbech [00:10:28]:
If you put an 8 into the market instead of the 2, then you’re not going to flop in the market, right? So I think that’s really the point here: you can also use this to make sure you don’t take a really losing package, for example, into the market. And of course, the more you test with this, the more you can build up norms and so on, and that’s what a lot of companies are doing. You also know we have a long-standing relationship with Kantar in the advertising space, right?
Peter Hartzbech [00:10:51]:
We have tested 90,000 ads at this point with the Affectiva platform. So that’s nine-zero, not one-nine: 90,000 ads, right? Then you begin, of course, to understand how an ad should perform in different categories and so on. So I think that’s what’s interesting now: these technologies have become more scalable, and the intention is to make it more multimodal, but we can talk about that in a second. So yeah, of course we’re getting much closer, and we really see things scaling back up in in-person research.
Peter Hartzbech [00:11:22]:
Obviously during corona, when we talked last time, it was tough times, because people couldn’t get into the lab, and that’s what accelerated our online presence and online software, which is actually really good now as well. But of course a lot of people still like to get the consumers in there: speak to them, interview them, have focus groups, and also do these kinds of tests where you sit in front of the system and engage with some kind of product.
Roger Dooley [00:11:48]:
So Peter, you describe the work you’re doing at iMotions primarily as behavioral research, or behavior research. How do you feel about the terms neuromarketing and consumer neuroscience, and whatever other terms might have been applied to this space?
Peter Hartzbech [00:12:03]:
Yeah. While the two of us have been around, there have been a lot of different ways to talk about this. I mean, neuromarketing sounds good and fancy and so on, but I do think there’s much more science behind it today. This idea that people have a buying button and so on, that’s not something I believe in. I believe you have to do thorough research, real research, and you need sharp researchers that can really ask the right questions in order to get better and more insightful answers.
Peter Hartzbech [00:12:36]:
So I think neuromarketing is a bit narrow, I would say; we’re more about sensory research and so on. But of course, on the advertising side it’s pretty fitting, right? If you test advertising you use neuro, but then again, if you use the Facial Action Coding System, is that neuro? The way I see neuro, it’s more like when you use EEG and look deeper into the brain, and of course we also do that. But a lot of people use eye tracking, facial expressions and GSR, which is basically the standard triangulation today, combined with surveys. From the GSR you have arousal, the intensity of the emotional response; you have the valence from facial coding, is it positive or negative, pleasant or unpleasant; and then you have the eye tracking for the attention.
Peter Hartzbech [00:13:27]:
What did you actually look at? What did you observe while you did it? So that’s the kind of standard solution we sell today, and for the more advanced researchers we would add EEG. But anyway, neuromarketing is of course some kind of marketing way of saying it. Honestly, I think there’s a lot of behavioral research being done that is not neuromarketing.
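To make the triangulation idea concrete, here is a minimal sketch, not iMotions’ actual pipeline, of how the three signals reduce to per-stimulus arousal, valence and attention scores. The column names and example values are hypothetical.

```python
# Minimal sketch of the GSR / facial-coding / eye-tracking triangulation:
# reduce synchronized samples to one arousal, valence and attention score
# per stimulus. Column names and values are illustrative assumptions.
import pandas as pd

samples = pd.DataFrame({
    "stimulus": ["ad_a"] * 3 + ["ad_b"] * 3,
    "gsr_peak_amplitude": [0.8, 1.1, 0.9, 0.2, 0.3, 0.1],   # microsiemens
    "face_valence": [0.4, 0.5, 0.3, -0.2, -0.1, -0.3],      # -1 .. 1
    "gaze_on_aoi": [True, True, False, True, False, False], # gaze hit the ad?
})

summary = samples.groupby("stimulus").agg(
    arousal=("gsr_peak_amplitude", "mean"),   # intensity of the response
    valence=("face_valence", "mean"),         # positive vs. negative
    attention=("gaze_on_aoi", "mean"),        # share of samples on the ad
)
print(summary)  # ad_a scores higher on all three axes in this toy example
```

A real platform synchronizes these streams at high precision and applies validated per-sensor metrics; the sketch only shows the shape of the triangulation.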
Roger Dooley [00:13:54]:
It’s been an evolution, because even in the early days neuromarketing studies often incorporated other things like biometrics or eye tracking and whatnot, but they all sort of fell under that umbrella. Maybe we’re just splitting hairs there. Now, Peter, today you are integrating Affectiva technology into your platform. You’ve been working with Affectiva’s tools for years now, right? Explain what is changing and how that’s going to change the results for your clients.
Peter Hartzbech [00:14:30]:
Yeah, this is super exciting, I think. As you know, we have worked together with Affectiva for more than a decade. Rana and I met many years ago, and basically the way we started the collaboration was that at iMotions we had used a company called Emotient, you maybe recall, that was sold to Apple back in 2016. Then we looked for a facial expression engine, and Affectiva’s was the best in the world at the time, so we integrated it into the iMotions platform. We have been selling this technology to more than 1,500 institutions across the world that use Affectiva’s SDK, or software development kit, through the iMotions software. So it’s a module inside the iMotions software: if people want to use facial expressions, they basically buy it through us. And it was Affectiva’s technology, on a revenue-share basis.
Peter Hartzbech [00:15:22]:
And then, as you know, in the Smart Eye group we merged three companies together. Smart Eye was an eye tracking company, both hardware and software, for research but also for the automotive space. And that’s primarily why Affectiva was acquired by Smart Eye: what they do with the emotion sensing and so on is in-cabin testing, or in-cabin observation. You have the driver monitoring, which is eye tracking, and in the future you will observe the whole cabin, the mood and so on, and that’s what Affectiva was really, really good at. So that’s why they were put into the group, so to say, for the automotive space. But they also had a very strong research arm, and that’s the one we have merged with now. As you maybe know, there has been a long-standing relationship with Kantar and a few other large clients, and we actually just renewed the Kantar relationship for another three years. So that’s super exciting; it’s a very good and healthy relationship we have.
Peter Hartzbech [00:16:24]:
I look forward to digging much more into that. But basically, what we’re doing now is taking the core technologies of Affectiva and bringing them into iMotions. Of course you can still buy it as an iMotions module, but what we can also do now is create much stronger solutions for market research companies. That’s been one of the challenges for iMotions, because we sold to a lot of the end clients, the big brands, as you mentioned before, but for the market research companies we have not really had the right, scalable model. And I think that’s what Affectiva has been really good at: having a scalable technology, primarily within advertising, that is built into all the Link tests with Kantar today, for example.
Peter Hartzbech [00:17:08]:
So I think we have a very scalable solution that can be integrated into the market research panels, so we can very easily add facial expression to the surveys. And when we talk about facial expression analysis and the Affectiva SDK, that’s also much more than just facial expressions and emotions. There’s head pose, there’s distance to the screen; there are a lot of behavioral metrics that are really relevant for this kind of online, scalable test. And then we have the attention metric: do we have attention on the screen or not? Because one thing is that you can, for example, turn your head like this out to the side, but my eyes could still be looking at the screen.
Peter Hartzbech [00:17:47]:
So you can also get the gaze direction, not extremely precisely like we do with our normal hardware, but as a pretty good indication of whether there’s attention on the advertisement or not, either on your phone or on the laptop. So we’re beginning to go in a multimodal direction with this, where the core was the facial expressions, but now we have a scalable solution that we can bring out to the thousands of clients that iMotions has. And that’s of course why this is going to be very interesting for the clients: now we have scalable technology that’s easy to implement and gives you new insights on top of what you get by just doing a Qualtrics survey or whatever other survey. So integrating that with the panel providers is going to be super exciting. And for me, Roger, I remember we maybe met the first time 20 years ago now, or at least 15.
Peter Hartzbech [00:18:40]:
Right, at the market research conferences, you know, Mia, AIF. And as I was talking about with my colleagues here, we have our 20th anniversary this year, and we started there. We started trying to sell our eye tracking solution back then and got rejected in a lot of places because it was just too early, and we were up against all the hardware companies on the eye tracking side. So I’m very excited now to actually get back into the market research space with our solution.
Peter Hartzbech [00:19:07]:
And of course I still have all the contacts, so I look forward to contacting all the large market research companies to try to work together on this more integratable solution. So yeah, it’s really exciting. They have a lot of technology; we can get it to market. That’s why we’re integrating Affectiva’s media analytics department fully into iMotions. So they’re going to be iMotions, under our umbrella, since January 1st.
Roger Dooley [00:19:34]:
Speaking of eye tracking, what do you think about AI simulations of eye tracking?
Peter Hartzbech [00:19:41]:
I think that’s a long discussion, but I must say I believe in, at least, webcam-based eye tracking. It’s very clear that when you use these AI-based methods, they recognize faces; that’s pretty natural. But whenever there’s something else, I feel there’s at least a pretty big room for error. So I think sometimes people use this just to have something, to say, yeah, we did some kind of eye tracking. But I do think we have something that’s going to be much more powerful. We have just come to market with a new algorithm for calibration-less eye tracking, no-calibration eye tracking. We now have the world’s best, I would claim, webcam-based eye tracking where you also calibrate; we didn’t have that last time we spoke, that happened during corona. So through a webcam we can now calibrate people and get pretty good results that are near some of the lower-end hardware eye tracking companies, precision-wise and accuracy-wise.
Peter Hartzbech [00:20:40]:
But what we’re coming out with now, for the more scalable platform, is the calibration-less eye tracking. That means, because of AI and training the systems, we get very close; of course it’s not the same precision as if you calibrate, but it’s definitely much stronger than having just an AI prediction model. And because you collect the data anyway through the survey, where you collect facial expressions and the survey answers and all that, it’s not going to take any extra time. So now we actually begin to have something that’s really precise, especially when you look at advertising, a 30-second ad; it’s comparable, it’s not like a website, where you maybe move your eyes a bit quicker and so on. So I think for the advertising space we have something really, really interesting here. And it’s not generative AI: it’s real data that we get out of the web camera, where we combine some of the facial features but also the eye tracking, right.
Peter Hartzbech [00:21:46]:
Looking directly at the eye through the web camera.
Roger Dooley [00:21:49]:
Do you see this evolving into a SaaS model, where some random small agency can just go to a website, sign up for, you know, €100 a month or something, and conduct some limited number of eye tracking studies that way? Or will this always be more of a professional service, do you think?
Peter Hartzbech [00:22:10]:
No, I think, I mean, generally we are a SaaS company, right? We are a software company at iMotions. So this is something the client can choose, what they want to do. We also have enabling services, which we restarted after corona, because obviously that was tough during corona. We also help people with their first studies. So let’s say you’re a market research company and you want to get into the space; then we have consultants that can help you do the first studies and deliver the results, or you can deliver the results yourself. But I do think we’re getting closer to having an online platform where you log in in a browser, and we actually have that already, and then you can drag in some advertisements, set up a randomization or something like that, and then you do your survey, or you have a standard survey you run every single time.
Peter Hartzbech [00:22:57]:
And then you can conduct the study relatively quickly and have a dashboard kind of output. You can do that with facial expression analysis and also with the calibration-less eye tracking. But also, interestingly, we just released, I don’t know if you saw that, a new metric we built in-house at iMotions that actually measures respiration. That’s an arousal indicator out of the webcam, and it’s really super powerful, because we move a little bit when we breathe. So we have built an algorithm that we have trained, and it’s really, really great. Out of the webcam we can now make a fully multimodal platform that has eye tracking, the facial expressions for valence, as we talked about before, and then the respiration for an arousal kind of metric. Then you can really talk about getting true insights, and making it integratable with people’s panels so they don’t really have to do anything. Basically they set up their Qualtrics survey and boom, they have all of this extra data on top of what they did already.
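Peter doesn’t detail the respiration algorithm, but as a rough illustration of the general idea, small periodic body movements visible to the webcam, here is a hedged sketch that takes a per-frame motion signal as input and reads the breathing rate off its dominant low frequency. The motion-extraction step and the 0.1 to 0.5 Hz band are assumptions, not the trained model he describes.

```python
# Hedged sketch: estimating respiration rate from per-frame motion energy.
# Assumes `motion` is a 1-D array of mean frame-to-frame differences in the
# upper-torso region, sampled at `fps` frames per second. Illustrative only,
# not iMotions' trained algorithm.
import numpy as np

def respiration_rate_bpm(motion: np.ndarray, fps: float) -> float:
    motion = motion - motion.mean()          # remove DC offset
    spectrum = np.abs(np.fft.rfft(motion))   # magnitude spectrum
    freqs = np.fft.rfftfreq(len(motion), d=1.0 / fps)
    # Plausible breathing band: ~6 to 30 breaths per minute (0.1 to 0.5 Hz)
    band = (freqs >= 0.1) & (freqs <= 0.5)
    dominant = freqs[band][np.argmax(spectrum[band])]
    return dominant * 60.0                   # Hz -> breaths per minute

# Synthetic example: 0.25 Hz breathing (15 bpm) plus noise, 30 fps, 60 s
fps = 30.0
t = np.arange(0, 60, 1 / fps)
motion = np.sin(2 * np.pi * 0.25 * t) + 0.3 * np.random.randn(t.size)
print(round(respiration_rate_bpm(motion, fps)))  # ~15
```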
Peter Hartzbech [00:23:58]:
When you test a lot of stuff and you integrate it into your own methods, it also becomes much more affordable. So that’s a whole new market opening up here that we are super excited about. And then we take the facial expression technology from Affectiva and make it multimodal with these other features. We are very excited about getting that into the market.
Roger Dooley [00:24:17]:
Peter, you’ve been in this industry for a long time, as you mentioned, and you keep evolving the services your company offers, the technologies and so on. In the last few years, has there been anything that really surprised you, that you could share, that might surprise other people who aren’t as familiar with the technologies or the industry?
Peter Hartzbech [00:24:39]:
Yeah, it surprised me how long it actually took to get in-person research back after corona. I think what happened was that people couldn’t get into the laboratories, and a lot of people had to spend the budget somehow, so they started doing only online research, and the technologies were not super good for it. So it was a bit surprising to me that it took so long to see again that there’s a lot of value in in-person, very high-quality research.
Peter Hartzbech [00:25:09]:
Right. Of course you also want quick and efficient, but I think it depends on when you use this kind of technology. If you have a quick assessment tool, as we talked about, where you test 10 or 20 ads quickly and find the five that are good, then you go into more in-depth research with those, and you can do it on a smaller budget, for example. I think in-person research is definitely going to go up again; that’s what I see.
Peter Hartzbech [00:25:40]:
And it surprised me a little bit that it took so long. Now we’re beginning to see all the large brands coming back and saying, well, we like this gen AI because it’s quick, but do we actually dare to make decisions based on this, or do we want to be more sure and make a more informed decision? So that’s why I think there’s a really quick acceleration of the technologies now, but we’re still scientifically backed. You know, I have like 15 or 20 PhDs on staff here, and also neuroscientists, psychologists, all kinds of people that know their research and know their methodologies. So I think there’s still a large segment of the brands that really wants to do this kind of in-depth research, especially in sensory, but I think also in advertising.
Roger Dooley [00:26:30]:
Peter, you’re doing behavioral research, measuring the behavior of people who are interacting with whatever materials they’re testing, advertising or packaging or whatever. Do you also attempt to bring in concepts from behavioral science research, whether it’s Cialdini or Kahneman or any of the thinkers in that area? Do you try to integrate that type of thinking in any way with the sort of one-on-one measurements you’re taking on actual humans?
Peter Hartzbech [00:27:06]:
I think our approach has always been that we want to have high-quality raw data, because we work with the sensors and so on. So we want to have very scientifically proven data that’s properly synchronized and precise. And then the approach has been that we don’t build our own metrics; we only take published research, you know, known methodologies, for example, how you do peak detection on GSR, all the ways you get these more valuable analyses out of the system. That’s why we have built what we call the R Notebook: we have integrated with R so that we can take a lot of these metrics, and clients can also themselves take some of the ideas or thinkers you mentioned here and bring them into the platform.
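As an illustration of the kind of published methodology Peter mentions, here is a minimal GSR peak-detection sketch (in Python rather than R, for brevity) using SciPy’s find_peaks. The prominence and spacing thresholds are illustrative assumptions; iMotions’ R Notebooks follow their cited literature, which may differ.

```python
# Hedged sketch of a standard GSR peak-detection step: find skin-conductance
# responses in the phasic signal. Thresholds are illustrative assumptions.
import numpy as np
from scipy.signal import find_peaks

def gsr_peaks(phasic: np.ndarray, fs: float,
              min_amp: float = 0.01, min_gap_s: float = 1.0) -> np.ndarray:
    """Return sample indices of skin-conductance response peaks.

    phasic:    phasic GSR in microsiemens (tonic level already removed)
    fs:        sampling rate in Hz
    min_amp:   minimum peak prominence in microsiemens (assumed cutoff)
    min_gap_s: minimum spacing between peaks in seconds
    """
    peaks, _ = find_peaks(phasic,
                          prominence=min_amp,
                          distance=int(min_gap_s * fs))
    return peaks

# Synthetic example: two responses at ~2 s and ~6 s, sampled at 32 Hz
fs = 32.0
t = np.arange(0, 10, 1 / fs)
phasic = (0.5 * np.exp(-((t - 2) ** 2) / 0.1)
          + 0.3 * np.exp(-((t - 6) ** 2) / 0.1))
print(gsr_peaks(phasic, fs) / fs)  # peak times in seconds: ~[2. 6.]
```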
Peter Hartzbech [00:27:50]:
So we try to expand the platform to do whatever people want to do; we listen to them and then we enable them to do it. We don’t have any specific thinkers we have integrated, but we take all the proven methodologies, from peer-reviewed journals, when there’s something we can truly trust scientifically, and we bring them into the notebooks and make it transparent. That’s why we are very different from, you know, the buying-button kind of 2000s neuromarketing companies: we don’t believe in black boxes. We make things available so people can tweak them or do whatever they want with them, but we’re taking all the proven methodologies and putting them in there. So if you have good ideas for what needs to get in there, I would love to hear them, and then we can get it done.
Roger Dooley [00:28:37]:
Peter, how can people learn more about iMotions if they’re intrigued by what they’ve heard so far?
Peter Hartzbech [00:28:42]:
iMotions.com, right? We look forward to talking to everybody out there, and I look forward to coming back into the market research space and connecting with people again. So that’s pretty exciting.
Roger Dooley [00:28:57]:
Well, Peter, thanks for being on the show.
Peter Hartzbech [00:28:59]:
Of course, thank you very much for having me.