108 episodes • Length: 30 min • Monthly
What if technology could understand people in the same way that people understand one another? Tune in as Affectiva, the pioneer of Emotion AI, endeavors to humanize technology as a new Smart Eye Company. The Human-Centric AI podcast dissects how we can put the human before the artificial as AI manifests in our daily lives, with insights from the world’s top thinkers in automotive, market research, aviation, robotics, education, academia and beyond.
The podcast The Human-Centric AI Podcast is created by Smart Eye.
Eye tracking is more than just following a person’s gaze — it’s about understanding behavior, optimizing research, and improving real-world applications. But what are the biggest misconceptions holding researchers back? And how can the right technology make all the difference?
In this episode of the Human-Centric AI Podcast, we sit down with Brant Hayes, Regional Sales Director for Southeast US at Smart Eye, to discuss the most common concerns and challenges customers face when considering eye tracking technology. With over a decade of experience helping clients integrate eye tracking into their workflows, Brant shares insights on how Smart Eye’s multi-camera system, 3D world model, and innovative approach set a new standard for accuracy and ease of use.
Tune in to learn:
Listen now to discover how eye tracking is transforming research and real-world applications across industries.
Eye-tracking technology is evolving rapidly, unlocking new possibilities across industries like automotive, aviation, and behavioral research. But what does it take to implement it successfully? In this episode of the Human-Centric AI Podcast, we sit down with Sebastian Johansson, Regional Sales Manager for APAC at Smart Eye, to explore the real-world challenges customers face when integrating eye tracking into their research and development processes—and how Smart Eye’s cutting-edge solutions help overcome them.
Sebastian shares insights from over a decade of working closely with partners and customers across Asia, shedding light on key misconceptions, technical considerations, and the ROI impact of eye tracking. From understanding how lighting conditions affect performance to the role of multi-camera setups and Smart Eye’s pioneering 3D world model, this episode offers a comprehensive look at how eye tracking is shaping the future of human-machine interaction.
Tune in to learn:
Listen now and gain expert insights into the power and potential of eye-tracking technology.
Today’s episode features Bahman Hadji, Director of Product Management, Automotive Sensing Division at onsemi. Bahman has 15+ years of experience working with image sensors in the medical, consumer, and automotive markets. In his role in the Automotive Sensing Division of onsemi's Intelligent Sensing Group, he manages a global team focused on driving the product roadmap strategy and bringing to market high-performance automotive image sensors used in ADAS, autonomous driving, in-cabin, and LiDAR applications. Prior to joining onsemi in 2017, he held product engineering and product marketing roles at Aptina Imaging and OmniVision Technologies. Bahman obtained both his Bachelor of Applied Science in Computer Engineering and Master of Applied Science in Electrical and Computer Engineering degrees from the University of Waterloo in Canada.
onsemi is driving disruptive innovations to help build a better future. With a focus on automotive and industrial end-markets, the company is accelerating change in megatrends such as vehicle electrification and safety, sustainable energy grids, industrial automation, and 5G and cloud infrastructure. With a highly differentiated and innovative product portfolio, onsemi creates intelligent power and sensing technologies that solve the world’s most complex challenges and leads the way in creating a safer, cleaner, and smarter world. onsemi operates a responsive, reliable supply chain and quality programs, and robust ESG programs. Headquartered in Scottsdale, Arizona, the company has a global network of manufacturing facilities, sales and marketing offices and engineering centers in its key markets.
We invited him to talk today about depth sensing and its importance, applications for adaptive restraint systems, user interaction / security features, and future protocols and industry standards. Listen in to learn more.
Links of interest:
In high-stakes environments like air traffic control, understanding and managing stress is critical. In today's episode, we explore groundbreaking research conducted by LFV/LIU in collaboration with Smart Eye, investigating the stress responses of air traffic controllers. Using cutting-edge EEG and eye tracking technologies, this study sheds light on how these tools can predict brain activity and anticipate the effects of stress.
Listen now to discover how these insights can help improve safety, optimize performance, and shape the future of air traffic management. Whether you’re in aviation, neuroscience, or human factors engineering, this episode offers valuable takeaways on harnessing biometric data to understand and mitigate stress in complex environments.
Links of Interest:
Vehicle manufacturers on the European market have a new set of rules to play by: since July 7, 2024, the EU’s General Safety Regulation (GSR) requires all new cars, buses, and trucks to include Driver Drowsiness and Attention Warning (DDAW) systems. At the same time, Euro NCAP is expanding its scope to provide detailed information about the safety of heavy trucks via an innovative Truck Safe City and Highway rating scheme.
As legislators and rating institutes are raising the bar for road safety, many car, bus, and truck manufacturers may find themselves struggling to answer a whole new set of complex questions.
To help navigate this new reality, we assembled a panel of industry experts to give you some crucial insights to demystify the new rules and ensure your vehicles meet (or exceed) the latest standards, including:
➤ The technical requirements for DDAW and ADDW integration
➤ The specific features that will help you achieve top Euro NCAP Truck Safe ratings
➤ How to prepare your vehicles for upcoming Advanced Driver Distraction Warning (ADDW) requirements in 2026
➤ How to tackle the challenges of aligning your manufacturing processes with the latest GSR standards
Listen to learn how to ensure your vehicles not only comply with, but excel under, these new regulations.
Links of interest:
Today’s episode features Matt Strafuss, Director of Product and Customer Solutions at Affectiva (now a Smart Eye company). Matt has been with Affectiva for over 10 years in various roles. Affectiva, pioneer of Emotion AI, was acquired by Smart Eye in 2021 and has since continued its strong brand presence in the media analytics space. With a degree in Physics and Computer Science, Matt has worked on everything from software development and project management to customer success and everything in between.
His deep expertise in the space made him a natural candidate to interview about the new Affectiva calibration-free eye tracking feature, announced just last week, which brings together Smart Eye’s eye tracking technology and Affectiva’s Emotion AI. Matt talked about the development of this feature, how it’s differentiated from other eye tracking and attention measurement solutions on the market, some high-level mechanics of how it works, and where we may be going with it in the future. Listen in to learn more.
Links of Interest:
Today’s episode features Lynn Deason, Head of Creative Excellence at Kantar. Kantar is the world’s leading data, insights and consulting company. They understand more about how people think, feel, shop, share, vote and view than anyone else. Combining their expertise in human understanding with advanced technologies, Kantar’s 30,000 people help the world’s leading organizations succeed and grow. Lynn is a trusted advisor at the most senior levels of major client organisations and is passionate about brands, brand communications and brand experience. A recognised high achiever with proven impact for clients as individuals, their brands and their organisations overall, Lynn brings a well-informed perspective drawn from wide experience internationally and across sectors.
Lynn talked about the secrets behind creating impactful and memorable ad campaigns. We also explored the creative strategies behind McDonald's and Cadbury brand campaigns, the role of Emotion AI technology in advertising, and the future trends shaping the industry. Listen to learn more.
Links of interest:
Today’s episode features Detlef Wilke, Vice President of Innovation & Strategic Partnerships at Smart Eye. With a degree in electrical engineering, Detlef has over 25 years of experience within the automotive industry with deep technical expertise in Driver Monitoring and Interior Sensing systems. I invited him to talk today about the evolution and impact of driver monitoring and interior sensing technologies in the automotive industry, highlighting their role in enhancing vehicle safety and user experience. Let’s listen in to learn more.
Links of interest:
Today’s episode features Vera Sidlova, Global Brand Manager, Creative at Kantar. Kantar is the world’s leading data, insights and consulting company. They understand more about how people think, feel, shop, share, vote and view than anyone else. Combining their expertise in human understanding with advanced technologies, Kantar’s 30,000 people help the world’s leading organizations succeed and grow. Vera has been focused on creative effectiveness research and helps manage Kantar’s creative solutions portfolio.
Vera and I talked about the recent Kantar Creative Effectiveness Awards, including winning ad examples in each category and the themes and trends seen this year, as well as some advice from facial coding AI that advertisers can follow moving forward. Let’s listen to learn more.
Links of interest:
- [Learn More] Kantar's 2024 Creative Effectiveness Awards: https://www.kantar.com/campaigns/creative-effective
- [Read] Download the Booklet, Kantar Revealed: the most creative and effective ads from 2023: https://www.kantar.com/campaigns/creative-effective/download-the-booklet
- [eBook] Generative AI meets Emotion AI: AI Disruption in Advertising - https://go.affectiva.com/generative-ai-meets-emotion-ai-ebook
- [Watch] the On-Demand webinar, Kantar Creative Effectiveness Awards 2024: Creative consumer connections: https://event.on24.com/eventRegistration/EventLobbyServlet?target=reg20.jsp&eventid=4534231&sessionid=1&key=1C1F17B0B947D969EF76413B6251C110&groupId=5348734&sourcepage=register
While our eye tracking systems are comprehensive and cutting-edge, setting them up and working with them doesn’t have to be daunting. At Smart Eye, we understand that success lies not only in the sophistication of our technology but also in the support and guidance we provide to our customers.
Whether you’re a seasoned professional or just beginning to explore the world of eye tracking, rest assured that you’ll be well-supported every step of the way. In today's episode, our customer support team will talk you through how we make your journey with Smart Eye both seamless and successful. Listen to learn more.
Links of interest:
Generative AI has changed the world of advertising. Large Language Models create draft copy instantly. Image generators produce visuals in minutes. Text-to-video tools will make creative development even more streamlined. But what about the risks from all this AI-led efficiency? Can Generative AI really design compelling ads?
Emotional engagement is critical for advertising success. Will the rush to automate the creative process strip advertising of its emotional power?
Today, we are featuring our own Sasha Mukhanova, Account and Business Development Director at Affectiva. With 15 years’ experience in advertising effectiveness research, Sasha is an advertising nerd and neuroscience enthusiast. Her mission is to humanize technology with ad testing AI. She is joined by Graham Page, Affectiva’s Global Managing Director of Media Analytics, for a moderated discussion with Mike Stevens, the Founder of Insight Platforms.
This episode shares how Emotion AI can work with Generative AI to ensure that ads drive long-term brand equity, audience engagement and even virality. Listen in to learn more.
Links of interest:
Today, we are joined by a guest whose journey traverses the dynamic realms of traditional media and cutting-edge Emotion AI technology. Dr. Leon Hawthorne is not only a former news anchor for CNN International and CNBC Europe but also the founder and CEO of two satellite television channels. His extensive experience in the media landscape has been complemented by his entrepreneurial spirit, leading him to explore the forefront of Artificial Intelligence in digital communication.
Dr. Hawthorne earned his Ph.D. in 2024, delving into the captivating world of Generative AI and its impact on our digital lives. Long before the current surge in AI adoption, Leon embarked on a research project titled 'Talking Heads.' This initiative aimed to evaluate the integration of AI-generated virtual humans as on-screen news presenters, a concept that has gained unprecedented relevance in today's rapidly evolving technological landscape.
His findings, which we will be exploring in detail, shed light on audience perceptions, preferences, and the challenges that arise when blending human authenticity with the prowess of Generative AI. Let’s listen in to learn more.
Links of interest:
Today we have a very special episode featuring Dr. Rana el Kaliouby, former Co-Founder and CEO of Affectiva, currently Deputy CEO of Smart Eye, having a discussion with Smart Eye CEO Martin Krantz.
In their conversation, Rana and Martin touch on the significance of recent design wins, the methodology behind estimating revenue and lifetime value, and the exciting potential of combining interior sensing with generative AI. Listen in to learn more.
Links of interest:
In today's episode, we dive into a recent collaborative event Smart Eye attended at the University of North Dakota (or UND) and the Research Institute for Autonomous Systems (RIAS). With a comprehensive setup featuring multiple cameras, EEG data collection modules, and emotion sensing capabilities, the UND RIAS team is poised to unlock the true potential of eye tracking technology combined with other biosensors.
Speaking with UND Professor and project sponsor Philip Brandt and the students involved in this week-long tech event, we got a comprehensive look into their project, which includes equipping a HUMVEE with advanced Smart Eye Pro eye tracking systems and iMotions analysis software. This state-of-the-art arrangement provides a powerful platform for capturing and analyzing data related to driver and passenger behavior, setting the stage for groundbreaking research. Listen to learn more!
Links of interest:
Marketers often talk about the art of storytelling, but how do they measure up to the experts in filmmaking? In a world where emotions drive engagement, Affectiva's Emotion AI database, with 80,000 pieces of advertising and entertainment content, provides unique insights. This technology reads between the lines, decoding facial expressions to understand audience engagement and feelings in the moment of viewing.
In a recent webinar hosted by Greenbook, Affectiva's Mitzi Lorentzen guided attendees through case studies and quantitative analyses, uncovering the role of emotions in driving key outcomes such as advertising sales and box office success. Mitzi dove into the comparison between advertising and entertainment in evoking viewer emotion and the consequences that follow.
She also covered the art of getting emotion right – when to use positive versus negative emotions and how to tailor them to different genres or messages. Throughout the webinar event, she showcased practical applications beyond box office hits, illustrating how the tricks of narrative revealed by emotional AI can drive attention and engagement across various industries. Listen in to learn more.
Links of interest:
Today’s episode features two Smart Eye team members, Mario and Aaron, discussing the transformative potential of Smart Eye’s eye tracking solutions in redefining the possibilities of automotive research in a recent live virtual event. Attendees discovered the essence of eye tracking, its profound relevance, and how it enables breakthroughs in road safety, human-machine interaction, and human factors research. Listen in to learn more.
Links of interest:
Today we are going to dive into some of the latest advancements in visual attention measurement, and how it is revolutionizing ad creative effectiveness testing. We will also discuss creative development strategies, how to balance attention and brand goals, and briefly touch on integrating attention metrics into your existing measurement frameworks.
Our expert speakers:
Links of interest:
Today’s episode features Elsa Magner, Senior Project Manager at Smart Eye. Elsa has extensive expertise spanning various technical domains, including project management, quality assurance, and product development. With a mission to instill structure, purpose, and clear goals in her work, Elsa is a seasoned professional who thrives on delivering on expectations, collaborating with competent teams, and adapting swiftly to new challenges.
As a vital part of Smart Eye's Applied AI Systems team, Elsa shared insights into her role enhancing road safety through driver monitoring systems. As an expert navigating legislative requirements, she discussed challenges and successes in implementing safety systems, emphasizing the role of data. Listen in to learn more.
Links of interest:
Today’s episode features Dr. Tara Akhavan, Global Innovation and Ecosystems Director at Forvia. Dr. Akhavan is an award-winning technology entrepreneur with over 15 years of experience in leadership and product strategy. She was the Founder, President & CEO of Irystec, a groundbreaking company in the field of Perceptual Display/Image Processing for both consumer and automotive markets. Irystec was acquired by Faurecia (now Forvia) in 2020, where Tara now serves as the Global Innovation & Ecosystems Director, overseeing startups internally and externally, as well as leading the central Innovation team. She holds a Ph.D. in Image Processing and Computer Vision and actively contributes to industry committees such as CIE and SID.
In our conversation, Dr. Akhavan unveils the fascinating journey from her Vienna PhD to pioneering human-centric mobility, teasing exclusive details about Forvia's CES 2024 presence and sharing insights on the impactful intersection of Emotion AI and innovative automotive technologies.
Links of interest:
In today's episode, we are delving into the groundbreaking world of AI art expression for those with Motor Neuron Disease (or MND) such as ALS. We spoke with guests Franklin Williams and Evan Schmidt from the AREA 23 Agency, along with Richard Cave from the MND Association.
Richard Cave is a Speech and Language Therapist, working with the MND Association and also with Google, providing specialist Speech Therapy consultancy to the technology teams.
He is also a PhD candidate at University College London, national adviser to the Royal College of Speech and Language Therapists for voice banking, and the 2022 Allied Professional of the Year recipient from the International Alliance of ALS/MND Associations.
Then from Area 23, we have Franklin Williams and Evan Schmidt. As EVP, Executive Director of Experience Design, Franklin is an ideal blend of creativity, user experience, and technology, and has been using his passion for innovation to elevate the importance of experience design in the advertising industry for nearly 20 years.
Along with Franklin we also have Evan Schmidt, Associate Creative Director (Art) at AREA 23. Evan is a dynamic creative fueled by an unwavering passion for AI art. With a background in illustration and a flair for innovation, he seamlessly integrates AI algorithms into his artistic vision, crafting visually stunning and thought-provoking pieces. Together with the team at AREA 23, an IPG Health company, Evan helped to develop the Mind’s Eye app.
Mind’s Eye is the world’s first AI art expression tool for people with MND/ALS. As an Associate Creative Director, Evan inspires his team to embrace the transformative power of AI, creating a new era of visually captivating and intellectually stimulating experiences.
In our conversation, we unravelled the journey behind Mind's Eye, exploring its purpose, unique features, and the impact it has on the lives of its users. We also uncovered the synergy between Mind's Eye and the Smart Eye assistive technology partner Smartbox (who use our eye tracking to help give a voice to those who are unable to speak), the pivotal role of eye tracking technology, and heard inspiring user stories. Evan has an exciting call-to-action for our listeners: try Mind's Eye in Grid with a 60-day free trial for Windows or a 30-day free trial for iPad at thinksmartbox.com/try-grid. Get ready for a deep dive into the transformative realm of Mind's Eye and the evolution of human-centric AI!
Links of interest:
We have a special episode today featuring Smart Eye Deputy CEO Dr. Rana El Kaliouby (formerly Co-Founder and CEO of Affectiva) speaking with Leïla Maidane:
Leïla is a socially-minded entrepreneur who puts technology at the service of economic mobility. With expertise in digital transformation, she launched Femmes Fières in 2019 to elevate businesses founded by female entrepreneurs using technology as a catalyst. In 2020, she founded InterSkillar, a startup revolutionising the transition between education and work for young people. Leïla's influence also extends to her position on the board of Agoria Brussels and her active role in advancing the digital transition for women entrepreneurs in Europe. She was voted Inspiring50 in 2022, Top 100 most influential female entrepreneurs in Europe in 2023 and is a member of the Belgium 40under40.
In this livestream, Rana and Leïla discussed the AI landscape in Europe, especially as it relates to female entrepreneurs, how technology can be a vehicle for economic and social mobility, and topics around ethics, equity and responsible use of AI. Let’s listen in to learn more.
Links of interest:
Today’s episode features Daniel Murray, Senior Vice President at the American Transportation Research Institute (ATRI). Dan has more than 30 years of experience in a broad range of transportation fields including trucking research and economics, transportation safety technologies and autonomous vehicles. At ATRI, Mr. Murray is responsible for managing ATRI’s transportation research, testing and evaluation activities, and leads multiple national activities including the U.S. DOT-sponsored Freight Mobility Initiative. He is a well-known freight subject-matter expert and is a keynote speaker at more than a dozen annual events.
Dan and I talked about top concerns in trucking fleet safety, challenges in fleet safety management, why safety matters, and critical technologies for fleet safety. Listen to learn more.
Links of interest:
How can eye tracking technology help improve flight safety?
In this episode, we explore the fascinating world of eye tracking and its profound impact on the aviation industry. We dive into the fundamentals of eye tracking and its crucial relevance in aviation, and present real-world case studies and testimonials showcasing its effectiveness.
Virtual Attendees Learned:
Links of Interest:
Today’s episode features Kate Monninger, VP of Campaign Analytics at MarketCast. A versatile research analyst with fifteen years of experience in supplier-side strategic consulting, Kate has proven expertise as a research manager overseeing the design, implementation and analysis of quantitative and qualitative projects on a global scale. Her current work focuses on theatrical campaign and franchise/brand messaging development, with experience also in the TV and OTT marketplaces. She has also conducted quant and qual research throughout North America, Europe, Asia and the Middle East.
MarketCast is a data and technology-driven research and insights firm serving CMOs and marketers at the world’s top brands, media companies, tech platforms, and sports and video games organizations. The company brings together a unique mix of primary research, AI, and big data to deliver full-funnel transparency for marketers. Their insights guide critical marketing decisions, helping brands determine which audiences to prioritize and product benefits to communicate, in addition to developing, launching, and measuring brand and advertising campaigns across media platforms.
Kate and I talked about some of the recent entertainment research she conducted on a very popular action movie franchise, including what she set out to understand from viewers of this content, the methodology for understanding user emotion, her learnings, and the data behind them. Listen to learn more.
Links of interest:
Today’s episode features a recent livestream event where we hosted Solmaz Shahmehr, Smart Eye VP of Research Instruments. We had a quick chat about the fundamentals of eye tracking, applications in aviation, and how eye tracking can help improve flight safety through pilot training research and future aircraft design. Let’s listen in to learn more.
Solmaz has been with Smart Eye since 2009 in different capacities – she is responsible for directing the Research Instruments organization, and charged with delivering a sustainable, profitable business model there. She also leads global sales, product development and product management, marketing, support & ops.
Links of interest:
Today’s episode features Matt Strafuss, Director of Product and Customer Solutions at Affectiva (now a Smart Eye company). Matt has been with Affectiva for almost 10 years in various roles. Affectiva, pioneer of Emotion AI, was acquired by Smart Eye in 2021 and has since continued its strong brand presence in the media analytics space. With a degree in Physics and Computer Science, Matt has worked on everything from software development and project management to customer success and everything in between.
His deep expertise in the space made him a natural candidate to interview about the newly announced Affectiva attention metric, which brings together Smart Eye’s eye tracking technology and Affectiva’s Emotion AI. Matt and I talked about the development of this metric, how it’s differentiated from other “Attention” measurement solutions on the market, some high-level mechanics of how it works, and where we may be going in the future. Listen in to learn more.
Links of Interest:
Today’s episode features Senior Neuroscience Product Specialist at iMotions, Nam Nguyen. With over 15 years of experience shaping cutting-edge research across diverse industries, Nam’s background encompasses everything from military and healthcare technologies, automotive UX, game design, and academic research. His expertise is matched only by his passion for sensors and technology. Currently advising research programs at iMotions, he brings cognitive neuroscience methodology to the forefront, backed by a history of rigorous scientific work and multiple peer-reviewed publications.
Nam and I talked about aviation and how cutting-edge technology like eye tracking and biometrics is revolutionizing the aviation industry. Drawing on insights gained from pilots and applications explored over the years in flight safety, training, and more - from Smart Eye Pro eye tracking to emotion analysis - we covered how iMotions enhances aviation tasks, monitors human-machine interaction, and ensures pilot readiness in the face of automation.
Join us as we gain insights into the intersections of neuroscience and technology for aviation. Let’s listen in to learn more.
Links of interest:
- [eBook] The Comprehensive Guide to Eye Tracking Technology for the Aviation Industry
- [Blog] How Are Simulations Used in Human Behavior Research
Today’s episode features the Founder & CEO of iMotions, and member of the Executive Management team in Smart Eye Group, Peter Hartzbech. Peter is an entrepreneur who is driven by using technology to make the world a better place. His company, iMotions, is a fully-integrated, hardware-agnostic software platform that allows researchers to use the power of any neuroscience technology, as well as traditional surveys and focus groups, to gain unparalleled insight into what people actually think and feel. Their customers use iMotions for everything from diagnosis of neurological diseases such as Parkinson’s, schizophrenia, autism and Alzheimer’s to personnel training, UX testing, advertising and military human research.
Peter and I talked about the future of biometric research, the cutting-edge advancements and trends that are reshaping the field, uncovering new possibilities and applications. We discussed the groundbreaking development of web-based eye tracking and its implications for attention measurement and eye movement metrics. Peter also shed light on the power of multi-modality, combining biometric signals to gain a comprehensive understanding of human behavior, as well as the exciting world of real-time analysis and its transformative impact on biometric research. Let’s listen in to learn more.
Links of interest:
Today’s episode features Dr. Bryan Reimer, Research Scientist at the MIT Center for Transportation and Logistics and MIT AgeLab, specializing in driver safety and mobility. With extensive experience in driver behavior research and a multidisciplinary approach, Dr. Reimer's work addresses the challenges of driver attention management, distraction, automation, and advanced driver assistance systems. His research informs technology development, business strategy, and public policy, making him a leading expert in the field.
We talked about the latest advancements in mitigating driver distractions through improved driver support, highlighting the role of vision sensor technology in enabling OEMs to make better decisions in supporting drivers. He emphasized the importance of effectively utilizing the information collected by driver monitoring systems to enhance driver support and provided examples of positive reinforcement techniques that can positively impact driver behavior. Let’s listen in to learn more.
Links of interest:
Today’s episode features Smart Eye Deputy CEO Dr. Rana El Kaliouby (formerly Co-Founder and CEO of Affectiva) moderating a discussion with Pernille Bülow, PhD, founder and CEO of the nonprofit Mind Blossom, and Loren Larsen of Videra Health.
On this last day of Mental Health Awareness Month, these three came together to put a spotlight on this issue and delve into the role technologies like Emotion AI can play in mental health. The panelists also discussed the ethical concerns surrounding the use of technology in mental health and how they can be addressed. Additionally, they examined the potential impact of this tech on mental health research and how it can be used to identify new treatments and therapies.
The discussion highlights the potential of recent technology advancements to revolutionize the way we approach mental health and underscores the importance of responsible and ethical use of this technology. Listen in to learn more.
Links of interest:
Today’s episode features a Q&A with our own Graham Page. Graham leads the Media Analytics business unit as Global Managing Director of Media Analytics at Affectiva, a Smart Eye company. Previously, as Executive VP and Head of Global Research Solutions at Kantar, he spent 26 years pioneering the integration of biometric and behavioral measures into mainstream brand and advertising research.
Over the course of the last year or so, there has been a thread of debate in the media regarding the validity and ethics of facial emotion recognition. This has often reflected the point of view of some data privacy groups who are concerned about the use of facial technologies across several use cases, or the opinions of commercial interests who offer alternative biometric technologies, or traditional research methodologies.
Scrutiny of emerging technologies is vital, and the concerns raised are important points for debate. Affectiva has led the development of the Emotion AI field for over a decade, and the use of automated facial expression analysis in particular. Listen in to learn more.
Links of interest:
Additional Sources Referenced:
[1] Barrett, Lisa Feldman, et al. "Emotional expressions reconsidered: Challenges to inferring emotion from human facial movements." Psychological Science in the Public Interest 20.1 (2019): 1-68.
[2] Ekman, Paul, and Wallace V. Friesen. "Facial Action Coding System." Environmental Psychology & Nonverbal Behavior (1978).
[3] Rosenberg, Erika L., and Paul Ekman, eds. What the Face Reveals: Basic and Applied Studies of Spontaneous Expression Using the Facial Action Coding System (FACS). Oxford University Press, 2020.
[4] Martinez, Brais, et al. "Automatic analysis of facial actions: A survey." IEEE Transactions on Affective Computing 10.3 (2017): 325-347.
[5] McDuff, Daniel, et al. "AFFDEX SDK: A cross-platform real-time multi-face expression recognition toolkit." Proceedings of the 2016 CHI Conference Extended Abstracts on Human Factors in Computing Systems. 2016.
[6] Bishay, Mina, et al. "AFFDEX 2.0: A real-time facial expression analysis toolkit." arXiv preprint arXiv:2202.12059 (2022). Accepted at the FG2023 conference.
[7] McDuff, Daniel, et al. "Predicting ad liking and purchase intent: Large-scale analysis of facial responses to ads." IEEE Transactions on Affective Computing 6.3 (2014): 223-235.
[8] Kodra, Evan, et al. "Do emotions in advertising drive sales?" https://ana.esomar.org/documents/do-emotions-in-advertising-drive-sales--8059.
[9] McDuff, Daniel, and Rana El Kaliouby. "Applications of automated facial coding in media measurement." IEEE Transactions on Affective Computing 8.2 (2016): 148-160.
[10] Teixeira, Thales, Rosalind Picard, and Rana El Kaliouby. "Why, when, and how much to entertain consumers in advertisements? A web-based facial tracking field study." Marketing Science 33.6 (2014): 809-827.
[11] McDuff, Daniel, et al. "Automatic measurement of ad preferences from facial responses gathered
Today’s episode features Dr. Michael Seiler, an expert in marketing and consumer behavior at the Raymond A. Mason School of Business at the College of William and Mary.
Dr. Seiler is an internationally recognized behavioral real estate researcher whose studies have been cited in the Wall Street Journal, NY Times, LA Times, and Washington Post. He has published over 185 research studies, has written several books, and is currently the #1 ranked real estate researcher, consulting in the areas of behavioral and experimental real estate and finance.
Dr. Seiler's research focuses on the intersection of technology, marketing, and consumer behavior, and he has a particular interest in how affective computing can be used to understand people's responses to financial products. In this episode, Mike shared his insights into how he used Affectiva’s Emotion AI technology to assess people's attitudes towards financial products by showing them content and measuring their emotional responses to it. Let’s listen in to learn more.
Links of interest:
Smart Eye recently hosted an assistive technology virtual event which dove deep into the ways individuals with disabilities use different access methods, such as eye gaze, to control computers, with a special focus on Smartbox and their groundbreaking Lumin-i eye tracker. In this webinar, we explored how this technology revolutionizes the way these individuals communicate and perform everyday activities, giving them the independence to live their lives.
So today’s episode features Solmaz Shahmehr, Smart Eye VP of Research Instruments and Neil Fitzgerald, Smartbox Product Manager discussing how eye tracking technology is being used in Smartbox’s Lumin-i eye tracker for Augmentative and Alternative Communication (AAC) devices, and what makes it unique.
Soli and Neil talked about the exciting world of eye gaze technology and discussed how it fits into these AAC devices, along with some of the incredible things it can achieve. You'll hear inspiring stories of how these communication aids have given those with disabilities the power to interact with friends and family, and communicate in ways they otherwise never thought possible. Listen in to learn more.
Links of interest:
What role can eye tracking play in building trust between pilots and the aircraft they fly? Can it help improve flight safety, or help train the next generation of pilots?
These are the questions tackled in today's episode featuring Karl Tschurtschenthaler and Simon Schwerd, both from the Institute of Flight Systems. Karl studied Biomedical Engineering and received his Master of Science degree in 2018. Afterward, he worked as a research engineer in the field of numerical modeling and has been a research associate and Ph.D. student at the Institute of Flight Systems since October 2020. His research objective is Pilot Activity Recognition using probabilistic graphical models based on real-time data (eye tracking and psychophysiological sensors). Simon studied engineering at the Technical University of Munich, earning bachelor's and master's degrees in mechanical engineering. At the Institute of Flight Systems, he developed future cockpit concepts for military aircraft and is pursuing his PhD in the estimation of Situation Awareness based on pilot monitoring via eye tracking. His general research interests include Operator & Pilot Assistance, Adaptive Automation and Human-Automation Integration.
Simon and Karl talked all about their work with eye tracking in aviation, including cooperative automation and human-automation-interaction in aircraft cockpits. Listen in to learn more.
Links of interest:
Today’s episode features a Q&A discussion between our own Sasha Mukhanova, Business Development and Account Director, Media Analytics at Affectiva (a Smart Eye company) and Stefan Ruff, Co-CEO & Co-Founder of Oculid.
Sasha is an experienced brand strategist with a passion for research and consumer insights. Through her work and extensive background as a creative domain lead at Kantar, she is a self-described “advertising nerd,” and for over a decade has been an expert in utilizing Emotion AI to help clients drive deeper consumer insights and create more engaging advertising campaigns.
Stefan is a leading expert in the field of eye tracking technology. With extensive experience in the advertising industry, Stefan has a deep understanding of the importance of understanding consumer emotions and behavior. His expertise in this area is born from years of research and hands-on experience, as he has worked with some of the world's leading brands, helping them to leverage eye tracking to gain a deeper understanding of their customers.
The two discussed a recent collaboration between Affectiva and Oculid to test a few ads, and covered some practical advice for businesses looking to incorporate technologies such as facial expression analysis, Emotion AI & eye tracking into their advertising strategy. Listen in to learn more.
Links of interest:
Today’s episode features prior guest Dr. Pernille Bülow, Product Specialist at iMotions where she consults and trains researchers on multimodal biometric data collection and study design. Penny speaks with Dr. Ryan Jane Jacoby, who is a licensed clinical psychologist at the Center for OCD and Related Disorders at Massachusetts General Hospital and an Assistant Professor of Psychology (Psychiatry) at Harvard Medical School. Her clinical & research interests focus on the behavioral/psychophysiological measurement of transdiagnostic psychological processes using multi-method approaches.
From investigations into clinical mental illness regarding bipolar disorder, depression and PTSD, to general health monitoring, education, and performance, multimodal biosensor studies have seen blossoming interest. These two experts covered how combining facial expression analysis, heart rate, eye tracking, skin conductance or other multi-sensor behavioral research techniques can have immediate implications on the clinical, diagnostic or intervention aspects of mental health research.
Listen to learn more.
Today’s episode features Jane Ostler, EVP Global Thought Leadership at Kantar, and Ecem Erdem, Global Manager – Creative at Kantar, speaking with Graham Page, our Global Managing Director of Media Analytics at Affectiva, a Smart Eye company.
As the connected world continues to grow, so do the opportunities for brands to engage with their audiences. Investment in digital media channels continues to increase, from online video to social media feeds, influencer content to online games, and Kantar’s Media Reactions study shows that people now feel more positive towards digital channels.
With the increasing reach and impact of digital channels, it is imperative for brands to get digital advertising right. Kantar’s Link ad testing database shows that digital ads that evoke strong emotions are 4 times more likely to generate impact than those with weaker emotional connections.
So how can you make the right emotional connections with your digital ads?
Listen to find out what makes digital campaigns resonate and help build brand equity. Experts from Kantar and Affectiva explore findings from their databases, Emotion AI analysis and share case studies to illustrate best practice. Listen in to learn more.
Links of interest:
Today’s episode features Rob Wesley, North American Sales Director at Smart Eye. Rob holds a Bachelor of Science in Forensic Science and Technology, and a Bachelor of Arts in Chemistry. He has spent his career at a number of automotive companies including Panasonic Automotive Systems and Bosch in roles varying from account management to strategic consulting. Rob has been with Smart Eye for 6 years, and has been an integral part in scaling its business development initiatives, working with researchers and clients to find the eye tracking solutions that best align with what they are trying to achieve.
We talked all about Human Factors: everything from analyzing behavior to providing insights in a wide variety of fields, making research, development and education progress towards a safer and better tomorrow. We got deeper into how some of the more technical aspects of Smart Eye’s Research Instruments eye tracking products work, as well as some use cases on how different organizations use our technology. Let’s listen in to learn more.
Links of interest:
Today’s episode features a Product Specialist at iMotions, Dr. Pernille Bülow. iMotions is a fully-integrated, hardware-agnostic software platform that allows researchers to use the power of any neuroscience technology, as well as traditional surveys and focus groups, to gain unparalleled insight into what people actually think and feel. Their customers use iMotions for everything from diagnosis of neurological diseases such as Parkinson’s, schizophrenia, autism and Alzheimer’s to personnel training, UX testing, advertising and military human research.
At iMotions, Pernille consults and trains academic and commercial researchers on multimodal biometric data collection and study design. Pernille finished her BS at UC Berkeley and completed her PhD in Neuroscience at Emory University where she studied brain mechanisms associated with the neurodevelopmental disorder Fragile X Syndrome. Pernille is passionate about sharing her knowledge to help others create ground-breaking science.
We talked about the use of biosensors in development (particularly in infants), some examples of where tools like eye tracking can be applied in mental health research, and how this technology can help the neurodivergent and neurotypical thrive. Listen in to learn more.
Links of interest:
Today’s episode features Dennis Nobelius, Chief Operations Officer of Polestar - a Swedish premium electric vehicle manufacturer.
Prior to this role, Dennis was CEO of Zenuity–a company that develops software for self-driving cars. He also worked as an MD for Volvo Cars in Switzerland, Program Leader for the all-new XC90 / S90 / V90 and the new SPA platform for Volvo Car Group, and Plant Director for the final assembly shop at Volvo Cars Torslanda.
He is the initiator and founder of MobilityXlab, a start-up business interface, which sees collaboration between Lindholmen Science Park, big Swedish companies like Ericsson, Zenseact, Volvo Cars and start-ups from all over the world. Dennis is passionate about high technology environments, sustainability, and authentic leadership, and holds a PhD in R&D Management.
We talked about the new release of the Polestar 3 - which has Smart Eye driver monitoring systems as standard, the user experience, technical challenges to overcome and general advice based on his vast experience. Listen to learn more.
Links of interest:
Mental health research takes many forms, but much of it is united by the application of biosensors and psychophysiological methodologies to better understand and improve patient experiences and outcomes.
Today's episode features iMotions Product Specialist Dr. Pernille Bülow and Deputy CEO of Smart Eye (formerly Co-Founder and CEO of Affectiva) Dr. Rana el Kaliouby.
The two had a great discussion about Emotion AI in mental health applications, the role of biosensors like eye tracking in mental health research, and where this technology is heading in the future.
Links of interest:
Today's episode features iMotions Product Specialist Dr. Pernille Bülow and Dr. Erick Bourassa, an Associate Professor in the Department of Biological Sciences and Physician Assistant Studies at Mississippi College.
iMotions is a fully-integrated, hardware-agnostic software platform that allows researchers to use the power of any neuroscience technology, as well as traditional surveys and focus groups, to gain unparalleled insight into what people actually think and feel. At iMotions, Pernille consults and trains academic and commercial researchers on multimodal biometric data collection and study design.
We are joined by iMotions customer Dr. Bourassa, who joined MC’s faculty in 2014 and currently teaches Vertebrate Histology, Pharmacology, Virology, Cell Physiology, Pharmacology of Infectious Diseases, and Pharmacology & Therapeutics. One area of his research expertise includes alterations of normal attention and memory in patients with anxiety disorders. Currently, his research efforts are focused on identifying the abnormal neural circuitry that induces and maintains anxiety in patients with anxiety disorders.
The three talked about mental health in education and performance research, the role of biosensors in mental health research and Dr. Bourassa’s work in studying test anxiety in students using biosensors. Listen to learn more.
Links of interest:
Today’s episode features a Product Specialist at iMotions, Dr. Pernille Bülow. iMotions is a fully-integrated, hardware-agnostic software platform that allows researchers to use the power of any neuroscience technology, as well as traditional surveys and focus groups, to gain unparalleled insight into what people actually think and feel. Their customers use iMotions for everything from diagnosis of neurological diseases such as Parkinson’s, schizophrenia, autism and Alzheimer’s to personnel training, UX testing, advertising and military human research.
At iMotions, Pernille consults and trains academic and commercial researchers on multimodal biometric data collection and study design. Pernille finished her BS at UC Berkeley and completed her PhD in Neuroscience at Emory University where she studied brain mechanisms associated with the neurodevelopmental disorder Fragile X Syndrome. Pernille is passionate about sharing her knowledge to help others create ground-breaking science.
We talked about Emotion AI in mental health applications, the role of biosensors in mental health research and where this technology is heading in the future. Listen in to learn more.
Links of interest:
Today’s episode features Smart Eye Deputy CEO Dr. Rana el Kaliouby talking about humanizing the in-cabin experience. In a technology-dominated world there is an increasing need to bridge the gap between man and machine. Making our interactions with technology more seamless and intuitive will enable us to lead smarter, more productive, healthier and happier lives.
To help us achieve this ambitious goal, Rana explores how human-centric AI delivers better automotive safety and more engaging mobility experiences that enhance comfort, wellness and entertainment. Drawing from her decades of work on Emotion AI, Rana will discuss the future of the in-cabin experience and the imperative to realize this in an ethical manner. Listen in to learn more.
Links of interest:
Today’s episode features Graham Page, Global Managing Director, Media Analytics at Affectiva, a Smart Eye company.
Graham and I discussed some takeaways from his recent presentation on how brands can leverage Emotion AI for powerful campaigns during difficult times, plus a deep dive into our recent product release around conversational engagement and valence metrics. Listen to learn more.
Links of interest:
Every year 1.3 million people around the world die in road crashes according to the World Health Organization. More than 20% of these fatalities are estimated to be alcohol-related – a global problem that demands comprehensive solutions.
Driven by technical enhancements in Artificial Intelligence and global regulatory efforts, Driver Monitoring Systems are fast becoming a leading human-centered automotive safety system. Initially focused on detecting distracted and drowsy driving, the foundations of these systems may offer keys to detecting and mitigating driver impairment.
Today the prevalent measure to determine alcohol intoxication – Blood Alcohol Concentration (BAC) – is a standard that was developed decades ago. Critical research is being conducted by government bodies, the automotive industry, technology companies and academia to determine more effective approaches.
In a recent event hosted by Smart Eye, we explored the state of alcohol intoxication research and opportunities to leverage evolving DMS technology to enhance road safety. What is the state of the art in impairment detection? Can we do more to mitigate harm with advanced technologies that exist today? What are the technological, societal and political challenges that will need to be overcome? When will we get there?
We gathered industry experts and leading researchers for a live panel discussion on the current state of intoxicated driving research. Listen to learn more.
Links of interest:
Today’s episode features Kevin Maney, partner at Category Design Advisors. Kevin is a journalist who has covered tech and society for 30 years, and the author of Play Bigger: How Pirates, Dreamers, and Innovators Create and Dominate Markets. He and his team guide leadership teams to help them define, develop and ultimately dominate a new category of business.
It was a great opportunity to speak with someone who has covered tech and AI as a journalist for so long; we talked about his background, including how he stumbled into the tech beat, wrote the de-facto history of IBM, and how companies can approach the idea of category creation. Listen to learn more.
Links of Interest:
Today’s episode features Detlef Wilke, Vice President of Automotive Solutions at Smart Eye. With a degree in electrical engineering, Detlef has over 25 years of experience within the automotive industry with deep technical expertise in driver monitoring and interior sensing systems.
Listen to him speak about what he is working on at Smart Eye, how customers are using Smart Eye interior sensing technology, how it works, and what he sees as future trends in the automotive technology industry.
Links of interest:
Today’s episode features Duncan Southgate, Senior Director, Creative and Media Solutions - Insights Division at Kantar. Duncan has over 25 years of brand, communications and media research experience gathered in various European, US and global roles with Millward Brown and Kantar. He is currently responsible for growing the company’s global creative and media effectiveness business, and his core focus is how media context and creative content can best work together.
We talked all about attention: what role expressiveness plays in relation to attention, where emotional engagement comes into play, and even how brands can “predict” attention and impact in their advertising.
Links of interest:
Today’s episode features the Founder & CEO of iMotions, and member of the Executive Management team in Smart Eye Group, Peter Hartzbech. Peter is an entrepreneur driven by using technology to make the world a better place. His company, iMotions, is a fully-integrated, hardware-agnostic software platform that allows researchers to use the power of any neuroscience technology, as well as traditional surveys and focus groups, to gain unparalleled insight into what people actually think and feel. Their customers use iMotions for everything from diagnosis of neurological diseases such as Parkinson’s, schizophrenia, autism and Alzheimer’s to personnel training, UX testing, advertising and military human research.
Peter and I talked about his backstory leading up to the genesis of iMotions, his prior synergies with Affectiva, and how facial expression analysis and eye tracking technology come together in the Smart Eye group. Listen to learn more.
Links of Interest:
Today’s episode features Dr. Eric Daimler, who is an authority in the Artificial Intelligence community with over 20 years of experience in the field. He currently leads MIT’s first-ever spinout from its Math department and has co-founded six technology companies that have pioneered work in fields ranging from software systems to statistical arbitrage. As a Presidential Innovation Fellow during the Obama Administration, Eric helped drive the agenda for U.S. leadership in research, commercialization, and public adoption of AI. Eric is a passionate technologist, and we dove deep into conversations about AI - the potential, algorithm regulation and much more.
It was great speaking with Dr. Daimler about compositionality and his work at Conexus. I loved his points on having “circuit breakers” for AI, and his philosophy that lifesaving AI innovations should be quickly adopted and embraced, while emphasizing the importance of bringing more people into the conversation around AI so that more people are comfortable with it, particularly with regard to bias and ethics in AI.
Links of interest:
Today’s episode features Neil Sahota, an IBM Master Inventor, United Nations A.I. Advisor, Chief Innovation Officer, and globally-recognized speaker and author. Neil is a founding member of the UN’s AI for Good Initiative, and I invited him to speak on how to “disrupt the box.” Through his work with Global Fortune 500 companies as a change maker, he created a disruptive thinking framework to show people how to think differently.
It was great speaking with Neil on his extensive background at IBM, and listening to his advice for entrepreneurs as an investor as well. His vast experience in the AI space was palpable throughout our conversation, yet he made the various topics very accessible and had some thought-provoking ideas on where he sees the future direction of AI heading. Listen in to learn more.
Links of interest:
The role of a pilot today is different from what it was 50 years ago. These days, most commercial aviation is largely automated – an innovation that has decreased the number of accidents drastically and made aviation one of the safest ways to travel. In one million take-offs, the average number of accidents is less than one. In the last few years, eye tracking technology has made its impact on the automotive industry. Gaze scanning has gone from being an exciting area of research to being installed in new car models all over the world, taking its place as one of the most important technologies for road safety.
Right now, a lot is pointing to a similar development in the aviation industry. Is eye tracking technology about to grow wings and move on to saving lives up in the air?
This episode features Ulf Lofberg and Björn Lindahl of Smart Eye. Björn’s educational background is electrical engineering, and he also has a Master’s in advanced IT. Today, Björn is product manager for the Research Instruments business area at Smart Eye, which largely involves identifying and developing business opportunities with new applications for Smart Eye’s products. On the sales side, Ulf has a degree in electrical engineering and a Master’s in telecommunications. After spending 20 years building mobile networks, he joined Smart Eye six years ago in a dramatic change of career path to work with the interesting technology we are building today.
Speaking with Ulf and Björn was a great opportunity not only to get a deeper technical understanding of how the Smart Eye products work for aviation, but also to explore the practical applications of our technology and our vision for improving flight safety in the future of the aviation industry. Let’s listen in to learn more.
Links of interest:
Would you watch the news online or via a mobile app if it were presented by something that looks human, but is in fact an artificially intelligent virtual human?
Today’s episode features Leon Hawthorne. Leon is a media executive, journalist and academic; a former CEO of two satellite TV channels, three cable stations, a TV production company and a dozen web channels. He created web TV channels for Boots, Borders and Waterstones, and advised the CEOs of Hearst Magazines, the Independent and London Evening Standard on digital content strategies.
In his journalistic career, he was a World News Anchor for both CNN International and CNBC Europe. For BBC News, he was a member of the parliamentary lobby, attending daily briefings at 10 Downing Street, reporting politics and producing current affairs documentaries for BBC One and BBC Radio 4. Leon is presently on an academic sabbatical, researching for a PhD at City, University of London, while lecturing in Media and Corporate Communication.
In our conversation, we discussed his PhD Research: ‘Talking Heads: The use of virtual human presenters in the delivery of personalised news content’. The experiment itself uses AI to detect how participants really feel about the images they see, instead of relying wholly on answers participants give on a questionnaire. After being granted permission, the cloud-based software accesses the participant’s webcam to analyse their microexpressions, as they watch the videos. Microexpressions are small, rapid movements of the facial muscles that psychologists believe betray subconscious emotional reactions.
The technology for the experiment was developed by Affectiva Inc., the pioneer of Emotion AI. The research is interested particularly in seeing how opinions vary, depending on the age and sex of participants, and also on how much they use smartphones and other new technologies. Anyone aged over 18, who has access to a computer with a webcam, can take part in the 10-minute online experiment.
Links of interest:
Today’s episode features Magnus Brunzell, Vice President and head of the Fleet & Aftermarket business unit at Smart Eye. With a Master’s in engineering, Magnus has years of experience working with automotive tier-1 suppliers such as Delphi/Aptiv, in roles spanning project management, engineering, sales & marketing, and managing director. I invited him to talk today about Smart Eye’s latest product, Applied AI Systems, or AIS.
AIS is actually more of a product line, with several variants to meet different needs. It is a Driver Monitoring System that detects the driver’s face, eyes and gaze while driving and can give warnings for behaviors such as distraction and drowsiness, filling a void in the market for commercial vehicles and the aftermarket. Listen in to learn more.
Links of interest:
Today’s episode features Deepak Varna, Head of Neuroscience Insights at Kantar–North America. Deepak has been working with Kantar for over 5 years to improve insight delivery from neuroscience tools, including client-specific training programs, with a special focus on facial coding. He has over 13 years of global experience in neuroscience in the areas of advertising, brand equity, and shopper solutions, and close to 20 years of experience in traditional qualitative and quantitative marketing research techniques. Previously, Deepak was Executive Vice President – Client Services at Nielsen NeuroFocus, where he led the international business and managed relationships with several global clients, making neuroscience a protocol in the areas of copy testing, packaging, product testing and POS testing.
In our conversation, we dove deeper into a recent webinar Kantar held on Diverse Reactions to inclusive advertising, some insights from which were uncovered using Affectiva’s Emotion AI methodology. Let’s listen in to learn more.
Links of interest:
Today's episode features our global team of working women at Smart Eye in honor of International Women’s Day. We posed three questions to our team, asking them to talk about their role, background and career journey, as well as what it means to them to be a woman in tech. Also, in response to this year’s International Women’s Day theme of “Break the Bias,” we asked for one recommendation they have to raise awareness against bias, or one action people can take for gender equality. Listen in to learn more.
Links of interest:
Today’s episode features Graham Page, Global Managing Director, Media Analytics at Affectiva and Vera Sidlova, Global Brand Manager, Creative at Kantar. Kantar is the world’s leading data, insights and consulting company. Combining their expertise in human understanding with advanced technologies, Kantar’s 30,000 people help the world’s leading organizations succeed and grow. Vera has been focused on creative effectiveness research and helps manage Kantar’s creative solutions portfolio.
Vera and Graham gave me a sneak peek at an upcoming joint webinar event on how brands can make their sustainability campaigns resonate, and shared some preliminary Emotion AI analysis findings from the Kantar and Affectiva databases. Listen to learn more.
To learn more, register for the sustainability in advertising webinar here:
https://kantar-webinars.zoom.us/webinar/register/4916437935917/WN_mvMk-gsqRsCnnpVqc7kEyA?
Today we have a very special episode featuring Dr. Taniya Mishra, CEO and Founder of SureStart. Dr. Mishra, who actually worked at Affectiva prior to starting her own company over a year ago, has been an AI scientist for over 10 years, with more than 88 awarded patents. Her company’s mission is to build early opportunity pipelines for a highly diverse tech workforce through technical skills training and project-based learning.
In our discussion, we covered Taniya’s background, her experience in starting the EMPATH internship program at Affectiva, and her passion about educating the talent of tomorrow. Let’s listen in to learn more.
Links of interest:
The next 10-20 years promise an amazing expansion of human behavioral research using sensing technologies, both commercially and academically. Technological advances will make it easier, faster, and more cost-effective to understand what’s driving human behavior and decision-making, of which more than 95% is known to occur below conscious awareness.
Smart Eye’s recent acquisition of Emotion AI pioneer Affectiva, and iMotions, the leading biosensor software platform have all joined forces to create a true, integrated powerhouse in delivering unparalleled insights into human behavior.
So today we have a very special episode featuring Dr. Rana el Kaliouby, former Co-Founder and CEO of Affectiva, currently Deputy CEO of Smart Eye, moderating a discussion with Smart Eye CEO Martin Krantz and iMotions CEO Peter Hartzbech. The three discussed how this new global organization will deliver unparalleled technology and a comprehensive analysis platform, enabling research to be conducted faster, cheaper, more easily, and in more challenging environments, as well as the opportunities for human behavioral research in automotive, academia and more.
Today's episode features Dr. Rana el Kaliouby, former Co-Founder and CEO of Affectiva, currently Deputy CEO of Smart Eye, speaking with Elizabeth Bramson-Boudreau, the Chief Executive Officer and publisher of MIT Technology Review.
Elizabeth leads the growth, expansion, and modernization of the Review’s media platforms and products. Elizabeth also serves as the Chairman and President of MIT Enterprise Forum - a non-profit organization with chapters worldwide. Prior to this role, Elizabeth was the global managing director of the Economist Corporate Network.
Dr. Rana el Kaliouby is a scientist, entrepreneur and an AI thought leader on a mission to humanize technology before it dehumanizes us. She is formerly the Co-Founder and CEO of Affectiva, an MIT spin-off and category-defining AI company. She is also an executive fellow at the Harvard Business School, where she teaches about AI and startups.
Elizabeth and Rana had a great conversation on how technology can still be a force for good—and how journalism can help keep companies accountable and help them deliver on their promises. Let’s listen in to learn more.
Today’s episode features Graham Page, Global Managing Director, Media Analytics at Affectiva.
In our conversation, we dove deeper into a recent press release announcing the latest version of Affectiva’s category-defining Emotion AI product. This update to its media analytics offering makes several new features available to its market research customers, including:
The unique insights we’ll be able to gather from these updates can be practically applied to improve brand experiences and communications - building a more positive and meaningful connection with consumers, and enabling optimization and action.
So today’s episode features Solmaz Shahmehr, VP and Head of Business Area Research Instruments at Smart Eye.
Solmaz has been with Smart Eye since 2009 in different capacities - she is responsible for directing the Research Instruments organization, and charged with delivering a sustainable, profitable business model there. She also leads global sales, product development and product management, marketing, support & ops.
In our conversation, we talked a lot about the Research Instruments business unit of Smart Eye: how our products there (like Smart Eye Pro) track, measure and analyze human eye movements to create a deeper understanding of human behavior, intentions and interactions all over the world. We also discussed the recently announced Smart Eye acquisition of iMotions, and what that means for the future of biometric research. Let’s listen in to learn more.
Today’s episode features Brian Pluckebaum, Sr. Automotive Marketing Manager at OmniVision. Brian is responsible for regional marketing, product marketing, and building business successes with key partners in OmniVision’s automotive segment. Previously, Pluckebaum worked in engineering and product marketing roles at NEC, Renesas, ST, and most recently Telechips. With more than 17 years’ experience in the semiconductor industry, he has held a number of positions in engineering, applications, and marketing.
We talked a lot about the technical workings and capabilities of OmniVision sensors within automotive specifically, and how a powerful collaboration between Smart Eye and OmniVision is shaping the future of interior sensing. Listen in to learn more.
Today’s episode features two industry experts from a company called VI-grade.
Guido Bairati: As Vice President Global Sales & Marketing, Guido helps VI-grade customers streamline their design process and bridge the gap between physical testing and simulation by implementing tools that allow CAE Engineers and Test Drivers to work together earlier in the automotive development phase.
Antonio (Tony) Spagnuolo: VP, Business Development at VI-grade. Tony has over 30 years’ experience in the international simulation industry. Prior to VI-grade, Tony led the worldwide sales team for HPC cloud computing start-up Rescale. He also led the aerospace sales team at MSC Software and managed operations at Nevada Automotive Test Center, an innovator in highly mobile ground vehicle technology for commercial and defense applications.
In our conversation, we talked a lot about how driving simulators can help accelerate product development, the complex “human” dynamics involved in vehicle simulation, and the role eye tracking and interior sensing plays in that. Listen to learn more.
Today’s episode features Vera Sidlova, Global Brand Manager, Creative at Kantar. Kantar is the world’s leading data, insights and consulting company. They understand more about how people think, feel, shop, share, vote and view than anyone else. Vera has been focused on creative effectiveness research and helps manage Kantar’s creative solutions portfolio.
Vera discussed how Affectiva’s advanced facial coding technology is integrated into Kantar’s methodology for ad testing to provide guidance on creative by understanding consumers’ unfiltered, moment-by-moment, emotional reactions to content. We also covered some joint research we did together, as well as some exciting studies coming up.
Links of interest:
The Power of Inclusive Portrayal in Advertising - How to Get it Right Playbook
Trends Analysis Report on Emotions in Advertising
Contact us for an Emotion AI demo
Today’s episode features our Media Analytics leaders Graham Page and Alex Duckett, joined by Two Ears One Mouth Director, Sarah Gorman. For over 10 years, Two Ears One Mouth has been using statistical evaluation and emotionally intelligent observation to drive more informed decision making, and guide some of the world’s brightest brands to an even stronger future. Sarah is an experienced researcher who has consistently delivered strategic insight in the areas of brand and communications.
Even as the world continues to reopen post-lockdown, it is safe to say that COVID has forever changed the research landscape. Entire methodologies shut down almost overnight, and for qualitative researchers who typically collect data through in-person sessions, there was an accelerated need for virtual alternatives.
With the adoption of new technology in a space that has typically remained true to traditional face-to-face interactions, there is of course some reluctance to change, and fear of technology replacing the need for qualitative researchers entirely. Listen to the conversation to learn more.
Today’s episode features Dougal Hawes, the Managing Director of Smartbox. Smartbox is doing some amazing work delivering AAC assistive communication solutions alongside training, support & repairs.
Dougal discussed the launch of Smartbox’s new product, Lumin-i. Lumin-i is powered by Smart Eye eye tracking technology, and this new assistive communication solution helps people with disabilities communicate using just their eye movements.
Read the joint press release here.
We are very excited to kick off our season finale episode featuring a conversation between Affectiva co-founder and now Deputy CEO Dr. Rana el Kaliouby, and Smart Eye CEO Martin Krantz.
You may have seen, but we were excited to announce that Smart Eye, the global leader in eye tracking and driver monitoring systems, acquired Affectiva! We are thrilled to join forces with Smart Eye, as our two companies combined will form a global AI powerhouse. By merging our highly skilled teams and industry-leading technologies, we’ll bring to market unmatched AI solutions for the automotive industry, media analytics and beyond—better and faster than any of our competitors can.
About Smart Eye: Headquartered in Gothenburg, Sweden, Smart Eye was founded by CEO Martin Krantz and his father more than two decades ago, with a mission to bridge the gap between human and machine.
Guided by his vision of bridging the gap between humans and machines, Martin contributes his 20+ years of experience to automotive innovation and a growing number of research projects, leading to new insights and technological advancement. Recognized on Bloomberg’s 2021 list of 50 Global Leaders, Martin is a world renowned expert in eye tracking and its commercial applications.
Listen in on Rana and Martin's conversation to learn more about our amazing synergies with Smart Eye, our shared vision, and the opportunities we see to humanize technology together.
Advertising has the ability to influence consumer behavior and can be instrumental in creating and reinforcing positive or negative stereotypes. The advertising industry plays an important role in leading change, and you don’t have to be an activist brand to start positively representing people in ads—it's relevant for all brands.
Today’s episode features our CMO Gabi Zijderveld and Graham Page, Affectiva’s Global Managing Director of Media Analytics, talking with Vera Sidlova, Global Brand Manager of Creative at Kantar, about some of the latest evidence about the power of inclusivity in advertising - and how to get it right.
We've got some great examples and learnings to share in our upcoming webinar on 5/27 where experts from Kantar and Affectiva will explore how inclusion and diversity in advertising has evolved, and the impact positive representation can have.
Today’s episode features Dr. Rana el Kaliouby and Graham Page from Affectiva talking with CloudArmy’s President / CSO Neuroscience, Thom Noble, about how Emotion AI paired with neuro implicit technologies can provide market researchers with richer and deeper insights.
Affectiva is excited to partner with CloudArmy, a cloud-based neuroscience research technology company. By integrating our technologies, we’re able to deliver enhanced analytics that provide a ‘full brain’ picture of consumer response, design and creativity.
Make sure you watch the kiss in the Cadbury chocolate ad case study, which sparked some public controversy and will be a key point during the discussion!
Today’s episode features Affectiva CEO Dr. Rana el Kaliouby interviewing Dr. Jinmo Lee, Senior Research Engineer at Hyundai.
Dr. Lee was always interested in working in industry to see how his research outcomes turn into products that change customers' lives. Hyundai offered him a job as a senior research engineer for vehicle aeroacoustics, and today he is in charge of future mobility UX concept design and engineering at Hyundai.
Then in 2018, MIT Media Lab and Hyundai launched the Special Interest Group for Emotion Navigation. Together with Hyundai and the Media Lab, Dr. Lee derived automotive AI concepts for sensing occupant emotion to help achieve their desired emotional state by providing an optimized in-cabin environment.
Affectiva CEO Dr. Rana el Kaliouby interviews Jennifer Haroon, Executive in Residence at Greylock Partners. Jennifer has had an illustrious career in automotive: most recently she was COO and interim CFO at Nauto, and prior to that, Head of Business Operations for Waymo.
This conversation was the result of Affectiva’s recent collaboration with the organization Women in Autonomy to help drive change by highlighting the success of amazing women in automotive. While many women are rising through the ranks and doing incredible things in this industry, they are still underrepresented in the highest positions.
So, we are hosting a new series of livestream events called “Women at the Wheel," together with Women in Autonomy, where Rana will have a dynamic conversation with a female leader in automotive, exploring their unique journey, the challenges they have had to navigate, how they’ve sought out mentors and allies, and paths to leadership.
Read more about Women in Autonomy.
Dr. Rana el Kaliouby and Graham Page from Affectiva talk with iMotions Vice President Of Product Management, Olee Jensen, about the launch of iMotions’ new online data collection tool.
This last year, many organizations have been forced to pivot and incorporate new technologies into their businesses as a result of the pandemic. Media analytics researchers who previously gathered data from in-person participant studies suddenly found themselves looking for a new solution in order to continue to move projects forward safely and efficiently.
Recognizing this immediate need within the market and for their clients, iMotions created their new Online Data Collection tool. Utilizing Affectiva's Emotion AI technology, this tool enables research organizations to continue to execute studies that integrate biometric, behavioral and self-report data virtually.
iMotions is a wonderful partner of ours, and this innovative product launch is really exciting: it not only helps market researchers navigate data collection during the pandemic, but could also pave the way for the future of data collection.
Today’s episode features Imat-uve CEO Hans Peter Schlegelmilch as well as software engineer Dennis Boghoff. Imat-uve is an innovative, independent development and engineering company based in Germany. The company has an innovation team that develops new product concepts. One such concept they are working on is around fragrance diffusion, using a person’s emotional and cognitive state (via Affectiva's Emotion AI) to trigger scents within vehicles.
Listen for more on what Imat-uve believes our combined technologies mean for the future of mobility.
Today’s episode features Affectiva CSO Andy Zeilman moderating a panel of experts: Dr. Bryan Reimer, Research Scientist at MIT, Caroline Chung, Senior Business Development Manager at Veoneer, and Richard Schram, Technical Director at Euro NCAP. This panel explored how AI and computer vision can improve the comfort, health and wellbeing of not only the driver, but also back seat passengers. They also touched on what personalized experiences vehicles will be able to deliver when they are able to sense human emotions, cognitive states and reactions.
Let’s listen in on their interview to learn more on how Euro NCAP, MIT, Affectiva and Veoneer are thinking about the next generation mobility experiences and the AI systems that fuel them.
Watch the full recording here
Today’s episode features Affectiva CEO Dr. Rana el Kaliouby moderating a panel of experts: Dr. Bryan Reimer, Research Scientist at MIT, Caroline Chung, Sr. Business Development Manager at Veoneer, and Richard Schram, Technical Director at Euro NCAP.
This panel explored how AI and computer vision can aid in child presence detection, seat belt detection, seat configuration, airbag deployment, and more. They also touched on multi-sensor approaches to developing advanced safety features and the importance of sensor placement, as well as the challenges car manufacturers and technology providers are facing as they bring these capabilities to market.
Download to watch the full recording here
Today’s episode features Affectiva CEO Dr. Rana el Kaliouby interviewing Sean Batir, Senior Machine Learning Engineer at BMW as part of her Virtual Girl Decoded Book tour.
Every day, there are more and more exciting developments in the automotive industry, with a lot of automation driving change in this space. Traditionally, the focus has been on what's going on outside of the vehicle. At Affectiva, we're passionate about turning it inward and really trying to understand what is happening inside the vehicle with our In-Cabin Sensing (ICS) technology: what's happening with the driver? What's happening with the other occupants in the car? What is the state of the cabin? How can we use that information to re-imagine what the future of our driving and riding experiences look like?
You can also read more at the episode blog post here: https://blog.affectiva.com/bmw-how-in-cabin-sensing-helps-build-the-ultimate-in-vehicle-experience
In our latest Affectiva Asks podcast episode, we feature Terence Scroope, Vice President of Insights and Solutions at Unruly. We spoke about some of the recent research work he has done, especially around gender stereotypes in US advertising, ads surrounding the Black Lives Matter movement, and much more.
Read more at the blog here: https://blog.affectiva.com/balancing-long-term-brand-effects-and-short-term-activation-with-unruly
Today’s episode features Dr. Ned Sahin, Founder and CEO of Brain Power. Dr. Sahin has a PhD in cognitive neuroscience from Harvard, a Master’s from MIT, and an undergraduate degree from Williams College, and he also spent some time at Oxford. He was recently featured in Affectiva CEO Dr. Rana el Kaliouby's book, Girl Decoded, as early in her career, Rana had explored how her Emotion AI technology could have applications for those on the autism spectrum.
This episode features a number of guests from Affectiva CEO Dr. Rana el Kaliouby’s virtual book tour. What is a Virtual Book Tour? Rana launched her book “Girl Decoded” in April 2020, in the midst of the COVID pandemic: so all of her planned book promotional events and travel were (rightfully) cancelled.
Yet we wanted to make the most of this by creating a standing Virtual Book Tour, where every week, Rana livestreams on her social profiles with some amazing guests. We’ve seen some lively conversations in these last few months, which have explored technology innovation, ethics in AI, advancing women in tech, and leadership.
So for today’s episode, we thought we could take a look back at some of the most memorable moments and guests from her book tour so far. Enjoy!
Today’s episode features John Pelliccio, Head of Product Communications, Automotive Systems at Bose Corporation. During the interview, he talks to us about his background, the challenges auto manufacturers face with regard to audio within the vehicle, and how understanding something as simple as audio can make a huge difference in your automotive experience.
We have a very special episode featuring Dr. Rana el Kaliouby, Co-Founder and CEO of Affectiva. Rana is also an AI thought leader, machine learning scientist, and now a published author. During the interview, she talks to us about her background and personal journey co-founding Affectiva, her new book “Girl Decoded,” which launched this month, and some of her personal stories around leadership - including our interview about 4 years ago!
In our latest Affectiva Asks podcast, we talk about human-centric AI with six speakers at Affectiva’s 2019 Emotion AI Summit: Rudina Seseri of Glasswing Ventures, Dr. Cory Kidd of Catalia Health, Dana Lowell of Faurecia, John Suh of Hyundai CRADLE, David Woessner of Local Motors, and Terah Lyons of The Partnership on AI.
In our latest Affectiva Asks podcast, we interview Affectiva Senior Product Manager Abdelrahman Mahmoud. During the interview, he talks to us a bit about his background in software engineering, what he sees as challenges within the automotive industry and how Affectiva’s Human Perception AI technology aims to enhance the occupant experience in next generation vehicles.
Download this eBook to learn more on how Human Perception AI technology can improve the in-cabin experience: http://go.affectiva.com/occupant-experience
Today’s episode features Dr. Jessica Wilson, Senior Product Specialist at iMotions. During the interview, she talks to us about how commercially available tools in neuroscience can help you define human behavior for yourself, how iMotions and Affectiva are working together, as well as a teaser of her upcoming technical workshop at the Emotion AI Summit, “More Data, Better Data: Defining Human Behavior with Biometrics”.
Read more at www.EmotionAISummit.com
Today’s episode features Danny Lange, VP of AI and Machine Learning at Unity. During the interview, he talks to us about his work with car companies at Unity, some of the projects and challenges he has seen OEMs and Tier 1s encounter, and his thoughts around technical hurdles to overcome around the design of autonomous vehicles.
Learn more about Unity's work in automotive here: https://unity.com/solutions/automotive-transportation-manufacturing
Today’s episode features our CEO Dr. Rana el Kaliouby interviewing Lisa Feldman Barrett, Distinguished Professor of Psychology at Northeastern University, where she focuses on the study of emotion and directs the Interdisciplinary Affective Science Lab. During their interview, they discuss Dr. Barrett’s background, her recent research paper on the challenges of inferring emotion from human facial movement, and how Affectiva is a leader in mapping contextual variability with multiple signals in our technology for detecting human emotions.
Today’s special crossover episode features Affectiva CEO Dr. Rana el Kaliouby interviewing Rob May, co-founder and CEO of Talla. During our interview, he talks to us a bit about his engineering background, his current work in the customer support automation space, and trends he is seeing as an angel investor.
Listen to Rob's interview of Rana on Talla's AI at Work podcast here: https://talla.com/resource/episode-50-humanizing-technology-with-rana-el-kaliouby-at-affectiva/
Today’s episode features Affectiva's Product Manager, Mike Gionfriddo. During the interview, he talks to us about his background in startups focused on in-vehicle technology, what he sees as challenges within the automotive industry and how Affectiva’s Human Perception AI technology plays into that.
Mike’s work at Affectiva really focuses on how the company can improve road safety, specifically with regard to our Human Perception AI offering. What I found most interesting about our chat was the balance that must be struck with AI: helping OEMs and Tier 1s keep costs down with a minimal processing footprint, while providing accurate, real-time estimates of emotions and cognitive states with highly efficient models.
Today’s episode features Affectiva's Co-Founder and CEO Dr. Rana el Kaliouby interviewing Karl Iagnemma, President of Autonomous Mobility at Aptiv. During their interview, he talks to us a bit about his background, his early work with nuTonomy (which has since been acquired by Aptiv), and how autonomous vehicle technology will transform the future of transportation and mobility.
Today’s episode features Gabi Zijderveld, Affectiva's Chief Marketing Officer and Head of Product Strategy interviewing Bryan Reimer. Bryan is a Research Scientist at MIT AgeLab & Associate Director of the New England University Transportation Center. Bryan is a thought leader in driver safety and the future of mobility. During their interview, he talks to us a bit about his work at MIT, some of the challenges of AI in automotive and how OEMs and Tier 1s can overcome them.
Today’s episode features Karen Osorio, Senior R&D Scientist at Procter and Gamble, and President + Co-Founder of Bag in The Back, an organization with the mission to increase awareness to parents and caregivers about the dangers of vehicular heatstroke.
What will the experience of future drivers or passengers look like in the cars of tomorrow? This is the question that Adam Emfield, senior manager of user experience at Nuance Automotive, studies. He heads the design research, innovation and in-vehicle experience (DRIVE) lab, which explores user experience questions around multimodal and intelligent automotive cockpits of the future.
Welcome to Affectiva Asks, our new podcast focused on all things related to human-centric AI. Affectiva Director of Marketing Ashley McManus joins CMO Gabi Zijderveld to introduce more about our podcast, why we are doing this, what type of content we'll be covering, and what else to expect.
How can we create trust in mobility? That is the question Ola Boström, Vice President of Research at Veoneer, wrestles with. Veoneer designs, compiles and sells software, hardware and systems for active safety, autonomous driving, occupant protection and brake control.
We recently caught up with Ola after a component of Veoneer’s CES 2019 demo featured Affectiva’s Emotion AI technology, to learn a little bit more about his work within the space of automotive safety electronics.