Title: Integrating Natural Language and Visualizations for Exploring Data on Smartwatch
Author: Varadarajan, Kaavya
Supervisor: Prince, Enamul Hoque
Type: Electronic Thesis or Dissertation
Dates: 2024-04-11; published online 2024-07-18
URI: https://hdl.handle.net/10315/42147
Subjects: Artificial intelligence; Information technology
Keywords: Data visualization; Smartwatch; ChatGPT; Voice interaction; Charts
Rights: Author owns copyright, except where explicitly noted. Please contact the author directly with licensing requests.

Abstract:

Smartwatches are increasingly popular for collecting and exploring personal data such as health, stock, and weather information. However, presenting such data through micro-visualizations is challenging because of the limited screen size and constrained interactivity of the device. To address this problem, we propose integrating natural language (voice) with micro-visualizations (charts) to enhance user comprehension and insight. Leveraging a large language model such as ChatGPT, we automatically summarize micro-visualizations and combine these summaries with audio narrations and interactive visualizations to help users understand their data. A user study with sixteen participants suggests that combining voice with charts yields better accuracy, preference, and usefulness than presenting charts alone. These findings highlight the efficacy of integrating natural language with visualizations on smartwatches to improve user interaction and data comprehension.