BEGIN:VCALENDAR
VERSION:2.0
X-WR-CALNAME:EventsCalendar
PRODID:-//hacksw/handcal//NONSGML v1.0//EN
CALSCALE:GREGORIAN
BEGIN:VTIMEZONE
TZID:America/New_York
LAST-MODIFIED:20240422T053451Z
TZURL:https://www.tzurl.org/zoneinfo-outlook/America/New_York
X-LIC-LOCATION:America/New_York
BEGIN:DAYLIGHT
TZNAME:EDT
TZOFFSETFROM:-0500
TZOFFSETTO:-0400
DTSTART:19700308T020000
RRULE:FREQ=YEARLY;BYMONTH=3;BYDAY=2SU
END:DAYLIGHT
BEGIN:STANDARD
TZNAME:EST
TZOFFSETFROM:-0400
TZOFFSETTO:-0500
DTSTART:19701101T020000
RRULE:FREQ=YEARLY;BYMONTH=11;BYDAY=1SU
END:STANDARD
END:VTIMEZONE
BEGIN:VEVENT
CATEGORIES:College of Engineering,Thesis/Dissertations
DESCRIPTION:Committee Members: Dr. Firas Khatib, Computer and Information Science Department, University of Massachusetts Dartmouth; Dr. Christopher Hixenbaugh, Naval Undersea Warfare Center. Date & Time: 05/14/2026 (Thursday), 10:30 AM - 11:30 AM (Eastern Time). Room: DION 311. Abstract: Second-generation neural networks have evolved in recent years into more complex architectures such as spiking neural networks and quantum neural networks. However, the computational resource restrictions of neural networks on edge devices remain challenging. This thesis investigates stable learning and compute-resource efficiency in spiking neural networks and quantum neural networks. Other common qualities, such as high performance (e.g., high accuracy, high reward), robustness, convergence, predictability, and fast running times, were also considered in one or more studies. The thesis makes several contributions. The first study used audio data; one goal was to verify whether a trend called temporal information concentration is present in the spiking neural network. We also gathered other findings related to temporal information dynamics, such as dataset complexity impacting Fisher Information. The second study, on multimodal spiking neural networks, explored the effects of audio and image noise. The results show the multimodal model outperformed its unimodal counterparts, but certain configurations of image noise, audio noise, and noise levels performed better than others. A third study on spiking neural networks revealed that temporal information concentration was not present in quantization-aware-training variants, but an increase in Fisher Information was found in those variants. In one of the quantum neural network studies, on reinforcement learning, we found faster initial convergence, a longer decrease in standard deviation and policy entropy, and several correlations related to average reward and policy entropy. In the second study on quantum neural networks, structured pruning was found to sharpen decisiveness and reveal bad pruning paths, while overparameterization can help exploration. All these studies address maintaining or improving stable learning while keeping models computation-resource efficient enough to be practical. All CIS and Data Science Graduate Students are encouraged to attend. For further questions please contact Dr. Yuchou Chang at ychang1@umassd.edu\nEvent page: /events/cms/stable-and-compute-resource-efficient-learning-with-spiking-and-quantum-neural-networks-methods-and-insights.php
X-ALT-DESC;FMTTYPE=text/html:


Committee Members:
Dr. Firas Khatib\, Computer and Information Science Department\, University of Massachusetts Dartmouth
Dr. Christopher Hixenbaugh\, Naval Undersea Warfare Center


Date & Time: 05/14/2026 (Thursday)\, 10:30 AM - 11:30 AM (Eastern Time)
Room: DION 311


Abstract:


Second-generation neural networks have evolved in recent years into more complex architectures such as spiking neural networks and quantum neural networks. However\, the computational resource restrictions of neural networks on edge devices remain challenging. This thesis investigates stable learning and compute-resource efficiency in spiking neural networks and quantum neural networks. Other common qualities\, such as high performance (e.g.\, high accuracy\, high reward)\, robustness\, convergence\, predictability\, and fast running times\, were also considered in one or more studies. The thesis makes several contributions. The first study used audio data\; one goal was to verify whether a trend called temporal information concentration is present in the spiking neural network. We also gathered other findings related to temporal information dynamics\, such as dataset complexity impacting Fisher Information. The second study\, on multimodal spiking neural networks\, explored the effects of audio and image noise. The results show the multimodal model outperformed its unimodal counterparts\, but certain configurations of image noise\, audio noise\, and noise levels performed better than others. A third study on spiking neural networks revealed that temporal information concentration was not present in quantization-aware-training variants\, but an increase in Fisher Information was found in those variants. In one of the quantum neural network studies\, on reinforcement learning\, we found faster initial convergence\, a longer decrease in standard deviation and policy entropy\, and several correlations related to average reward and policy entropy. In the second study on quantum neural networks\, structured pruning was found to sharpen decisiveness and reveal bad pruning paths\, while overparameterization can help exploration. All these studies address maintaining or improving stable learning while keeping models computation-resource efficient enough to be practical.


All CIS and Data Science Graduate Students are encouraged to attend.


For further questions please contact Dr. Yuchou Chang at ychang1@umassd.edu

Event page:

DTSTAMP:20260426T011646
DTSTART;TZID=America/New_York:20260514T103000
DTEND;TZID=America/New_York:20260514T113000
LOCATION:Dion 311
SUMMARY;LANGUAGE=en-us:Stable and Compute-Resource Efficient Learning with Spiking and Quantum Neural Networks: Methods and Insights
UID:c2842657704d7ce8fd35cf5f75b173e3@www.umassd.edu
END:VEVENT
END:VCALENDAR