In this series of seminars, dedicated to the memory of Mauriana Pesaresi, a doctoral student in the Computer Science Department of the University of Pisa, first-year PhD students in Computer Science will present an open research problem related to their field of study. Each seminar is followed by a panel discussion.
Browse past editions: 2022 Edition, 2023 Edition, 2024 Edition, 2025 Edition.
For any further information, you can reach us via email.
16:00-17:00
Systems Biology studies biological processes from a holistic perspective, focusing on interactions between components rather than on single entities taken in isolation. Many different systems can arise, depending on the entities under consideration and the kinds of interactions between them. In this seminar, we focus specifically on metabolism, pointing out why its study is relevant and what kinds of questions can be posed about the dynamics emerging from the interactions. Furthermore, we highlight known limitations of traditional approaches and present some alternative methods developed to overcome them. The talk concludes by listing some open challenges in the field and their practical relevance.
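As a rough illustration of what "traditional approaches" often look like in this setting, the sketch below integrates a toy mass-action kinetic model of a three-metabolite pathway. The pathway, the rate constants, and the use of scipy are our assumptions for illustration, not taken from the talk:

```python
# Toy kinetic model of a linear pathway A -> B -> C under mass-action
# kinetics (our example, not the speaker's). Requires numpy and scipy.
from scipy.integrate import solve_ivp

k1, k2 = 1.0, 0.5  # assumed rate constants for the two reactions

def rhs(t, y):
    """Time derivatives of the concentrations [A, B, C]."""
    a, b, c = y
    return [-k1 * a, k1 * a - k2 * b, k2 * b]

sol = solve_ivp(rhs, t_span=(0.0, 10.0), y0=[1.0, 0.0, 0.0])
print(sol.y[:, -1])  # concentrations at t = 10: mass flows from A into C
```

Kinetic models of this kind require rate constants that are rarely known at genome scale, which is one commonly cited limitation of the traditional approach.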
16:00-17:00
No abstract available
16:00-17:00
No abstract available
16:00-17:00
No abstract available
15:00-16:00
Many of the most optimistic asymptotic speedups in quantum algorithms for linear algebra and quantum machine learning rely on a strong input assumption: coherent access to large classical datasets in superposition via quantum random access memory (qRAM). In this seminar, qRAM will be presented as an open problem at the theory-engineering boundary, spanning architecture, fault tolerance, and algorithmic complexity.
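For context, the qRAM primitive the abstract refers to is usually defined as the unitary that answers address queries in superposition (standard textbook notation, not taken from the abstract):

```latex
% A query loads the classical entries x_i into the data register,
% coherently over a superposition of addresses i.
\[
  \mathrm{qRAM}\colon\;
  \sum_{i=0}^{N-1} \alpha_i \, | i \rangle | 0 \rangle
  \;\longmapsto\;
  \sum_{i=0}^{N-1} \alpha_i \, | i \rangle | x_i \rangle
\]
```

The open problem is whether this unitary can be realized at scale with acceptable hardware cost and fault-tolerance overhead.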
15:00-16:00
No abstract available
16:00-17:00
No abstract available
16:00-17:00
No abstract available
16:00-17:00
No abstract available
16:00-17:00
No abstract available
16:00-17:00
This presentation provides a comprehensive overview of the current state of Serverless for HPC, based on a Systematic Literature Review (SLR) of 122 primary studies published between 2017 and early 2025. We examine the convergence of HPC and Cloud-Native paradigms, identifying key use cases dominated by generic framework development (55.7%) and Machine Learning inference (19.7%). Finally, we analyze the critical barriers hindering widespread adoption—including cold start latency, data communication bottlenecks, and state management inefficiencies—and outline the necessary research directions to enable next-generation, high-performance serverless platforms.
16:00-17:00
In this talk we focus on discrete linear optimization. Starting from a mathematical formulation, we discuss the computational complexity of these problems, highlighting both tractable cases and sources of hardness. We then present two main solution paradigms for discrete optimization problems: dynamic programming and branch-and-bound methods. For dynamic programming, we provide a formal mathematical framework and explain its underlying principles. For branch-and-bound, we introduce linear relaxations, cutting-plane ideas, and Lagrangian relaxation, discussing their role in improving bounds and algorithmic performance.
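As a minimal illustration of the dynamic-programming paradigm mentioned above, the sketch below solves the classical 0/1 knapsack problem via the Bellman recursion. The problem choice is ours, not necessarily the speaker's example:

```python
def knapsack(values, weights, capacity):
    """0/1 knapsack via dynamic programming.

    dp[c] = best total value achievable with capacity c,
    updated item by item (Bellman recursion).
    """
    dp = [0] * (capacity + 1)
    for v, w in zip(values, weights):
        # Iterate capacities downwards so each item is used at most once.
        for c in range(capacity, w - 1, -1):
            dp[c] = max(dp[c], dp[c - w] + v)
    return dp[capacity]

# Example: the optimum is 9, taking the items with values 4 and 5.
print(knapsack(values=[4, 2, 5], weights=[3, 1, 4], capacity=7))
```

The same problem also admits a branch-and-bound treatment, where a linear relaxation of the integrality constraints supplies the bounds used for pruning.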
16:00-17:00
This seminar investigates the functional architecture of Large Language Model (LLM) based AI agents, detailing their core components including planning, memory, and tool usage. We will differentiate between standalone AI agents and complex agentic AI systems, utilizing case studies to illustrate these distinctions. The presentation concludes by introducing edge computing, examining the challenges involved in enabling agentic frameworks in decentralized, resource-constrained edge environments.
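For concreteness, the sketch below shows a minimal plan-act-observe loop with memory and tool use, the core components named above. The scripted call_llm stand-in and the single search tool are hypothetical placeholders, not a real agent framework or model API:

```python
import json

# Scripted stand-in for an LLM call, so the loop below actually runs;
# a real agent would call a model API here (hypothetical, for illustration).
_SCRIPT = [
    {"tool": "search", "input": "seminar schedule"},
    {"final_answer": "done"},
]

def call_llm(prompt: str) -> str:
    return json.dumps(_SCRIPT[json.loads(prompt)["step"]])

TOOLS = {
    "search": lambda q: f"stub results for {q!r}",  # toy tool
}

def run_agent(task: str, max_steps: int = 5) -> str:
    memory = []  # short-term memory: past actions and observations
    for step in range(max_steps):
        prompt = json.dumps({"task": task, "memory": memory, "step": step})
        action = json.loads(call_llm(prompt))  # planning: pick the next action
        if "final_answer" in action:
            return action["final_answer"]
        observation = TOOLS[action["tool"]](action["input"])  # tool use
        memory.append({"action": action, "observation": observation})
    return "no answer within step budget"

print(run_agent("toy task"))
```

On a resource-constrained edge device, each of these components (the model call, the memory store, the tools) becomes a placement and latency decision, which is where the challenges discussed in the talk arise.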
16:00-17:00
Modeling graphs demands a careful balance between long-range propagation of information across nodes and the controlled dissipation of noisy or redundant signals, so as to ensure stable learning and generalization. This challenge is exacerbated in dynamic graphs, where structural and temporal information interact, leading to uncontrolled information accumulation and noise amplification that harm generalization. By framing graph neural networks (GNNs) as dynamical systems, this seminar investigates how existing techniques combine to govern the temporal evolution of node states, with a primary focus on the unresolved theoretical and architectural bottlenecks that remain open problems in current research.
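For reference, one common way to make the dynamical-systems framing precise (our notation, not necessarily the one used in the talk) is to view node states as evolving under a message-passing vector field:

```latex
% Node states x_u(t) evolve under a message-passing vector field;
% N(u) is the neighbourhood of u, sigma a pointwise nonlinearity.
\[
  \dot{x}_u(t) = \sigma\!\left( W \, x_u(t)
    + \sum_{v \in \mathcal{N}(u)} \Phi\bigl( x_u(t), x_v(t) \bigr) \right)
\]
```

Stability and dissipation properties of this system then correspond to how information accumulates or decays across layers and time, which is exactly where the bottlenecks discussed in the talk appear.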
15:00-16:00
Message-passing Graph Neural Networks have become a dominant paradigm for learning on graphs. At the same time, a growing body of work describes their limitations through notions such as oversmoothing, oversquashing, heterophily, and long-range information loss. While these concepts are widely used, their definitions and explanations often overlap. In this talk, we revisit these failure modes from multiple perspectives and ask a central question: what exactly breaks when deep message passing fails? We examine how these phenomena are defined and measured, and discuss whether commonly accepted explanations truly isolate the underlying mechanisms. Instead of proposing a new technique, the goal is to understand why deep message passing can struggle, and to open a discussion that connects graph learning with ideas from different areas.
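As one concrete example of how these phenomena are measured, the sketch below computes the Dirichlet energy of node features, a commonly used oversmoothing proxy (one of several in the literature), and shows it shrinking under repeated neighbour averaging on a toy path graph. The toy setup is ours, not the speaker's:

```python
import numpy as np

def dirichlet_energy(X, edges):
    """Sum of squared feature differences across edges; a common
    quantitative proxy for oversmoothing."""
    return sum(float(np.sum((X[u] - X[v]) ** 2)) for u, v in edges)

def smooth(X, edges):
    """One round of mean aggregation over neighbours (self-loop included):
    the message-passing step stripped of weights and nonlinearity."""
    n = X.shape[0]
    nbrs = {u: [u] for u in range(n)}
    for u, v in edges:
        nbrs[u].append(v)
        nbrs[v].append(u)
    return np.stack([X[nbrs[u]].mean(axis=0) for u in range(n)])

# Toy check on a path graph: the energy shrinks as layers stack,
# i.e. node representations become harder to tell apart.
edges = [(0, 1), (1, 2), (2, 3)]
X = np.random.default_rng(0).random((4, 8))
for layer in range(4):
    print(layer, round(dirichlet_energy(X, edges), 4))
    X = smooth(X, edges)
```

Whether a single scalar of this kind truly isolates the underlying failure mechanism is precisely the kind of question the talk puts up for discussion.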
16:00-17:00
Diffusion models are powerful generative tools, but in many structured domains realism is not enough: outputs must also satisfy explicit constraints. This seminar introduces the basics of diffusion models and examines how they can be guided toward valid solutions under geometric, relational, and symbolic requirements. Using floorplan generation as a running example, we explore methods such as conditional guidance, projected diffusion, and neurosymbolic approaches, while highlighting the trade-off between validity and diversity, especially in few-shot and out-of-distribution settings. The talk concludes by arguing that the real challenge is not just generating realistic samples, but combining probabilistic generation with stronger guarantees of correctness.
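As a sketch of one of the methods named above, projected diffusion can be summarized as interleaving each reverse-diffusion update with a projection onto the feasible set. The box constraint and the toy reverse update below are our placeholders for a real constraint set and a trained model:

```python
import numpy as np

def project(x, lo=0.0, hi=1.0):
    """Euclidean projection onto a box [lo, hi]^n; stands in for the
    (generally harder) projection onto the true constraint set."""
    return np.clip(x, lo, hi)

def projected_reverse_diffusion(denoise_step, x_T, n_steps):
    """Projected diffusion sketch: interleave each reverse-diffusion
    update with a projection back onto the feasible set."""
    x = x_T
    for t in reversed(range(n_steps)):
        x = denoise_step(x, t)  # model's reverse update (stand-in below)
        x = project(x)          # enforce the constraint at every step
    return x

# Toy stand-in for a trained model's reverse update: shrink towards 0.5.
toy_step = lambda x, t: 0.9 * x + 0.05
x0 = np.random.default_rng(1).random(4) * 3.0  # start outside the box
print(projected_reverse_diffusion(toy_step, x0, 10))  # entries in [0, 1]
```

The projection guarantees validity at every step, but it can also pull samples off the learned data manifold, which is one face of the validity-diversity trade-off highlighted in the abstract.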