University Courses

The contribution of the SAG group to educational activities reflects its core mission: to build a new generation of AI researchers and engineers through a deep and structured understanding of intelligent systems. Faculty members contribute to both undergraduate and graduate programs, offering courses that span the entire AI pipeline—from foundational programming and data technologies to the most advanced topics in machine learning, deep learning, and language understanding.

This broad yet coherent teaching portfolio ensures that students gain both the theoretical foundations and the practical skills needed to work with real-world AI systems. Our courses are closely aligned with the group’s research areas, allowing students to engage with state-of-the-art technologies and methodologies while developing a critical and creative mindset. The SAG laboratory frequently hosts internships, Bachelor’s and Master’s theses, and PhD theses, providing a fruitful and stimulating environment for AI research to students, PhD candidates, researchers, and industry innovation managers.

Artificial Intelligence

Instructor: Roberto Basili
Program: Bachelor’s in Computer Science, Year 3

This course introduces the foundations of Artificial Intelligence as a discipline, including autonomy in AI agents, automated reasoning, and machine learning. It emphasizes the role of logic and inference in knowledge representation for AI agents based on symbolic languages, and introduces the notions of intelligent agents, planning, and problem solving, along with a basic exposure to natural language processing and machine learning techniques. Theoretical insights are complemented by hands-on exercises using logic programming languages (e.g., Prolog) as well as machine learning frameworks (e.g., Weka, PyTorch).
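To give a flavor of the problem-solving techniques treated in the course, the sketch below shows uninformed search (breadth-first) over a toy state space. It is an illustrative example, not actual course material; the graph and function names are hypothetical.

```python
from collections import deque

def bfs(start, goal, neighbors):
    """Breadth-first search: returns a shortest path from start to goal, or None."""
    frontier = deque([[start]])   # queue of partial paths
    visited = {start}
    while frontier:
        path = frontier.popleft()
        node = path[-1]
        if node == goal:
            return path
        for nxt in neighbors(node):
            if nxt not in visited:
                visited.add(nxt)
                frontier.append(path + [nxt])
    return None

# Toy state space (hypothetical): nodes and their successors.
graph = {"A": ["B", "C"], "B": ["D"], "C": ["D"], "D": ["E"], "E": []}
path = bfs("A", "E", lambda n: graph[n])
print(path)  # ['A', 'B', 'D', 'E']
```

Because the frontier is a FIFO queue, the first path that reaches the goal is guaranteed to be among the shortest, which is the core property students verify in exercises on search strategies.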

Deep Learning

Instructor: Roberto Basili

Program: Master’s in Computer Science, Year 2

The course focuses on the paradigms of deep neural networks, covering algorithmic models from multilayer perceptrons and convolutional neural networks up to transformers and foundation models. It explores deep neural approaches to language and vision tasks, with special emphasis on effective paradigms such as attention mechanisms, encoder networks for language processing, decoder-only architectures, and generative models, for which prompting, instruction tuning, and adaptation are discussed. Applications serve as a basis for understanding the evolution of the field and range from advanced text processing and IR to biomedical AI, cybersecurity, and FinTech.
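The attention mechanism at the heart of transformers can be sketched in a few lines. The example below is a minimal, dependency-free illustration of scaled dot-product attention for a single query vector (the vectors are hypothetical toy data, not taken from the course):

```python
import math

def softmax(xs):
    """Numerically stable softmax over a list of scores."""
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    total = sum(exps)
    return [e / total for e in exps]

def attention(query, keys, values):
    """Scaled dot-product attention for one query over lists of key/value vectors."""
    d = len(query)
    # Similarity of the query to each key, scaled by sqrt(d).
    scores = [sum(q * k for q, k in zip(query, key)) / math.sqrt(d) for key in keys]
    weights = softmax(scores)
    # Output is the attention-weighted sum of the value vectors.
    return [sum(w * v[i] for w, v in zip(weights, values))
            for i in range(len(values[0]))]

out = attention([1.0, 0.0], [[1.0, 0.0], [0.0, 1.0]], [[1.0, 0.0], [0.0, 1.0]])
print(out)  # weights favor the first value, and sum to 1
```

The softmax guarantees the weights form a probability distribution, so the output is always a convex combination of the values; this is the intuition the course builds on before moving to multi-head attention.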

Information Retrieval

Instructor: Danilo Croce

Program: Master’s in Computer Science, Year 2

This course covers algorithms and models for indexing, ranking, and retrieving information from large document collections. Topics include vector space models, probabilistic retrieval, relevance feedback, and evaluation metrics. Students also gain hands-on experience with tools like Lucene, SOLR, and Python-based IR systems.
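The vector space model mentioned above can be illustrated with a self-contained TF-IDF and cosine-similarity sketch. This is an assumed minimal formulation for illustration (raw term frequency, log inverse document frequency), not the exact weighting used in the course or in Lucene:

```python
import math
from collections import Counter

def tf_idf_vectors(docs):
    """Build sparse TF-IDF vectors (dicts) for a tokenized document collection."""
    n = len(docs)
    df = Counter(term for doc in docs for term in set(doc))      # document frequency
    idf = {t: math.log(n / df[t]) for t in df}                    # inverse doc frequency
    return [{t: c * idf[t] for t, c in Counter(doc).items()} for doc in docs]

def cosine(u, v):
    """Cosine similarity between two sparse vectors represented as dicts."""
    dot = sum(w * v.get(t, 0.0) for t, w in u.items())
    nu = math.sqrt(sum(w * w for w in u.values()))
    nv = math.sqrt(sum(w * w for w in v.values()))
    return dot / (nu * nv) if nu and nv else 0.0

# Toy collection (hypothetical): documents as token lists.
docs = [["information", "retrieval", "models"],
        ["neural", "ranking", "models"],
        ["operating", "systems"]]
vecs = tf_idf_vectors(docs)
print(cosine(vecs[0], vecs[1]), cosine(vecs[0], vecs[2]))
```

Documents sharing weighted terms (here, "models") get a positive similarity, while disjoint documents score zero, which is the ranking principle students later see at scale in Lucene and SOLR.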

Operating Systems and Networks

Instructor: Danilo Croce

Program: Bachelor’s in Computer Science, Year 2

Students learn the core principles of operating systems, including process management, memory organization, file systems, and I/O handling. The course also covers fundamental networking concepts, from the application layer to physical infrastructure. It builds a solid foundation for understanding distributed and concurrent systems.

Data and Knowledge-based Systems

Instructor: Roberto Basili

Program: Bachelor’s in Internet Engineering and Bioinformatics, Year 2

The course introduces the basics of data management, from conceptual design to the engineering and release of three-tier data-driven applications. Conceptual design is studied through Entity-Relationship languages as a basis for domain modeling and database engineering. SQL is introduced as a general data definition and query language for relational databases. The course also includes technology practice sessions on modeling real-world applications and knowledge-based system models. To this end, a basic introduction to languages and interfaces for interacting with RDBMSs, such as XML, JDBC, or PHP, is provided in laboratory work dedicated to designing and testing simple Web-based applications and data-driven architectures.
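The kind of relational modeling and SQL querying practiced in the course can be sketched with Python's standard `sqlite3` module. The schema and data below are hypothetical, chosen only to show data definition, insertion, and a join with aggregation:

```python
import sqlite3

# In-memory database with a toy Student-Exam schema (illustrative only).
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE student (id INTEGER PRIMARY KEY, name TEXT)")
cur.execute("CREATE TABLE exam (student_id INTEGER REFERENCES student(id), "
            "course TEXT, grade INTEGER)")
cur.executemany("INSERT INTO student VALUES (?, ?)", [(1, "Ada"), (2, "Alan")])
cur.executemany("INSERT INTO exam VALUES (?, ?, ?)",
                [(1, "Databases", 30), (1, "AI", 28), (2, "Databases", 27)])

# Join and aggregate: the average grade of each student.
rows = cur.execute(
    "SELECT s.name, AVG(e.grade) FROM student s "
    "JOIN exam e ON e.student_id = s.id "
    "GROUP BY s.id ORDER BY s.name"
).fetchall()
print(rows)  # [('Ada', 29.0), ('Alan', 27.0)]
conn.close()
```

The same join-and-aggregate pattern transfers directly to the JDBC-based laboratory work, where queries are issued from application code rather than an interactive shell.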

Programming and Programming Lab (Python)

Instructor: Daniele Margiotta (SAG collaborator)

Program: Master’s in Bioinformatics

The course introduces students to Python programming, focusing on variables, control structures, functions, and data manipulation. Lab sessions support the development of basic applications and algorithmic thinking. It establishes the computational foundation needed for more advanced AI and data courses.
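A minimal sketch of the constructs listed above (variables, loops, conditionals, and functions) might look as follows; the function and data are hypothetical examples, not course assignments:

```python
def grade_stats(grades):
    """Return (mean, best) for a non-empty list of exam grades."""
    total = 0
    best = grades[0]
    for g in grades:        # control structure: iteration
        total += g
        if g > best:        # control structure: conditional
            best = g
    return total / len(grades), best

mean, best = grade_stats([27, 30, 24, 28])
print(mean, best)  # 27.25 30
```

Exercises of this kind build the algorithmic-thinking foundation that later bioinformatics and AI courses assume.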

Mobile Application Development

Instructor: Danilo Croce

Program: Bachelor’s in Computer Science, Year 3

This course focuses on building Android applications using Java, covering mobile interfaces, graphics, communication management, and server-side integration (JSP, REST). Students design and develop realistic mobile apps, learning to manage architectural constraints and deliver professional-quality solutions.

TeachingLabs – Educational Resources on LLMs and NLP

GitHub repositories:

  • Advances in AI 2024 – Summer School Materials:
    •  https://github.com/crux82/advances-in-ai-2024
  • CLiC-it 2023 Tutorial – Instruction Tuning with LLMs:
    •  https://github.com/crux82/CLiC-it_2023_tutorial
  • BISS 2024 – Bertinoro International Spring School Materials:
    •  https://github.com/crux82/BISS-2024

TeachingLabs is a curated set of educational resources developed within national events and advanced schools on Artificial Intelligence and NLP. These include lecture materials, interactive notebooks, and practical labs focused on large language models (LLMs), instruction tuning, prompt engineering, and semantic reasoning. The resources were used in high-level training contexts such as CLiC-it 2023, the AIxIA Summer School 2024, and the Bertinoro International Spring School. These hands-on materials support students and researchers in exploring state-of-the-art NLP tools like BERT, LLaMA, and PEFT-based architectures.