8:30-9:00 am
Registration and Breakfast, Culinary Arts Center
9:00-9:50 am
Welcome and Morning Keynote, Culinary Arts Center
Implementing Predictive Analytics So They Help, Never Harm, Students
- Iris Palmer, Senior Advisor for Higher Education and Workforce, New America
As higher education grapples with promoting student success using fewer resources, predictive analytics—the use of past data to forecast future outcomes—is a promising solution. But like all powerful tools, it must be used well. New America has conducted research into what it looks like to implement predictive analytics ethically. This talk will present some of the challenges of implementing predictive analytics from recruiting and enrollment through graduation. It will also provide guiding practices for ensuring these tools are used ethically.

10:00-10:30 am
Session I – Power Talks – Culinary Arts Center
Ethics for a New Era - the AIR Statement of Ethical Principles
This session will introduce AIR's new Statement of Ethical Principles and provide background on how and why it was developed. The Principles themselves will be explained, and the ways in which the Statement will be implemented will be discussed.
MHEC Data Suppression Policy
- Barbara Schmertz, Director, Office of Research and Policy Analysis, Maryland Higher Education Commission

10:45-11:45 am
Session II – Chesapeake Hall
Making Your Data Count: A Taxonomy, Process, and Rubric to Achieve Broader Institutional Impact
- Dr. Jennifer Harrison, Associate Director for Assessment, University of Maryland Baltimore County
- Dr. Sherri Braxton, Senior Director of Instructional Technology, University of Maryland Baltimore County
Assessment technologies can help contextualize learning analytics with student learning outcome evidence, but how can institutions integrate these data? Institutions need tools that integrate multiple measures of student success—especially direct evidence—to deepen insights about student learning. To bridge student success and outcomes data, we need software that enables institutions to aggregate outcomes data by rolling up direct evidence to the institutional level. Our session explores technologies that enable institutions to systematize outcomes data, so direct learning evidence can add depth and nuance to learning and predictive analytics and deepen our understanding of student learning. Our goal is to help faculty, staff, and other campus leaders create a culture of data-informed decision making by interacting with three tools we created to help institutional leaders begin to systematize learning assessment data: a taxonomy, a process, and a rubric.
Implementing a Data Mart to Support Self-Service Reporting
- David Miller, Senior Database Report Writer, University of Maryland Global Campus
- Renée Teate, Director of Data Science, HelioCampus
IPEDS Update
NCES will provide an update on changes to the IPEDS data collection and to IPEDS tools, including the new Data Explorer.

12:00-12:50 pm
Lunch and Business Meeting, Culinary Arts Center

12:50-1:30 pm
Afternoon Keynote, Culinary Arts Center
Using Data for Student Success: How Can We Move the Needle on Equity?
Institutional researchers are on the front lines of data use and analysis on campus. They play a pivotal role in helping their communities understand student pathways so we can move toward equitable student success. To close attainment and achievement gaps on campus and nationwide, we must harness this information to understand how we can improve processes and programs and better serve all students.

1:45-2:45 pm
Session III – Chesapeake Hall
Replacing Institutional Myths with Facts: Best Practices in Sharing Information
- Vinnie Maruggi, Director of Institutional Research, Planning and Effectiveness, Chesapeake College
- Michelle Appel, Director of Assessment and Decision Support, University of Maryland College Park
- Natalie Crespo, Director of Institutional Research, Carroll Community College
- Caleb Rose, Senior Researcher for Institutional Effectiveness, Frederick Community College
Institutional research has advanced to the stage where institutions can easily reach into their data, slice and dice it, and make it look great. Yet the wealth of IR data often goes unnoticed while staffers continue to cite inaccurate statistics that make IR professionals cringe. This session will feature brief discussions by four MdAIR members, who will offer examples of what their institutions do (and don't do) to get information to the decision makers who really need it. Ample time will be allowed for audience members to share their experiences, making the session a blend of panel and roundtable discussion.
An "Eventful" Journey - How UMBC moved from Event to Engagement Tracking
- Kevin Joseph, Senior Director, Business Intelligence, University of Maryland Baltimore County
- Ken Schreihofer, Manager, BI and Student Success Technologies, University of Maryland Baltimore County
In 2015, UMBC's idea of tracking attendance at events started and ended with tally clickers. The Divisions of Information Technology and Student Affairs embarked on a project to build an event attendance tracking system that has since spread across the university to serve many administrative and academic purposes. With that as a launching point, we have built out a model for engagement that goes beyond event attendance, giving us a richer understanding of student behavior. This session will cover the history of the project, some use cases and example reports, and our future direction.
Improve Institutional Effectiveness with VSA Analytics: Create custom peer groups and benchmark KPIs using data on more than 4,400 institutions
VSA Analytics provides access to a robust benchmarking tool for institutions to create custom peer groups to examine trends on enrollment, retention/graduation rates, net price, debt, and R&D expenditures. This session demonstrates how examining multiple metrics on equity and student success across institutions can lead to insight and promote strategic planning. Supported by the Association of Public and Land-grant Universities (APLU) and the American Association of State Colleges and Universities (AASCU), VSA Analytics offers 9 years of data from 4 national data sources (IPEDS, the College Scorecard, NSF, and the Student Achievement Measure) on over 4,400 institutions.