In December 2025 I participated in the Advent of OR (https://adventofor.com), a 24-day exercise that guided participants through an optimization project. Instead of just solving problems and building models, the Advent of OR walked through an entire project life cycle using the INFORMS Analytics Framework.
While I am not part of the target audience of students and early-career analysts, I took part, and I had three goals.
1. Use a new programming toolkit. I used VS Code with R and Quarto. I usually use RStudio and wanted to try R in VS Code. I also think Quarto is the future: a replacement for Jupyter Notebooks on the Python side and a natural evolution of R Markdown.
2. Practice optimization. In the Operations Research world, I am NOT an optimization person. My thesis was in applied probability (queueing), and my methods research has been in simulation (one stream in ranking & selection and another in Bayesian methods for input modeling).
3. Use Generative AI. I wanted to see how generative AI does on an operations research project, and I wanted to do it right, in a setting where I could give it references to guide it. Note: I have found that generative AI favors descriptive statistics, machine learning, and hypothesis-based statistics over other forms of analytics, so it needs some guidance.
Toolkit
I had to set up VS Code with the R extension, Quarto (and its extension), ompr, and GLPK with the associated R ROI packages. To make sure everything worked, I downloaded the repository for OR_using_R by Tim Anderson. Then, to render the book (meaning I made sure all the code ran), I had to install texlive with xetex and extra fonts. Generative AI (I had Gemini CLI installed) was very helpful with all of these system administration tasks, since it could figure out what was needed every time there was an error message.
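For anyone wanting to reproduce the R side of the setup, it came down to roughly the following (a minimal sketch; the exact steps for the GLPK system library and texlive will vary by operating system):

```r
# Modeling stack: ompr for building models, ROI plus the GLPK plugin as the
# solver back end, and ompr.roi to connect the two.
# Note: the GLPK system library itself may need to be installed separately
# (for example via the OS package manager) before ROI.plugin.glpk will build.
install.packages(c("ompr", "ompr.roi", "ROI", "ROI.plugin.glpk"))

# Quarto is installed separately (https://quarto.org); the quarto R package
# just provides a render wrapper callable from R.
install.packages("quarto")
```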
Data Analysis and Optimization
Working with the data sets, the generative AI read in the data (I had to give it some corrections along the way to help it recognize the data types). Once the files were read in, it recognized that the data sets did not correspond in granularity. In the R Markdown file it created, in addition to the code that read in the data and produced summaries, it also identified a number of questions and concerns about the data and drafted questions for the stakeholder. This was a good set of questions that corresponded to what other participants put forward.
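To illustrate the kind of corrections involved, the pattern looked something like this (the file names, column names, and granularities here are hypothetical placeholders, not the actual Advent of OR data):

```r
library(readr)
library(dplyr)

# Explicit column types so readr does not guess -- this was one of the
# corrections I had to feed back to the AI.
demand <- read_csv(
  "data/demand_daily.csv",            # hypothetical file
  col_types = cols(
    site   = col_character(),
    date   = col_date(format = "%Y-%m-%d"),
    demand = col_double()
  )
)

capacity <- read_csv(
  "data/capacity_weekly.csv",         # hypothetical file
  col_types = cols(
    site     = col_character(),
    week     = col_integer(),
    capacity = col_double()
  )
)

# The two tables do not share a granularity (daily vs. weekly), so one has
# to be aggregated before they can be joined for the model.
demand_weekly <- demand %>%
  mutate(week = as.integer(format(date, "%V"))) %>%
  group_by(site, week) %>%
  summarise(demand = sum(demand), .groups = "drop")

model_data <- inner_join(demand_weekly, capacity, by = c("site", "week"))
```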
It also did well with the optimization. Given an optimization textbook as a reference, I first asked the generative AI for a mathematical formulation based on the project description. It also laid out a process for determining what kind of problem this was and worked through that process to conclude that this was a linear programming problem.
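The formulation that came back was, in structure, a standard linear program. In generic terms (the actual sets, parameters, and constraints of the project are not reproduced here), it has the familiar form:

$$
\begin{aligned}
\min_{x}\quad & \sum_{j} c_j x_j \\
\text{subject to}\quad & \sum_{j} a_{ij} x_j \le b_i \quad \text{for all } i \\
& x_j \ge 0 \quad \text{for all } j
\end{aligned}
$$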
Next was an LP formulation using ompr. The first formulation was straightforward. I then had the generative AI break the formulation out into its own R script, to enforce a separation of concerns between the data handling, the optimization model, and the output processing.
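A sketch of what that separated model script looked like, written against the generic formulation above (the function name and arguments here are placeholders, not the actual project model):

```r
# model.R -- keeps the optimization model separate from the data handling
# and the output processing.
library(ompr)
library(ompr.roi)
library(ROI.plugin.glpk)

#' Build and solve a minimization LP: min c'x s.t. A x <= b, x >= 0.
#'
#' @param cost numeric vector of objective coefficients (length n)
#' @param a    numeric constraint matrix (m x n)
#' @param b    numeric right-hand-side vector (length m)
#' @return an ompr solution object
solve_allocation <- function(cost, a, b) {
  n <- length(cost)
  m <- length(b)

  model <- MIPModel() |>
    add_variable(x[j], j = 1:n, type = "continuous", lb = 0) |>
    set_objective(sum_over(cost[j] * x[j], j = 1:n), sense = "min") |>
    add_constraint(sum_over(a[i, j] * x[j], j = 1:n) <= b[i], i = 1:m)

  solve_model(model, with_ROI(solver = "glpk"))
}
```

The data handling script then calls solve_allocation() and pulls the results out with get_solution(). The roxygen-style comment block is the kind of docstring I ask for as I go.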
I generally also ask for docstrings as I go, and the generative AI did this for the model as well as the various handling functions. I read the docstrings to make sure they say what I expected them to say. When they did not, since the docstrings were written from the code, I took it to mean that the code was not right (this exposed a mistake in the initial formulation of the LP in ompr). Similarly, I had the generative AI write unit tests for the constraints and a mock problem to test the optimization.
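Those tests looked roughly like the following (a sketch using testthat and a tiny mock LP with a known answer, assuming the hypothetical solve_allocation() helper from the sketch above):

```r
# test-model.R -- unit tests for the constraints and a small mock problem.
library(testthat)
source("model.R")  # hypothetical path; provides solve_allocation()

test_that("the mock LP solves to its known optimum", {
  # Classic two-variable LP (max 3*x1 + 5*x2) written in min form so it
  # fits solve_allocation(): minimize -3*x1 - 5*x2.
  cost <- c(-3, -5)
  a <- rbind(c(1, 0),    #   x1          <= 4
             c(0, 2),    #          2*x2 <= 12
             c(3, 2))    # 3*x1   + 2*x2 <= 18
  b <- c(4, 12, 18)

  result <- solve_allocation(cost, a, b)

  # The exact status label depends on the ompr/ompr.roi version.
  expect_true(solver_status(result) %in% c("optimal", "success"))

  sol   <- get_solution(result, x[j])
  x_opt <- sol$value[order(sol$j)]

  expect_equal(x_opt, c(2, 6), tolerance = 1e-6)        # known optimum
  expect_true(all(a %*% x_opt <= b + 1e-6))             # every constraint holds
  expect_equal(objective_value(result), -36, tolerance = 1e-6)
})
```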
Agile Operations Research
One of the benefits of running the Advent of OR over 24 days is that it rotates topics between the art of modeling, implementing and managing models, and interactions with stakeholders. There are a couple of very important points here. First, interacting with stakeholders is not something that happens at the beginning and end of a project and is ignored in the middle; there needs to be stakeholder engagement throughout the modeling process. A second point that has come out in the conversations on LinkedIn is that the most common cause of project failure across data analytics is communication failure, in particular between the analyst and the end customer. While this can have many causes (including management inserting itself between the analyst and the end customer), as analysts we must have that direct interaction from the beginning of the project (business problem formulation in the INFORMS Analytics Framework).
In the early stages of a project, one factor we have to face is failure of imagination. Our stakeholders often do not know what is possible across the full range of analytics. A problem is often presented as a request for a tool, but for the results of the project to have any value, it has to address the end problem. So business problem formulation has to start with the end problem, determine what kind of information from data can help the decision makers address it, and only then turn to which methods can provide results in a form that will be useful. Currently, because of media hype, the initial request is often for a dashboard, a predictive model, or a generative AI tool. As operations research analysts we can also bring to bear statistics, forecasting, optimization, simulation, and queueing, along with different ways of applying those methods to deliver different kinds of results that help decision makers make better decisions.
After business problem formulation, the next big change in the project occurs when the first minimum viable model is presented to the end user. This is the first model that uses a minimum acceptable subset of the data and covers the most essential aspects of the smallest version of the problem. It matters because, before this point, all conversations are abstract and theoretical. The first time a model with outputs is presented to an end user, the end user starts to imagine how they would use these results in real situations that have happened in the past. They start telling you about all of the considerations they account for, the information they need to gather to make decisions, and who they need to consult and coordinate with. This can change the entire project. From experience, I do not think it matters how much work is done at higher levels to define the project: the first time a model is presented to an end user, the project will change so that the outcomes can be usable to the business. So it is best to make that happen as early as possible, when the change causes the least disruption to the work in progress.
Rapid cycles of iteration, feedback from the customer, and the willingness to accept changes to the project because of that interaction are the hallmarks of agile development methodologies in the software development world. Regular rounds of model iteration, where additional elements are added to the model and stakeholders confirm that the project is on track to produce something useful, keep the work grounded. And just as the software development world has found, this is more likely to lead to a useful product, and to get there faster, than following a rigid path that ends in something irrelevant.
The Advent of OR exercise provided an opportunity to see how a simulated project evolves over days and weeks, including the work with our business partners. Over on LinkedIn, some of the conversation has turned to applying Agile methodologies in analytics, because giving our projects the best chance of success requires regular direct contact between the analyst and the end decision maker to make sure we are building the right product for the needs of the business.
Project failure in software development in general and data analytics in particular is most often caused by communication gaps between the analyst and the end customer. Engagement should not be limited to the project’s start and end; it must be an ongoing dialogue. The alternative, which happens way too often, is a model and product that does not solve the problem faced by the decision maker, and is therefore unused.
The most significant shift in any project occurs when the customer sees the first version of a model with the most important factors included. Before this stage, requirements are often abstract, and stakeholders may suffer from a failure of imagination. Once a customer interacts with concrete outputs, they begin to envision real-world events and identify missing constraints or data needs. Since almost all projects change once they meet reality, it is vital to deliver this first version as early as possible. This leads to higher quality discussions about what the real problem is, what decision makers actually need, what type of analytical results will meet those needs, and which methods can produce them, early enough to adjust the project scope to the new understanding of the real situation and needs. And while projects may come to us as requests for a dashboard, a predictive model, or a generative AI tool, this is the chance to recognize whether a business partner's needs are best met by statistical, forecasting, optimization, simulation, or queueing models. The first product, however minimal, confirms that you are engaged with the right problem and gives the business partner a sense of what is possible.
Software development, even apart from data analysis projects, has observed the same problem: projects where the developers are not in direct and regular communication with the end customer fail. That is why Agile methods were developed. The hallmarks of Agile development -- rapid iterations, frequent communication with stakeholders, and a willingness to change based on discovery and feedback -- are essential for a modeling project. Rather than following a rigid path that risks producing something irrelevant, agile cycles keep the project aligned with business needs. And actually meeting real needs is how we earn relevance to the business and trust with our business partners.
Conclusion
The Advent of OR proved to be a valuable exercise, offering a full-cycle project experience that highlighted two critical modern aspects of Operations Research: the integration of Generative AI and the necessity of an Agile approach. Generative AI demonstrated significant utility in accelerating system setup and basic modeling tasks, freeing up the analyst for higher-level problem-solving. More importantly, the experience reinforced that project success hinges on continuous, direct stakeholder engagement, mirroring the principles of Agile development. By prioritizing early delivery of a Minimum Viable Model, analysts can gain crucial feedback that aligns the project with real business needs, ultimately reducing the risk of communication-based failure and ensuring the final product is relevant and utilized.