“O” represents Big-O notation. Big-O notation is used to articulate how long an algorithm takes to run and to compare the efficiency of different computer processing solutions to a problem. With Big-O notation, an algorithm's runtime is expressed in terms of how quickly it grows relative to the size of the input.
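As a minimal illustrative sketch (the function names are hypothetical, not drawn from the source), the following contrasts an O(n) routine, whose work grows linearly with the input, against an O(n²) routine, whose nested loops examine roughly n×n element pairs:

```python
def linear_search(items, target):
    # O(n): in the worst case, every element is examined exactly once.
    for index, value in enumerate(items):
        if value == target:
            return index
    return -1

def has_duplicate(items):
    # O(n^2): the nested loops compare roughly n*n pairs of elements,
    # so doubling the input size quadruples the worst-case work.
    for i in range(len(items)):
        for j in range(i + 1, len(items)):
            if items[i] == items[j]:
                return True
    return False
```

Doubling the input to `linear_search` doubles its worst-case runtime, while doubling the input to `has_duplicate` quadruples it; Big-O notation captures exactly this difference in growth rate.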
Generally described, an array is an arrangement (e.g., a data structure) of items stored at equally spaced addresses in computer memory. For example, an array data type is used in a computer programming language to specify a variable that can be indexed at runtime. A linked list is a data structure consisting of a group of nodes that together represent a sequence; the memory needed for each node is typically allocated while the computer program is running.
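A minimal Python sketch of the linked-list structure described above (the `Node` class and helper functions are illustrative assumptions, not part of the source): each node is allocated individually at runtime, and reaching the element at position k requires following k links, unlike an array, where equally spaced addresses allow direct indexing.

```python
class Node:
    """One node of a singly linked list; memory for each node is
    allocated individually while the program is running."""
    def __init__(self, value, next_node=None):
        self.value = value
        self.next = next_node

def list_from_values(values):
    # Build the list back-to-front so the node sequence preserves order.
    head = None
    for value in reversed(values):
        head = Node(value, head)
    return head

def nth(head, index):
    # Unlike array indexing, reaching position k requires walking k links.
    node = head
    for _ in range(index):
        node = node.next
    return node.value
```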
Assume a first computing array (or first linked list) is of size N and each element of the first computing array is a second array (or second linked list) of size M. As N and M grow, iterating over the N×M iteration space will quickly exceed the cache size of the computer, driving up the runtime and consuming significant memory. Algorithms with a runtime of O(MN) are particularly problematic for a computer to process in a timely manner.
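The nested structure above can be sketched as follows (a minimal illustration; the function name is an assumption): visiting every element of an N-element outer array whose entries are each M-element inner arrays requires N×M loop iterations, which is the O(MN) cost in question.

```python
def count_all_elements(outer):
    """Visit every element of a nested structure: an outer array of
    size N whose entries are inner arrays of size M, so N*M visits."""
    count = 0
    for inner in outer:      # N iterations over the outer array
        for _ in inner:      # M iterations over each inner array
            count += 1       # one unit of work per element: O(M*N) total
    return count
```

With N = 1,000 and M = 1,000 this is already one million iterations, and the working set can no longer fit in a typical CPU cache, which is why O(MN) algorithms become problematic as the inputs grow.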