Optimisation Is Necessary In Software To Be At The Cutting Edge
Throughout the history of computer software, optimisation has played an integral part in pushing the boundaries of what is possible on the hardware available at the time of a product's release, and in making that product commercially successful. This often means clever algorithms that approximate a value to sufficient accuracy while taking only a fraction of the time an exact computation would need. A famous example is the fast inverse square root in the Quake III Arena engine, used to normalise vectors for lighting: the approximation costs only a handful of cycles, where computing the value exactly from first principles would be far slower.
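The routine behind the Quake example became widely known when id Software released the Quake III Arena source code. The sketch below follows that published idea, using memcpy instead of the original pointer cast to avoid undefined behaviour; the surrounding main is a demonstration added here, not part of the engine.

```c
#include <stdint.h>
#include <string.h>
#include <stdio.h>
#include <math.h>

/* Fast inverse square root, in the style of the routine from the
   Quake III Arena source release: approximate 1/sqrt(x) with bit
   tricks and one Newton-Raphson step, with no division or sqrt call. */
static float q_rsqrt(float number)
{
    float x2 = number * 0.5f;
    float y  = number;
    uint32_t i;

    memcpy(&i, &y, sizeof i);        /* reinterpret the float's bits as an integer */
    i = 0x5f3759df - (i >> 1);       /* "magic" constant gives a good first guess */
    memcpy(&y, &i, sizeof y);        /* back to a float */
    y = y * (1.5f - x2 * y * y);     /* one Newton-Raphson refinement step */
    return y;
}

int main(void)
{
    float x = 9.0f;
    printf("approx: %f  exact: %f\n", q_rsqrt(x), 1.0f / sqrtf(x));
    return 0;
}
```

For x = 9 this prints roughly 0.3330 against an exact value of 0.3333: a relative error well under one percent after a single refinement step, at a fraction of the cost of a division plus a library square root.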
There are many examples of software optimisation in computer graphics. Binary Space Partitioning (BSP) organises a scene so that polygons hidden from the camera can be culled cheaply, reducing the number that must be drawn during real-time 3D rendering. Minimum bounding boxes approximate the region of 3D space that an object occupies, so collision detection can reject non-intersecting pairs with a handful of comparisons instead of an exact per-polygon test. Precomputed lookup tables stand in for mathematical functions that would be too slow to evaluate on the fly, such as trigonometric functions or floating-point exponentials. Beyond graphics, hash functions condense a large object in memory into a small fingerprint that approximates its identity, so two objects can be compared by their hashes rather than bit by bit, and hash tables build on the same idea to locate data quickly. Two of these techniques are sketched at the end of this section.

Many software and hardware advancements require this kind of optimisation to reach the state of the art and to be commercially successful. Staying up to date with the latest research in software engineering, mathematics and computer science therefore plays a vital role in developing cutting-edge software products.
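To make the bounding-box example concrete, here is a minimal sketch assuming axis-aligned boxes stored as their minimum and maximum corners; the struct layout and names are illustrative, not taken from any particular engine.

```c
#include <stdbool.h>
#include <stdio.h>

/* Axis-aligned bounding box: the smallest axis-aligned box that
   encloses an object, stored as its minimum and maximum corners. */
typedef struct {
    float min_x, min_y, min_z;
    float max_x, max_y, max_z;
} AABB;

/* Two boxes overlap exactly when their extents overlap on all three
   axes: six comparisons, versus an expensive exact mesh-vs-mesh test. */
static bool aabb_overlap(const AABB *a, const AABB *b)
{
    return a->min_x <= b->max_x && a->max_x >= b->min_x &&
           a->min_y <= b->max_y && a->max_y >= b->min_y &&
           a->min_z <= b->max_z && a->max_z >= b->min_z;
}

int main(void)
{
    AABB crate  = { 0.0f, 0.0f, 0.0f, 1.0f, 1.0f, 1.0f };
    AABB player = { 0.5f, 0.5f, 0.5f, 1.5f, 1.5f, 2.0f };
    printf("colliding: %s\n", aabb_overlap(&crate, &player) ? "yes" : "no");
    return 0;
}
```

In practice this serves as a broad phase: only the pairs whose boxes overlap are passed on to a precise intersection test, so the cheap comparison filters out the vast majority of object pairs.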
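And here is a minimal sketch of the trigonometric-table idea. The table size, the nearest-entry lookup and the lack of interpolation are simplifications chosen for brevity; real implementations often interpolate between entries or exploit the symmetry of sine to shrink the table.

```c
#include <stdio.h>
#include <math.h>

#define TABLE_SIZE 4096
#define TWO_PI 6.28318530717958647692f

static float sin_table[TABLE_SIZE];

/* Fill the table once at start-up: entry i holds the sine of the
   i-th step around the full circle. */
static void init_sin_table(void)
{
    for (int i = 0; i < TABLE_SIZE; i++)
        sin_table[i] = sinf(TWO_PI * (float)i / TABLE_SIZE);
}

/* Replace a sinf() call with an array lookup: reduce the angle to a
   table index; accuracy is limited by the table's step size. */
static float fast_sin(float radians)
{
    int idx = (int)(radians / TWO_PI * TABLE_SIZE) % TABLE_SIZE;
    if (idx < 0)
        idx += TABLE_SIZE;           /* wrap negative angles into range */
    return sin_table[idx];
}

int main(void)
{
    init_sin_table();
    printf("lookup: %f  libm: %f\n", fast_sin(1.0f), sinf(1.0f));
    return 0;
}
```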