Algorithmic Complexity and Big-O Notation

In computer science and mathematics, big-O notation is used to describe algorithmic complexity: how the computational cost of an algorithm, typically running time or memory, grows with the size of its input.
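
As a minimal sketch of the idea (the function names and input sizes below are illustrative assumptions, not part of this Demonstration), the following Python code counts the comparisons made by a linear search, which grows as O(n), and a binary search over sorted data, which grows as O(log n), as the input size doubles.

    # Minimal sketch: compare operation counts for O(n) vs. O(log n) searches.

    def linear_search(items, target):
        comparisons = 0
        for value in items:
            comparisons += 1
            if value == target:
                break
        return comparisons

    def binary_search(items, target):
        # items must be sorted
        comparisons = 0
        lo, hi = 0, len(items) - 1
        while lo <= hi:
            comparisons += 1
            mid = (lo + hi) // 2
            if items[mid] == target:
                break
            elif items[mid] < target:
                lo = mid + 1
            else:
                hi = mid - 1
        return comparisons

    for n in [1000, 2000, 4000, 8000]:
        data = list(range(n))
        target = n - 1  # worst case for linear search: last element
        print(n, linear_search(data, target), binary_search(data, target))

Doubling n doubles the linear-search count but adds only one comparison to the binary-search count, which is exactly the distinction big-O notation captures.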

Contributed by: Daniel de Souza Carvalho (August 2015)
Open content licensed under CC BY-NC-SA

