Data Structures - Tools to solve problems


In this article, I'm going to show what a Data Structure is and cover some types that help you solve algorithms and coding problems.



Featured on Hashnode

I won't go too deep into every concept, since each data structure mentioned here is a complex topic on its own, but this article will give you an introduction and help you understand the approaches. Think of it as a guide built from my personal notes, and I hope it helps you even more than it helped me.

Data Structure

Focusing on Software Engineering, Data Structures are ways to organize and manipulate data: a collection of data values, the relationships among them, and the operations that can be applied to that data.

Whether you're solving an algorithm exercise or a performance problem in your code, there are a few concepts that help you understand the situation quickly.

Complexity Analysis

Complexity analysis has the function of determining the resources needed to execute/solve a given algorithm.

  • Time Complexity - measures how fast an algorithm or solution runs.
  • Space Complexity - measures how much memory or space an algorithm uses.

When you come across a problem statement, this analysis is one of the first things you should do; after a lot of practice, you will see how it speeds up reaching the final solution!


Memory

Memory is probably one of the hardest things to explain and to understand; it is a deep and complex subject.

When we run some code, memory is responsible for storing the variables, and each type of variable takes up a specific amount of space in memory. For example:

  • Variables - foobar = 1: inside the computer, this line becomes binary, for example 0000 0001. That binary representation has a size in bits and bytes.

To understand more about converting a number or a string to binary, I recommend stepping back to programming fundamentals and reading about it; it helps a lot. In the resources, I'm going to put some links to forums that help with these questions.
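As a quick sketch of the idea above (this example is mine, not from the original annotations), Java lets us inspect the binary representation of a number directly:

```java
public class BinaryDemo {
    public static void main(String[] args) {
        int foobar = 1;
        // the value 1 is stored as the bit pattern ...0001
        System.out.println(Integer.toBinaryString(foobar)); // "1"
        System.out.println(Integer.toBinaryString(5));      // "101"
        // an int in Java always occupies 32 bits (4 bytes)
        System.out.println(Integer.SIZE);                   // 32
    }
}
```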

Big O Notation

When I started studying ways to measure the complexity of algorithms, I kept seeing Big O notation and never understood its real application, but in short, it is quite simple:

It's how we measure complexity precisely in exercises or coding interviews.

In the example below, the O(1) indicates that this simple function has constant complexity: it performs one elementary operation no matter how many elements there are:

  • f1(a) => 1 + a[0] is O(1)

Now in this second example, the sum has to touch every element, so the number of elementary operations is N:

  • f2(a) => sum(a) is O(N)
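The two functions above can be sketched in Java like this (a minimal illustration I added; the names f1 and f2 come from the bullets above):

```java
public class Complexity {
    // O(1): one operation regardless of the array's length
    static int f1(int[] a) {
        return 1 + a[0];
    }

    // O(N): the loop visits every element once
    static int f2(int[] a) {
        int sum = 0;
        for (int x : a) {
            sum += x;
        }
        return sum;
    }

    public static void main(String[] args) {
        int[] a = {3, 4, 5};
        System.out.println(f1(a)); // 4
        System.out.println(f2(a)); // 12
    }
}
```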

There are other notations that describe different execution times:

  • O(1): Constant
  • O(log n): Logarithmic
  • O(n): Linear
  • O(n log n): Linearithmic
  • O(n²), O(n³): Quadratic and Cubic
  • n^O(1): Polynomial
  • 2^O(n): Exponential

In Big O notation, we measure how the running time of an algorithm grows with respect to the size of its input.

Asymptotic Analysis

We can study asymptotic analysis together with Big O notation, because both take the same approach; what changes is the behavior being studied.

In asymptotic analysis, we care about how the algorithm behaves for large values of the input.

Asymptotic analysis studies the behavior of a function as its argument grows without bound: f(n) as n → ∞


Logarithms

Do you know what the expression log(n) means?

The logarithm of a number b in base a is the exponent x to which the base must be raised so that the power a^x equals b, with a and b real positive numbers and a ≠ 1.

In this way, the logarithm is an operation in which we want to discover the exponent that a given base must be raised to in order to produce a certain power, and that exponent is what measures the cost of certain algorithms.

The definition can be exemplified this way:

log_b(x) = y iff (if and only if) b^y = x

In computer science, we almost always assume that the base is two, which is called the binary logarithm. A good tip for solving a logarithm is to ask yourself: two to the power of what equals that number?
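That question ("two to the power of what?") can be answered by counting how many times a number can be halved, which is exactly the quantity behind O(log n) algorithms like binary search. A small sketch I added to illustrate it:

```java
public class Log2 {
    // counts how many times n can be halved before reaching 1;
    // this is floor(log2(n)), the number behind O(log n) running times
    static int log2(int n) {
        int steps = 0;
        while (n > 1) {
            n = n / 2;
            steps++;
        }
        return steps;
    }

    public static void main(String[] args) {
        System.out.println(log2(8));    // 3, because 2^3 = 8
        System.out.println(log2(1024)); // 10, because 2^10 = 1024
    }
}
```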


Arrays

Honestly, arrays are one of the subjects that I most like to talk about and study :p Arrays are present in a lot of algorithm exercises and coding interviews, so understanding how arrays work under the hood is very useful.

Basically, an array is a collection of values such as [3,4,5,6], and the best-known operations on arrays are push and pop. But how is all of this stored in memory?

Imagine the computer's memory divided into small slots: all the numbers inside the array are converted to binary and saved into these slots, each binary value occupying its own space.

int array[2] = {3, 5}

In this line of code, we have a static array (I will go deeper into static arrays later) that holds whole numbers. The language compiler sees that we need to create an array of integers and has to ask for memory, but how much memory will be needed? If we check the size of one integer with sizeof(array[0]) and get 2 bytes, then to store our array we need 2 * 2 = 4 bytes. The computer's memory, as stated in the previous topics, is organized in blocks, each block storing 8 bits and having its own index.

Static and Dynamic arrays

Static Array: An array is static when I need to declare its length up front. An example of this is the memory slots described above: I have to specify the length, and this length never changes.

Dynamic Array: A dynamic array is one that can change in size; vectors and array lists are examples.

It's an array that, under the hood, allows you to have faster insertions at the end of the array.

It does this by allocating double the space that is needed: [3,2,1 _ _ _]
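The doubling trick above can be sketched like this (a minimal version I wrote for illustration; real array lists work the same way):

```java
import java.util.Arrays;

public class DynamicArray {
    private int[] items = new int[2]; // start with capacity 2
    private int size = 0;

    // amortized O(1) append: when the array is full, double its capacity
    void push(int value) {
        if (size == items.length) {
            items = Arrays.copyOf(items, items.length * 2);
        }
        items[size++] = value;
    }

    int capacity() { return items.length; }
    int size() { return size; }

    public static void main(String[] args) {
        DynamicArray a = new DynamicArray();
        for (int i = 0; i < 5; i++) a.push(i);
        System.out.println(a.size());     // 5
        System.out.println(a.capacity()); // 8 (grew 2 -> 4 -> 8)
    }
}
```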

Linked Lists

Linked lists are very similar to arrays. The difference between them is how they are implemented, or rather how they are laid out in memory.

Let's see the difference:

I want to store an array of size 4:

[5, 6, 7, 8]

In a linked list, each integer takes up extra memory slots, because besides the value, each element also stores a pointer to the next one:

5 --> 6 --> 7 --> 8

A linked list stores each element in its own piece of memory and connects the elements using pointers (a memory slot that points to another one).
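The pointer wiring described above can be sketched in Java like this (a small example I added; Node is a hypothetical name, not from the article):

```java
public class LinkedListDemo {
    // each node stores a value plus a pointer to the next node in memory
    static class Node {
        int value;
        Node next;
        Node(int value) { this.value = value; }
    }

    public static void main(String[] args) {
        // build 5 -> 6 -> 7 -> 8 by wiring the pointers one by one
        Node head = new Node(5);
        head.next = new Node(6);
        head.next.next = new Node(7);
        head.next.next.next = new Node(8);

        // traversal must follow the pointers, so access is O(n)
        StringBuilder sb = new StringBuilder();
        for (Node n = head; n != null; n = n.next) {
            sb.append(n.value).append(" ");
        }
        System.out.println(sb.toString().trim()); // 5 6 7 8
    }
}
```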

Hash Tables

Hash tables are data structures that store pairs of keys and values.

Hash: a function that transforms a key into an index. In the case below, a string is turned into a number (for example, from its character codes), and that number modulo the array length gives the slot where the value lives:

"foo" => hash("foo") % length => 1

Basically, the goal of a hash table is, given a simple key, to do a quick lookup and get the desired value. It is sometimes translated as a dispersion table.
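In Java, this structure is available out of the box as java.util.HashMap; a quick usage example I added (the keys and values here are just illustrative):

```java
import java.util.HashMap;

public class HashTableDemo {
    public static void main(String[] args) {
        // java.util.HashMap is Java's built-in hash table
        HashMap<String, Integer> table = new HashMap<>();
        table.put("foo", 1);
        table.put("bar", 2);

        // lookup by key is O(1) on average: the key is hashed to an index
        System.out.println(table.get("foo"));         // 1
        System.out.println(table.containsKey("baz")); // false
    }
}
```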

Stacks and Queues

Let's see the concept and the code separately to make it a little clearer:

Stack: like books on a table, last in, first out, or a LIFO data structure. Implemented in code, it looks like:

class Example {
    private java.util.ArrayList<Integer> stack = new java.util.ArrayList<>();

    // adding a number on the stack
    public void push(int number) {
        stack.add(number);
    }

    // internal methods that we can use when dealing with a stack
    public int peek() {
        return stack.get(stack.size() - 1);
    }

    public int pop() {
        return stack.remove(stack.size() - 1);
    }
}

In the image below, it's easy to see how the stack operations work:

stack-operations.png
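For everyday use, Java already ships a stack-capable structure, java.util.ArrayDeque; a short usage sketch I added:

```java
import java.util.ArrayDeque;

public class StackDemo {
    public static void main(String[] args) {
        // ArrayDeque works as a LIFO stack via push/pop/peek
        ArrayDeque<Integer> stack = new ArrayDeque<>();
        stack.push(1);
        stack.push(2);
        stack.push(3);

        System.out.println(stack.peek()); // 3, the last element pushed
        System.out.println(stack.pop());  // 3, removed in LIFO order
        System.out.println(stack.pop());  // 2
    }
}
```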

Queue: the opposite of the stack. We can imagine people waiting for tickets: the first person who bought a ticket is the first one out, or FIFO.

There is a nice example from Programiz that provides a clear implementation of a Queue, which makes it easy to visualize how to use it in the real world:

public class Queue {
  int SIZE = 5;
  int items[] = new int[SIZE];
  int front, rear;

  Queue() {
    front = -1;
    rear = -1;
  }

  // check if the queue is full
  boolean isFull() {
    if (front == 0 && rear == SIZE - 1) {
      return true;
    }
    return false;
  }

  // check if the queue is empty
  boolean isEmpty() {
    if (front == -1)
      return true;
    return false;
  }

  // insert elements to the queue
  void enQueue(int element) {

    // if queue is full
    if (isFull()) {
      System.out.println("Queue is full");
    } else {
      if (front == -1) {
        // mark front to denote the first element of the queue
        front = 0;
      }
      rear++;
      // insert element at the rear
      items[rear] = element;
      System.out.println("Insert " + element);
    }
  }

  // delete element from the queue
  int deQueue() {
    int element;

    // if queue is empty
    if (isEmpty()) {
      System.out.println("Queue is empty");
      return (-1);
    } else {
      // remove element from the front of queue
      element = items[front];

      // if the queue has only one element
      if (front >= rear) {
        front = -1;
        rear = -1;
      } else {
        // mark the next element as the front
        front++;
      }
      System.out.println(element + " Deleted");
      return (element);
    }
  }

  // display elements of the queue
  void display() {
    int i;
    if (isEmpty()) {
      System.out.println("Empty Queue");
    } else {
      // display the front of the queue
      System.out.println("\nFront index-> " + front);

      // display the elements of the queue
      System.out.println("Items -> ");
      for (i = front; i <= rear; i++)
        System.out.print(items[i] + "  ");

      // display the rear of the queue
      System.out.println("\nRear index-> " + rear);
    }
  }

  public static void main(String[] args) {

    // create an object of Queue class
    Queue q = new Queue();

    // try to delete element from the queue
    // currently queue is empty
    // so deletion is not possible
    q.deQueue();

    // insert elements to the queue
    for (int i = 1; i < 6; i++) {
      q.enQueue(i);
    }

    // 6th element can't be added to queue because the queue is full
    q.enQueue(6);

    q.display();

    // deQueue removes element entered first i.e. 1
    q.deQueue();

    // Now we have just 4 elements
    q.display();
  }
}

See the image below to understand the queue concept:

queue.png


There are more data structures and sorting algorithms, like Strings, Graphs, Insertion Sort, Bubble Sort, Trees... It's a big list :P

In this article, I wrote about some of the data structures that I like a lot. I didn't include all of them, so as not to stretch the subject too much, but if you found this article helpful, let me know in the comments and I can write about the other data structures too.

Thank you for reading until the end!!