memcpy() causes a segmentation fault after too many iterations

I am trying to create a multithreading library in C. Here is the link to the whole project (pasting all of the code here would be too much text).

In the file tests/MultithreadingTests.c I am testing the functionality of lib/systems/multithreading/src/ThreadPool.c. The function add_work adds any routine function to the work queue, which uses the functionality of lib/sds/lists/src/Queue.c and lib/sds/lists/src/LinkedList.c. In MultithreadingTests.c, NUM_TESTS defines the number of jobs I add to the work queue, to be performed by NUM_THREADS threads.

I am facing a weird issue with the code. If NUM_TESTS is any number less than 349,261, the code works perfectly fine, but any number greater than or equal to 349,261 results in a segmentation fault. I tried to pin down exactly where the segmentation fault happens and found that it occurs in lib/sds/lists/src/Node.c at line 29, at memcpy(node->data, data, size);

The flow of code leading to the error is:

  • tests/MultiThreadingTests.c line 95 at pool->add_work(pool, new_thread_job(routine, &arguments[i]));
  • lib/systems/multithreading/src/ThreadPool.c line 150 thread_pool->work.push(&thread_pool->work, &job, sizeof(job));
  • lib/sds/lists/src/Queue.c line 54 return q->list.insert(&q->list, q->list.length, data, size);
  • lib/sds/lists/src/LinkedLists.c line 107 Node *node_to_insert = new_node(data, size);
  • lib/sds/lists/src/Node.c line 29 memcpy(node->data, data, size);
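Since the crash appears only past a fixed job count, every queued job heap-allocates a node plus a copy of the job data, and an unchecked malloc that returns NULL would make exactly that memcpy line fault. Below is a hypothetical reconstruction of new_node() — the Node struct layout and field names are my assumptions, not the actual project code — with the NULL checks added that would turn the segfault into a reportable failure:

```c
#include <stdlib.h>
#include <string.h>

/* Assumed node layout: owned copy of the caller's data plus a next pointer. */
typedef struct Node {
    void *data;
    size_t size;
    struct Node *next;
} Node;

Node *new_node(const void *data, size_t size) {
    Node *node = malloc(sizeof(Node));
    if (node == NULL)             /* out of memory: report instead of crashing */
        return NULL;

    node->data = malloc(size);
    if (node->data == NULL) {     /* without this check, memcpy writes to NULL */
        free(node);
        return NULL;
    }

    memcpy(node->data, data, size);  /* the line that faults in your trace */
    node->size = size;
    node->next = NULL;
    return node;
}
```

If the real new_node() skips either check, the callers up the chain (insert, push, add_work) would also need to propagate the NULL so the test loop can stop gracefully.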

I am not sure why this issue happens only when the number of jobs is greater than or equal to 349,261 but not when it is smaller.
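One thing worth knowing when debugging a sharp threshold like this: malloc() does not crash when memory runs out, it returns NULL, and the very next write through that pointer segfaults. The sketch below (Linux-specific, uses setrlimit to cap the process's address space; none of it is from the project above) shows that allocations succeed a fixed, repeatable number of times and then start failing, which matches the "works below N, crashes at N" symptom:

```c
#include <stdio.h>
#include <stdlib.h>
#include <string.h>
#include <sys/resource.h>

/* Count how many 1 KiB allocations succeed under a capped address space.
   Demonstrates that malloc() returns NULL at a repeatable threshold. */
size_t allocations_until_failure(size_t limit_bytes) {
    struct rlimit lim = { limit_bytes, limit_bytes };
    setrlimit(RLIMIT_AS, &lim);     /* cap this process's virtual memory */

    size_t count = 0;
    void *p;
    while ((p = malloc(1024)) != NULL) {
        memset(p, 0, 1024);         /* touch the pages so they are committed */
        count++;
    }
    /* Allocations are deliberately leaked; this is a one-shot diagnostic. */
    return count;
}
```

Running the failing test under a lowered `ulimit -v`, or under valgrind, should confirm whether the 349,261-job threshold is simply the point where the queue's per-job allocations exhaust available memory.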



Sources

This article follows the attribution requirements of Stack Overflow and is licensed under CC BY-SA 3.0.

Source: Stack Overflow