FreeRTOS Support Archive
The FreeRTOS support forum is used to obtain active support directly from Real
Time Engineers Ltd. In return for using our top quality software and services for
free, we request you play fair and do your bit to help others too! Sign up
to receive notifications of new support topics then help where you can.
This is a read only archive of threads posted to the FreeRTOS support forum.
The archive is updated every week, so will not always contain the very latest posts.
Use these archive pages to search previous posts. Use the Live FreeRTOS Forum
link to reply to a post, or start a new support thread.
[FreeRTOS Home] [Live FreeRTOS Forum] [FAQ] [Archive Top] [January 2012 Threads]
Hi, my current project needs to use more than 255 kB of heap. That heap is located in my external SRAM. To use it I have to set the task stack size of the RTOS to that huge value. Now I want to know whether FreeRTOS has problems with huge task stacks, or whether there is any limit on task stack size. Or is it just my external memory that limits me?
Daniel
“Hi, my current project needs to use more than 255 kB of heap. That heap is located in my external SRAM. To use it I have to set the task stack size of the RTOS to that huge value.”
Why? Task stacks are allocated from the FreeRTOS heap, so if the 255 kB heap in the external RAM is being used as the FreeRTOS heap (that is, if pvPortMalloc() allocates RAM from there), then creating a task with a stack that size will consume the entire heap, and further calls to pvPortMalloc() will fail. Normally heap usage is separate from stack usage. Maybe you are right, but your post gives no hint as to why you need to use the entire heap as a single task stack, leaving no heap free.
“Now I want to know whether FreeRTOS has problems with huge task stacks, or whether there is any limit on task stack size.”
The stack size parameter of xTaskCreate() is an unsigned short, so 0xffff is the largest value that can be passed. Note that the parameter is specified in words, not bytes, so on a 32-bit port that corresponds to just over 255 kB. If you want to make the stack bigger than that, change the data type of the function parameter.
Copyright (C) Amazon Web Services, Inc. or its affiliates. All rights reserved.