Memory usage


Tiago Katcipis
I'm working on a streaming server and I need to have several pipelines running at the same time, so I started some tests to see how much memory and how many threads something like that would use with GStreamer. I wrote a very simple application that starts a lot of pipelines, where each pipeline is simply fakesrc ! fakesink. When I ran the tests I realized that even for a task as simple as running fakesrc ! fakesink I am using a lot of memory.

with 1 pipe:

Tasks:   2 total,   1 running,   1 sleeping,   0 stopped,   0 zombie
Cpu(s): 51.5%us,  1.0%sy,  0.0%ni, 47.5%id,  0.0%wa,  0.0%hi,  0.0%si,  0.0%st
Mem:   2027232k total,   963800k used,  1063432k free,    75976k buffers
Swap:   995988k total,        0k used,   995988k free,   430676k cached

  PID USER      PR  NI  VIRT  RES  SHR S %CPU %MEM    TIME+  COMMAND                                                                                        
 6402 katcipis  20   0 26204 4060 2548 R 99.9  0.2   0:48.04 multiple_pipes                                                                                 
 6400 katcipis  20   0 26204 4060 2548 S  0.0  0.2   0:00.04 multiple_pipes 

with 2 pipes:

Tasks:   3 total,   2 running,   1 sleeping,   0 stopped,   0 zombie
Cpu(s): 96.8%us,  1.8%sy,  0.0%ni,  1.3%id,  0.0%wa,  0.0%hi,  0.0%si,  0.0%st
Mem:   2027232k total,   965044k used,  1062188k free,    76804k buffers
Swap:   995988k total,        0k used,   995988k free,   430752k cached

  PID USER      PR  NI  VIRT  RES  SHR S %CPU %MEM    TIME+  COMMAND                                                                                        
 6541 katcipis  20   0 42600 4092 2544 R 95.3  0.2   0:16.22 multiple_pipes                                                                                 
 6542 katcipis  20   0 42600 4092 2544 R 95.3  0.2   0:16.36 multiple_pipes                                                                                 
 6539 katcipis  20   0 42600 4092 2544 S  0.0  0.2   0:00.04 multiple_pipes

with 3 pipes:

Tasks:   4 total,   2 running,   2 sleeping,   0 stopped,   0 zombie
Cpu(s): 90.6%us,  3.1%sy,  0.0%ni,  6.2%id,  0.0%wa,  0.0%hi,  0.0%si,  0.0%st
Mem:   2027232k total,   964608k used,  1062624k free,    77528k buffers
Swap:   995988k total,        0k used,   995988k free,   430752k cached

  PID USER      PR  NI  VIRT  RES  SHR S %CPU %MEM    TIME+  COMMAND                                                                                        
 6563 katcipis  20   0 51820 4116 2544 S 62.5  0.2   0:17.44 multiple_pipes                                                                                 
 6566 katcipis  20   0 51820 4116 2544 R 62.5  0.2   0:16.36 multiple_pipes                                                                                 
 6565 katcipis  20   0 51820 4116 2544 R 50.0  0.2   0:13.94 multiple_pipes                                                                                 
 6561 katcipis  20   0 51820 4116 2544 S  0.0  0.2   0:00.04 multiple_pipes


I have another test where I run a gnomevfssrc and a decodebin, and the numbers are even worse:

with 1 pipe:

Tasks:   2 total,   0 running,   2 sleeping,   0 stopped,   0 zombie
Cpu(s):  8.1%us,  2.7%sy,  0.0%ni, 89.2%id,  0.0%wa,  0.0%hi,  0.0%si,  0.0%st
Mem:   2027232k total,   972104k used,  1055128k free,    78080k buffers
Swap:   995988k total,        0k used,   995988k free,   430760k cached

  PID USER      PR  NI  VIRT  RES  SHR S %CPU %MEM    TIME+  COMMAND                                                                                        
 6603 katcipis  20   0 43724  12m 5212 S  0.0  0.6   0:00.10 multiple_pipes                                                                                 
 6605 katcipis  20   0 43724  12m 5212 S  0.0  0.6   0:00.12 multiple_pipes

with 2 pipes:

Tasks:   3 total,   0 running,   3 sleeping,   0 stopped,   0 zombie
Cpu(s): 15.7%us,  4.3%sy,  0.0%ni, 79.8%id,  0.0%wa,  0.2%hi,  0.0%si,  0.0%st
Mem:   2027232k total,   972604k used,  1054628k free,    78548k buffers
Swap:   995988k total,        0k used,   995988k free,   430760k cached

  PID USER      PR  NI  VIRT  RES  SHR S %CPU %MEM    TIME+  COMMAND                                                                                        
 6622 katcipis  20   0 53348  13m 5208 S  0.0  0.7   0:00.08 multiple_pipes                                                                                 
 6624 katcipis  20   0 53348  13m 5208 S  0.0  0.7   0:00.10 multiple_pipes                                                                                 
 6625 katcipis  20   0 53348  13m 5208 S  0.0  0.7   0:00.04 multiple_pipes

with 3 pipes:

Tasks:   4 total,   0 running,   4 sleeping,   0 stopped,   0 zombie
Cpu(s):  6.8%us,  4.5%sy,  0.0%ni, 88.6%id,  0.0%wa,  0.0%hi,  0.0%si,  0.0%st
Mem:   2027232k total,   974368k used,  1052864k free,    79328k buffers
Swap:   995988k total,        0k used,   995988k free,   430760k cached

  PID USER      PR  NI  VIRT  RES  SHR S %CPU %MEM    TIME+  COMMAND                                                                                        
 6960 katcipis  20   0 62136  13m 5208 S  9.6  0.7   0:00.04 multiple_pipes                                                                                 
 6957 katcipis  20   0 62136  13m 5208 S  0.0  0.7   0:00.10 multiple_pipes                                                                                 
 6959 katcipis  20   0 62136  13m 5208 S  0.0  0.7   0:00.08 multiple_pipes                                                                                 
 6962 katcipis  20   0 62136  13m 5208 S  0.0  0.7   0:00.02 multiple_pipes


I also realized that there is a thread for every stream. Is there some way to use one thread for multiple pipelines (or multiple streams)? Did I do something terribly wrong, or is this memory usage normal? If it is normal, can I configure GStreamer to use less memory? Considering that GStreamer runs even on embedded systems, I suppose I am doing something very wrong, or there is some way to use fewer resources.

I'm attaching the source code of my tests: the simple one using fakesrc and the complete one using gnomevfs and decodebin.

best regards,
Katcipis



------------------------------------------------------------------------------
Let Crystal Reports handle the reporting - Free Crystal Reports 2008 30-Day
trial. Simplify your report design, integration and deployment - and focus on
what you do best, core application coding. Discover what's new with
Crystal Reports now.  http://p.sf.net/sfu/bobj-july
_______________________________________________
gstreamer-devel mailing list
[hidden email]
https://lists.sourceforge.net/lists/listinfo/gstreamer-devel
Re: Memory usage

Tiago Katcipis
Sorry, I forgot the source code :-).

On Wed, Aug 19, 2009 at 10:41 AM, Tiago Katcipis <[hidden email]> wrote:
--
"it might be a profitable thing to learn Java, but it has no intellectual value whatsoever" Alexander Stepanov


multiple_pipes.c (8K) Download Attachment
multiple_pipes_simple.c (3K) Download Attachment
Re: Memory usage

Tiago Katcipis
I did some research and found out that the problem was the default thread stack size on Linux: on CentOS with kernel 2.6.28 the default stack size for each thread is 10 MB (on Ubuntu too, though I don't remember the exact kernel version now). GLib uses this default size to create the threads used by its thread pools (GStreamer uses g_thread_pool_new).

One way to fix this is ulimit -s [new_stack_size]. I tested it, and even with a 64 kB stack size it seems to work fine, and the memory usage drops significantly :-).
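For instance (a sketch; the 512 kB value is illustrative), the limit can be lowered in a subshell so the parent shell keeps its original limit, and anything launched from that subshell inherits the smaller per-thread default:

```shell
# Show the current soft stack limit (in kB).
ulimit -s

# Lower it only inside a subshell; the parent shell is unaffected.
(
  ulimit -s 512
  ulimit -s
  # ./multiple_pipes launched here would inherit the 512 kB default
)
```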

Isn't there a minimum thread stack size required by GStreamer to run safely? Or is there no danger in setting the stack size as I wish?

And I am still interested in some way of running multiple pipelines in the same thread, if that is possible.

best regards,
Katcipis

On Wed, Aug 19, 2009 at 10:43 AM, Tiago Katcipis <[hidden email]> wrote:
