Discussion:
[libtorrent] how to reduce memory consumption
linxs
2017-06-20 03:00:35 UTC
Permalink
System and software: ARM Linux, which has only 1 GB of memory, libtorrent-1_1_3

I run client_test 3 times to download 3 torrent files, each in a separate process.
The 3 client_test processes consume almost all of the 1 GB of memory, with only about 25 MB of free memory left.
How can I reduce memory consumption and keep about 100 MB of memory free?

Thanks!
linxs
2017-06-20 03:25:59 UTC
Permalink
I run 3 client_test processes to download the 3 torrent files.
Whether client_test is configured in high_performance mode or min_memory mode, almost all of the 1 GB of memory is consumed.
And it seems that the memory is not released until client_test exits.
Post by linxs
System and software: ARM Linux, which has only 1 GB of memory, libtorrent-1_1_3
I run client_test 3 times to download 3 torrent files, each in a separate process.
The 3 client_test processes consume almost all of the 1 GB of memory, with only about 25 MB of free memory left.
How can I reduce memory consumption and keep about 100 MB of memory free?
Thanks!
------------------------------------------------------------------------------
Check out the vibrant tech community on one of the world's most
engaging tech sites, Slashdot.org! http://sdm.link/slashdot
_______________________________________________
Libtorrent-discuss mailing list
https://lists.sourceforge.net/lists/listinfo/libtorrent-discuss
Calum Lind
2017-06-20 09:32:23 UTC
Permalink
Did you read the tuning guide? http://www.libtorrent.org/tuning.html
Post by linxs
I run 3 client_test processes to download the 3 torrent files.
Whether client_test is configured in high_performance mode or min_memory mode, almost all of the 1 GB of memory is consumed.
And it seems that the memory is not released until client_test exits.
Post by linxs
System and software: ARM Linux, which has only 1 GB of memory, libtorrent-1_1_3
I run client_test 3 times to download 3 torrent files, each in a separate process.
The 3 client_test processes consume almost all of the 1 GB of memory, with only about 25 MB of free memory left.
How can I reduce memory consumption and keep about 100 MB of memory free?
Thanks!
Arvid Norberg
2017-06-28 02:26:31 UTC
Permalink
Post by linxs
I run 3 client_test processes to download the 3 torrent files.
Whether client_test is configured in high_performance mode or min_memory mode, almost all of the 1 GB of memory is consumed.
And it seems that the memory is not released until client_test exits.
Do you know what kind of memory is being consumed?

If it is anonymous memory (i.e. heap allocations within the process), it's
most likely a bug, and it can be tracked down with a heap profiler. If you
suspect this is the case, please do so and post the results back. I think
this is unlikely, though.

What I think is more likely is that the memory is part of the page cache.
This could happen, for instance, because you're downloading from a network
that's faster than the drive you're saving the files to. In my experience
with some versions of Linux, this may result in the kernel allocating new
dirty pages, backed by a slow device, until it runs out of memory, causing
the system to more or less grind to a halt. The dirty pages are flushed in
the background, but downloading creates new ones at a higher rate.

Anyway, you may want to experiment with setting file_pool_size to 1, to
force files to be closed more often.
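A configuration sketch of that suggestion, assuming the settings_pack API
as it exists in libtorrent 1.1 (it needs the libtorrent headers and library
to build; it is not a complete client):

```cpp
#include <libtorrent/session.hpp>
#include <libtorrent/settings_pack.hpp>

namespace lt = libtorrent;

int main() {
    lt::settings_pack pack;
    // Keep at most one file handle open in the pool, so libtorrent
    // closes files (and the kernel can drop their cached pages) more
    // often, at some cost in open()/close() overhead.
    pack.set_int(lt::settings_pack::file_pool_size, 1);

    lt::session ses(pack);
    // ... add torrents and run as usual ...
}
```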

You may also experiment with including fdatasync() calls after writes, to
see what happens.

But fundamentally, you'll have to do some more digging. Also, you would
likely use less memory by running a single process instead of three.
--
Arvid Norberg