[OmniOS-discuss] ARC, L2ARC and metadata questions
Henrik Johansson
henrikj at henkis.net
Sun Oct 26 10:03:09 UTC 2014
Hi,
I have a random read-only workload against 50M files, which are just over 400KB each. I have built the dataset on a set of mirrored devices (10 mirrors) for best performance, but I am looking into using a larger ARC or an L2ARC to improve performance further. I do realise that the dataset is way too large to cache, but caching the metadata improves read performance and latency.
In a test on a subset, accessing 1M files with primarycache=metadata set on the dataset, performance did improve, but it used 12-18GB of RAM in the ARC (I will run it again when possible; I forgot the exact number). Anyway, it was far too much RAM to be realistic for us to put in each node just for metadata caching.
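A back-of-the-envelope extrapolation of that test result, assuming the ARC metadata footprint scales roughly linearly with file count (an assumption, not something I have verified):

```python
# Extrapolate the observed per-file ARC metadata cost to the full dataset.
# Assumption: metadata footprint scales roughly linearly with file count.

files_tested = 1_000_000           # files accessed in the test
arc_low, arc_high = 12e9, 18e9     # observed ARC usage in bytes (12-18 GB)

per_file_low = arc_low / files_tested    # 12 KB of metadata per file
per_file_high = arc_high / files_tested  # 18 KB of metadata per file

total_files = 50_000_000
full_low = per_file_low * total_files    # ~600 GB for the whole dataset
full_high = per_file_high * total_files  # ~900 GB for the whole dataset

print(f"per file: {per_file_low/1e3:.0f}-{per_file_high/1e3:.0f} KB")
print(f"all 50M files: {full_low/1e9:.0f}-{full_high/1e9:.0f} GB")
```

That is, 12-18 KB per file implies roughly 600-900 GB of ARC for the full 50M files, which is why caching all metadata in RAM is not realistic for us.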
I have three questions I need help with:
1. How do I calculate the required ARC metadata size for a dataset? Is it x bytes per block, with file size / block size giving the number of blocks?
2. If I were to use an L2ARC for metadata, do I need file size / block size * no. of files * 200 bytes of RAM ( 400 / 128 * 50M * 200 = ~ 31GB? ) to address the larger L2ARC?
3. I have disabled atime and prefetch, since prefetch had a 100% miss rate. Is there anything else we could do to improve performance?
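Questions 1 and 2 can be sketched numerically. Note that the ~200 bytes of RAM per L2ARC-cached block is the figure assumed in the question, not one I have verified against the illumos source, and the last (partial) record of each file still costs a full header, so rounding up per file gives an upper bound:

```python
import math

file_size = 400 * 1024        # ~400 KB per file
recordsize = 128 * 1024       # default ZFS recordsize
n_files = 50_000_000
l2_hdr = 200                  # ASSUMED bytes of RAM per L2ARC-cached block

# Q1: number of data blocks per file
blocks_exact = file_size / recordsize             # 3.125 blocks/file
blocks_whole = math.ceil(file_size / recordsize)  # 4 blocks/file (partial tail block)

# Q2: RAM needed to index an L2ARC holding every data block
ram_exact = blocks_exact * n_files * l2_hdr   # lower bound, fractional blocks
ram_whole = blocks_whole * n_files * l2_hdr   # upper bound, whole blocks

print(f"{ram_exact/1e9:.2f} GB - {ram_whole/1e9:.2f} GB")
```

So the formula in question 2 gives roughly 31 GB as a lower bound, or about 40 GB if each file's partial last block is counted as a whole one.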
Thanks
Henrik