Does more RAM really help?

  • Thread starter: Fazal
  • Start date: Sun, 30 Sep 2007

The reason I ask is that about 3-4 years ago I was listening to Scott Mueller
(a popular computer technician) explaining how RAM works. He said that the L1
cache runs at about the same speed as the microprocessor, and Intel claims it
has about a 9/10 cache hit ratio. If a cache miss occurs, the L2 cache also
has about a 9/10 hit ratio. Only if a miss occurs in both the L1 and L2 caches
is physical RAM accessed (which is, of course, a slower process). He therefore
reasoned that adding more RAM will only improve your performance about 1% of
the time. And this is why RAM speeds today are so much slower than CPU speeds;
they simply have no reason to keep up.
I was just wondering how sound this argument is and whether it is true.

Thank you
 
"Fazal" <Fazal@discussions.microsoft.com> wrote in message
news:683727EA-947A-43FD-B3A5-90CF97C0C5DE@microsoft.com...
> The reason I ask is that about 3-4 years ago I was listening to Scott
> Mueller (a popular computer technician) explaining how RAM works. He said
> that the L1 cache runs at about the same speed as the microprocessor, and
> Intel claims it has about a 9/10 cache hit ratio. If a cache miss occurs,
> the L2 cache also has about a 9/10 hit ratio. Only if a miss occurs in both
> the L1 and L2 caches is physical RAM accessed (which is, of course, a
> slower process). He therefore reasoned that adding more RAM will only
> improve your performance about 1% of the time. And this is why RAM speeds
> today are so much slower than CPU speeds; they simply have no reason to
> keep up.
> I was just wondering how sound this argument is and whether it is true.
>
> Thank you


What was missed is the fact that most applications need far more memory than
is available in the L1/L2 caches, so a fair amount of RAM is, of course,
still needed.
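
To put rough numbers on the hit ratios quoted above, here is a minimal Python
sketch. The 9/10 hit ratios are the figures from the original post; the
latencies are assumed, illustrative values, not measurements from any real
machine:

# Back-of-the-envelope average memory access time (AMAT).
# Hit ratios are the 9/10 figures quoted in the original post;
# the latencies are illustrative assumptions, not measured values.

L1_HIT_RATE = 0.90          # fraction of accesses satisfied by L1
L2_HIT_RATE = 0.90          # fraction of L1 misses satisfied by L2
L1_LATENCY_NS = 1.0         # assumed L1 access time
L2_LATENCY_NS = 5.0         # assumed L2 access time
RAM_LATENCY_NS = 60.0       # assumed main-memory access time

# Fraction of all accesses served by each level.
f_l1 = L1_HIT_RATE                                  # 0.90
f_l2 = (1 - L1_HIT_RATE) * L2_HIT_RATE              # 0.09
f_ram = (1 - L1_HIT_RATE) * (1 - L2_HIT_RATE)       # 0.01

amat = f_l1 * L1_LATENCY_NS + f_l2 * L2_LATENCY_NS + f_ram * RAM_LATENCY_NS

print(f"L1: {f_l1:.0%}, L2: {f_l2:.0%}, RAM: {f_ram:.0%} of accesses")
print(f"Average access time: {amat:.2f} ns "
      f"(RAM contributes {f_ram * RAM_LATENCY_NS / amat:.0%} of it)")

Even though only about 1% of accesses reach main memory, under these assumed
latencies they still account for roughly a third of the average access time.
Note that nothing in this arithmetic depends on how *much* RAM is installed;
capacity and speed are separate questions.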
 
"Fazal" <Fazal@discussions.microsoft.com> wrote in message
news:683727EA-947A-43FD-B3A5-90CF97C0C5DE@microsoft.com...
> The reason I ask is that about 3-4 years ago I was listening to Scott
> Mueller (a popular computer technician) explaining how RAM works. He said
> that the L1 cache runs at about the same speed as the microprocessor, and
> Intel claims it has about a 9/10 cache hit ratio. If a cache miss occurs,
> the L2 cache also has about a 9/10 hit ratio. Only if a miss occurs in both
> the L1 and L2 caches is physical RAM accessed (which is, of course, a
> slower process). He therefore reasoned that adding more RAM will only
> improve your performance about 1% of the time. And this is why RAM speeds
> today are so much slower than CPU speeds; they simply have no reason to
> keep up.
> I was just wondering how sound this argument is and whether it is true.
>
> Thank you


I suspect that you have misunderstood Scott Mueller. You seem to be
comparing ham-hocks to beans; they are two completely different foods that,
taken together, perform an olfactory feat that neither food can obtain
independently of the other.

*CACHE* memory [L1 and, if a processor has it, L2] serves an entirely
different function than *Random Access Memory* (RAM) serves.

CACHE memory exists to store snippets of "action" code (VERB) that the
processor will soon use, or that will be reused multiple times in a sequence
of calculations.
RAM stores not just snippets of code, but entire programs of code (both NOUN
and VERB).

Think of yourself as a "computer", think of RAM as your office, and think of
CACHE memory as a workspace where you use each of the individual tools within
your office. From your writing class you have 100 vocabulary words that you
must discover definitions for. From your math class you have 100 equations
upon which you must perform a Laplace Transform.
You (the processor) decide to do your vocabulary first, so you "pre-fetch" a
dictionary and place it in the CACHE (your desk). With your tool placed in
CACHE, you do not need to get up from your desk and retrieve a dictionary
100 times for 100 words -- you use, 100 times, the dictionary that is placed
in your CACHE.
Having completed your vocabulary assignment, you discard the dictionary from
CACHE and replace it with a calculator -- you use, 100 times, the
calculator.

Steve
 
On Sun, 30 Sep 2007 15:20:00 -0700, Fazal
<Fazal@discussions.microsoft.com> wrote:

>The reason I ask is that about 3-4 years ago I was listening to Scott
>Mueller (a popular computer technician) explaining how RAM works. He said
>that the L1 cache runs at about the same speed as the microprocessor, and
>Intel claims it has about a 9/10 cache hit ratio. If a cache miss occurs,
>the L2 cache also has about a 9/10 hit ratio. Only if a miss occurs in both
>the L1 and L2 caches is physical RAM accessed (which is, of course, a
>slower process). He therefore reasoned that adding more RAM will only
>improve your performance about 1% of the time. And this is why RAM speeds
>today are so much slower than CPU speeds; they simply have no reason to
>keep up.
>I was just wondering how sound this argument is and whether it is true.
>
>Thank you


Why bring it up here? What exactly does your post have to do with the
OS and its interaction with hardware? Oh, we both know the answer:
NOTHING.

Take this elsewhere.
 
NoConsequence, and NoManners either... If you don't know what this post has
to do with the OS and its interaction with hardware, then by all means: take
it elsewhere.

Yes, more RAM really helps, Fazal. Simply put, more RAM means that the OS
can keep larger programs and more data in RAM, instead of constantly paging
them out to and back in from your hard drive. Reading from and writing to
RAM is much faster than reading from or writing to a hard drive. The end
result is a much better-performing (and faster) computer, without even
getting a faster processor.
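
As a rough illustration of why keeping data in RAM instead of on the hard
drive matters so much, here is a small Python sketch. The latencies are
assumed, order-of-magnitude values (about 100 ns for RAM, about 10 ms for a
hard-drive access), not figures measured on any particular machine:

# Rough comparison of servicing a memory reference from RAM
# versus having to go to the hard drive (a page fault).
# The latencies are assumed, typical-order-of-magnitude values.

RAM_ACCESS_NS = 100.0            # assumed main-memory access time
DISK_ACCESS_NS = 10_000_000.0    # assumed hard-drive access time (~10 ms)

print(f"One disk access costs about {DISK_ACCESS_NS / RAM_ACCESS_NS:,.0f} "
      f"RAM accesses")

# Effective access time if even a tiny fraction of references
# must be fetched from disk because they were paged out.
for fault_rate in (0.0, 0.0001, 0.001, 0.01):
    effective = (1 - fault_rate) * RAM_ACCESS_NS + fault_rate * DISK_ACCESS_NS
    print(f"page-fault rate {fault_rate:>7.2%}: "
          f"effective access time {effective:>12,.0f} ns")

Even a page-fault rate of a fraction of a percent dominates the effective
access time, which is why adding RAM, and thereby reducing paging, can make
such a visible difference.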

Get it now, NoBrain? I'm sorry, I meant NoConsequence!

"NoConsequence" wrote:

> On Sun, 30 Sep 2007 15:20:00 -0700, Fazal
> <Fazal@discussions.microsoft.com> wrote:
>
> >The reason I ask is that about 3-4 years ago I was listening to Scott
> >Mueller (a popular computer technician) explaining how RAM works. He said
> >that the L1 cache runs at about the same speed as the microprocessor, and
> >Intel claims it has about a 9/10 cache hit ratio. If a cache miss occurs,
> >the L2 cache also has about a 9/10 hit ratio. Only if a miss occurs in
> >both the L1 and L2 caches is physical RAM accessed (which is, of course,
> >a slower process). He therefore reasoned that adding more RAM will only
> >improve your performance about 1% of the time. And this is why RAM speeds
> >today are so much slower than CPU speeds; they simply have no reason to
> >keep up.
> >I was just wondering how sound this argument is and whether it is true.
> >
> >Thank you
>
> Why bring it up here? What exactly does your post have to do with the
> OS and its interaction with hardware? Oh, we both know the answer:
> NOTHING.
>
> Take this elsewhere.
 
"Og" <Og@yahoo.com> wrote in message
news:OApBql7AIHA.4444@TK2MSFTNGP03.phx.gbl...
> "Fazal" <Fazal@discussions.microsoft.com> wrote in message
> news:683727EA-947A-43FD-B3A5-90CF97C0C5DE@microsoft.com...
>> The reason I ask is that about 3-4 years ago I was listening to Scott
>> Mueller (a popular computer technician) explaining how RAM works. He said
>> that the L1 cache runs at about the same speed as the microprocessor, and
>> Intel claims it has about a 9/10 cache hit ratio. If a cache miss occurs,
>> the L2 cache also has about a 9/10 hit ratio. Only if a miss occurs in
>> both the L1 and L2 caches is physical RAM accessed (which is, of course,
>> a slower process). He therefore reasoned that adding more RAM will only
>> improve your performance about 1% of the time. And this is why RAM speeds
>> today are so much slower than CPU speeds; they simply have no reason to
>> keep up.
>> I was just wondering how sound this argument is and whether it is true.
>>
>> Thank you
>
> I suspect that you have misunderstood Scott Mueller. You seem to be
> comparing ham-hocks to beans; they are two completely different foods
> that, taken together, perform an olfactory feat that neither food can
> obtain independently of the other.
>
> *CACHE* memory [L1 and, if a processor has it, L2] serves an entirely
> different function than *Random Access Memory* (RAM) serves.
>
> CACHE memory exists to store snippets of "action" code (VERB) that the
> processor will soon use, or that will be reused multiple times in a
> sequence of calculations.
> RAM stores not just snippets of code, but entire programs of code (both
> NOUN and VERB).
>
> Think of yourself as a "computer", think of RAM as your office, and think
> of CACHE memory as a workspace where you use each of the individual tools
> within your office. From your writing class you have 100 vocabulary words
> that you must discover definitions for. From your math class you have 100
> equations upon which you must perform a Laplace Transform.
> You (the processor) decide to do your vocabulary first, so you "pre-fetch"
> a dictionary and place it in the CACHE (your desk). With your tool placed
> in CACHE, you do not need to get up from your desk and retrieve a
> dictionary 100 times for 100 words -- you use, 100 times, the dictionary
> that is placed in your CACHE.
> Having completed your vocabulary assignment, you discard the dictionary
> from CACHE and replace it with a calculator -- you use, 100 times, the
> calculator.
>


That is a bit of a complicated explanation, though not inaccurate.

It would be quite possible to connect RAM to the CPU that is capable of
operating at the speed the CPU expects. Such memory (known as static RAM) is
relatively complicated, physically large and very, very expensive. However,
it would not require any additional cache, and it would make for a very
fast, but very expensive, PC.

In practice, a compromise is made and the type of RAM fitted is dynamic
memory, which is relatively simple, small, cheap and slow. Its other
drawback is that it only retains data for a few milliseconds, so it has to
be refreshed constantly. This is all taken care of within the memory chips
themselves these days, so it does not actually pose any problems.

In order to speed up operations, when a memory access is made, that chunk of
memory is copied into a faster L2 cache. In turn, a smaller chunk is copied
into an even faster L1 cache. The processor can then access the L1 cache at
full speed. As other chunks are required, they are copied in as needed. This
provides a substantial speed advantage over having no cache at all, but it
is not as fast as the first scheme. The caches are made from expensive
static RAM, but these days they are universally built into the processor
itself.

The process is reversed when a memory write takes place, but the actual
write back to the L2 cache and then to the dynamic RAM is delayed until it
has to occur, or until a suitable time interval has passed. Motherboards can
be configured to perform the write back to the dynamic RAM immediately.
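
Here is a toy Python model of those last two paragraphs: a whole line is
copied into the cache on a read miss, writes modify only the cached copy, and
the delayed write-back to memory happens when a dirty line is evicted. The
cache geometry and the access trace are invented purely for illustration and
are far smaller than anything real:

# Toy direct-mapped, write-back cache: lines are copied in on a miss,
# and modified ("dirty") lines are written to memory only on eviction.
# Sizes and the access trace below are invented for illustration.

LINE_SIZE = 16        # bytes per cache line
NUM_LINES = 4         # lines in this toy cache

memory = {}                                   # address -> byte value
cache = [None] * NUM_LINES                    # each entry: dict or None

def lookup(addr):
    index = (addr // LINE_SIZE) % NUM_LINES
    tag = addr // (LINE_SIZE * NUM_LINES)
    return index, tag

def fill(addr):
    """On a miss: write back any evicted dirty line, then copy the new line in."""
    index, tag = lookup(addr)
    line = cache[index]
    if line is not None and line["dirty"]:
        memory.update(line["data"])            # delayed write-back happens here
    base = (addr // LINE_SIZE) * LINE_SIZE
    data = {a: memory.get(a, 0) for a in range(base, base + LINE_SIZE)}
    cache[index] = {"tag": tag, "dirty": False, "data": data}
    return cache[index]

def access(addr, write_value=None):
    index, tag = lookup(addr)
    line = cache[index]
    hit = line is not None and line["tag"] == tag
    if not hit:
        line = fill(addr)
    if write_value is not None:
        line["data"][addr] = write_value
        line["dirty"] = True                   # memory is NOT updated yet
    print(f"{'write' if write_value is not None else 'read '} "
          f"addr {addr:3d}: {'hit ' if hit else 'miss'}")
    return line["data"][addr]

access(0)            # miss: line covering addresses 0..15 is copied in
access(4)            # hit: same line
access(8, 99)        # hit: the write stays in the cache (dirty line)
print("memory[8] before eviction:", memory.get(8, 0))
access(64)           # maps to the same slot: dirty line written back, then replaced
print("memory[8] after eviction: ", memory.get(8))

The printout shows that memory still holds the old value right after the
write (the new value lives only in the cache) and is only updated when the
dirty line is evicted -- the delayed write-back described above.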
 
Og wrote:

> "Fazal" <Fazal@discussions.microsoft.com> wrote in message
> news:683727EA-947A-43FD-B3A5-90CF97C0C5DE@microsoft.com...
>
>>The reason I ask is that about 3-4 years ago I was listening to Scott
>>Mueller (a popular computer technician) explaining how RAM works. He said
>>that the L1 cache runs at about the same speed as the microprocessor, and
>>Intel claims it has about a 9/10 cache hit ratio. If a cache miss occurs,
>>the L2 cache also has about a 9/10 hit ratio. Only if a miss occurs in
>>both the L1 and L2 caches is physical RAM accessed (which is, of course,
>>a slower process). He therefore reasoned that adding more RAM will only
>>improve your performance about 1% of the time. And this is why RAM speeds
>>today are so much slower than CPU speeds; they simply have no reason to
>>keep up.
>>I was just wondering how sound this argument is and whether it is true.
>>
>>Thank you
>
> I suspect that you have misunderstood Scott Mueller. You seem to be
> comparing ham-hocks to beans; they are two completely different foods that,
> taken together, perform an olfactory feat that neither food can obtain
> independently of the other.
>
> *CACHE* memory [L1 and, if a processor has it, L2] serves an entirely
> different function than *Random Access Memory* (RAM) serves.
>
> CACHE memory exists to store snippets of "action" code (VERB) that the
> processor will soon use, or that will be reused multiple times in a sequence
> of calculations.
> RAM stores not just snippets of code, but entire programs of code (both NOUN
> and VERB).
>
> Think of yourself as a "computer", think of RAM as your office, and think of
> CACHE memory as a workspace where you use each of the individual tools within
> your office. From your writing class you have 100 vocabulary words that you
> must discover definitions for. From your math class you have 100 equations
> upon which you must perform a Laplace Transform.
> You (the processor) decide to do your vocabulary first, so you "pre-fetch" a
> dictionary and place it in the CACHE (your desk). With your tool placed in
> CACHE, you do not need to get up from your desk and retrieve a dictionary
> 100 times for 100 words -- you use, 100 times, the dictionary that is placed
> in your CACHE.
> Having completed your vocabulary assignment, you discard the dictionary from
> CACHE and replace it with a calculator -- you use, 100 times, the
> calculator.
>
> Steve
>


Uh, that explanation is rather off-base. Cache *is* RAM, and *is* used for
the same purpose as main RAM; the only difference is speed (most importantly,
access time). And, with common PCs, cache is used to store data as well
as code.
--
Cheers, Bob
 
Fazal wrote:

> The reason I ask is that about 3-4 years ago I was listening to Scott
> Mueller (a popular computer technician) explaining how RAM works. He said
> that the L1 cache runs at about the same speed as the microprocessor, and
> Intel claims it has about a 9/10 cache hit ratio. If a cache miss occurs,
> the L2 cache also has about a 9/10 hit ratio. Only if a miss occurs in
> both the L1 and L2 caches is physical RAM accessed (which is, of course,
> a slower process). He therefore reasoned that adding more RAM will only
> improve your performance about 1% of the time.
>

True.

> And this is why RAM speeds today are so much slower than CPU speeds; they
> simply have no reason to keep up.
>

False. Main memory (as opposed to cache RAM) is slower because:
(1) it is implemented in DRAM instead of SRAM, to minimize cost and heat; and
(2) it is off-chip -- many nanoseconds away from the CPU.

> I was just wondering how sound this argument is and whether it is true.
>
> Thank you


Adding RAM increases performance only if it reduces paging. Since the page
file is on a hard drive, with an access time of several milliseconds, adding
RAM can replace those multi-millisecond accesses with multi-nanosecond ones.
And that improvement of roughly *a factor of a million* can be very
noticeable, even in your hypothetical case where only 1% of accesses miss in
both caches.
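
To make that concrete, here is a small Python sketch that extends the
original post's cache arithmetic with paging. The 1% both-cache miss rate is
the figure from the original post; the latencies and the paged-out fractions
are assumed values chosen only for illustration:

# Extend the cache-hit arithmetic with paging, per the reply above.
# 1% of accesses miss both caches (the original post's figure);
# of those, some fraction must come from disk instead of RAM.
# Latencies are assumed, order-of-magnitude values.

MISS_BOTH_CACHES = 0.01          # fraction of accesses reaching memory at all
RAM_NS = 60.0                    # assumed RAM access time
DISK_NS = 8_000_000.0            # assumed hard-drive access time (~8 ms)
CACHE_NS = 1.5                   # assumed average time for cache hits

def average_access_ns(paged_out_fraction):
    """Average access time when some memory references hit the page file."""
    in_cache = (1 - MISS_BOTH_CACHES) * CACHE_NS
    in_ram = MISS_BOTH_CACHES * (1 - paged_out_fraction) * RAM_NS
    on_disk = MISS_BOTH_CACHES * paged_out_fraction * DISK_NS
    return in_cache + in_ram + on_disk

# With plenty of RAM nothing is paged out; with too little RAM,
# even a small paged-out fraction swamps everything else.
for paged_out in (0.0, 0.001, 0.01):
    print(f"paged-out fraction {paged_out:.1%}: "
          f"average access {average_access_ns(paged_out):,.1f} ns")

With enough RAM (nothing paged out) the average access stays in the
nanosecond range; once even a small fraction of those misses has to come off
the disk, the disk term swamps everything else, which is exactly why adding
RAM that reduces paging is so noticeable.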
--
Cheers, Bob
 
In article <uWdBXMBBIHA.4232@TK2MSFTNGP04.phx.gbl>,
BobwBSGS@TrashThis.comcast.net says...
> Uh, that explanation is rather off-base. Cache *is* RAM, and *is* used for
> the same purpose as main RAM; the only difference is speed (most importantly,
> access time). And, with common PCs, cache is used to store data as well
> as code.


Actually, the CPU doesn't address the main RAM (the modules we can change or
swap) directly; it only accesses the CACHE. The memory controller feeds the
cache from main RAM so that the CPU can use it.

--

Leythos
- Igitur qui desiderat pacem, praeparet bellum.
- Calling an illegal alien an "undocumented worker" is like calling a
drug dealer an "unlicensed pharmacist"
spam999free@rrohio.com (remove 999 for proper email address)
 
The analogy is flawed. Seek help elsewhere by reading up on the purpose and
operation of a CPU's L1 and L2 caches.

Dave
"Fazal" <Fazal@discussions.microsoft.com> wrote in message
news:683727EA-947A-43FD-B3A5-90CF97C0C5DE@microsoft.com...
> The reason I ask is that about 3-4 years ago I was listening to Scott
> Mueller (a popular computer technician) explaining how RAM works. He said
> that the L1 cache runs at about the same speed as the microprocessor, and
> Intel claims it has about a 9/10 cache hit ratio. If a cache miss occurs,
> the L2 cache also has about a 9/10 hit ratio. Only if a miss occurs in
> both the L1 and L2 caches is physical RAM accessed (which is, of course,
> a slower process). He therefore reasoned that adding more RAM will only
> improve your performance about 1% of the time. And this is why RAM speeds
> today are so much slower than CPU speeds; they simply have no reason to
> keep up.
> I was just wondering how sound this argument is and whether it is true.
>
> Thank you
 
"Fazal" <Fazal@discussions.microsoft.com> wrote in message
news:683727EA-947A-43FD-B3A5-90CF97C0C5DE@microsoft.com...
> The reason I ask is that about 3-4 years ago I was listening to Scott
> Mueller (a popular computer technician) explaining how RAM works. He said
> that the L1 cache runs at about the same speed as the microprocessor, and
> Intel claims it has about a 9/10 cache hit ratio. If a cache miss occurs,
> the L2 cache also has about a 9/10 hit ratio. Only if a miss occurs in
> both the L1 and L2 caches is physical RAM accessed (which is, of course,
> a slower process). He therefore reasoned that adding more RAM will only
> improve your performance about 1% of the time. And this is why RAM speeds
> today are so much slower than CPU speeds; they simply have no reason to
> keep up.
> I was just wondering how sound this argument is and whether it is true.
>
> Thank you


It depends on what applications you're running or plan to run.
 