a. First, you cannot explain the difference between Sum and Count without also discussing Obs, or "Observed Average". The points below restate the definitions from the documentation, broken down to make them a bit easier to understand.
b. Obs: The observed average of all data points seen for that interval. For the Percentile Metric for the App Agent for Java, this is the percentile value. For a cluster or a time rollup, this represents the weighted average across nodes or over time. (Basically, this is the average.)
c. Sum: The sum of all data point values seen for that interval. For the Percentile Metric for the App Agent for Java, this is the result of the percentile value multiplied by the Count. (Obs x Count = Sum)
d. Count: The number of observations aggregated in that one point. For example, a count of 2 indicates that two 1-minute data points were aggregated into one point. So 2 data points with a value of 4 each (Obs = 4) give a Sum of 8.
e. So, to put this in layman's terms, the Sum of the metric is the average (Obs) multiplied by the Count (the number of data points available in the time frame being evaluated).
i. So if the Count was 2 and the Obs (the average of the data points in that same time frame) was 6, the Sum would be 12 (see the sketch below).
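To make the Obs x Count = Sum relationship concrete, here is a minimal sketch in Java. It is illustrative only; the class and field names are hypothetical and not part of the App Agent for Java or any AppDynamics API.

```java
/**
 * Minimal sketch of the Obs / Count / Sum relationship for an aggregated
 * metric point. Hypothetical names, for illustration only.
 */
public class MetricPointExample {

    static class AggregatedPoint {
        final double obs;  // observed average of the data points in the interval
        final long count;  // number of 1-minute data points aggregated into this point

        AggregatedPoint(double obs, long count) {
            this.obs = obs;
            this.count = count;
        }

        // The Sum is simply the average multiplied by the number of observations.
        double sum() {
            return obs * count;
        }
    }

    public static void main(String[] args) {
        // Worked example from point e.i above: Count = 2, Obs = 6 -> Sum = 12
        AggregatedPoint p = new AggregatedPoint(6.0, 2);
        System.out.println("Obs = " + p.obs + ", Count = " + p.count + ", Sum = " + p.sum());
    }
}
```

Running this prints "Obs = 6.0, Count = 2, Sum = 12.0", matching the worked example in point e.i.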