I am in the process of learning C, and have begun exploring the world of pointers and pointer arithmetic. For example, in the following code snippet:
int nums[] = {1, 2, 3};
nums is an array variable and, in most expressions, it acts like (decays to) a pointer to the first element of the array. I wrote the following sample code and am trying to understand the results I am getting:
#include <stdio.h>
#include <stdlib.h>

int main()
{
    int nums[] = {1, 2, 3};

    if (nums == &nums)
        puts("nums == &nums");
    else
        puts("nums != &nums");

    if ((nums + 1) == (&nums + 1))
        puts("(nums + 1) == (&nums + 1)");
    else
        puts("(nums + 1) != (&nums + 1)");

    printf("nums: %i\n", nums);
    printf("&nums: %i\n", &nums);
    printf("nums + 1: %i\n", nums + 1);
    printf("&nums + 1: %i\n", &nums + 1);

    return 0;
}
I am getting that nums == &nums is true, as expected; however, when I apply pointer arithmetic and add 1 to each side, the results are no longer equal. In other words, (nums + 1) != (&nums + 1) even though nums == &nums.
This is the output I get from the program:
nums == &nums
(nums + 1) != (&nums + 1)
nums: 2345600
&nums: 2345600
nums + 1: 2345604
&nums + 1: 2345612
It appears that nums and nums + 1 are offset by 4 bytes; however, &nums and &nums + 1 are offset by 12 bytes. Why is this offset 12 bytes and not 4?
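To see which sizes are involved, a small follow-up check like the one below can print both the element size and the whole-array size (a sketch of my own; given the 4-byte step visible in the output above, int appears to be 4 bytes on this platform):
#include <stdio.h>

int main()
{
    int nums[] = {1, 2, 3};

    /* sizeof yields a size_t, so %zu is the matching format specifier */
    printf("sizeof(nums[0]): %zu\n", sizeof(nums[0]));  /* size of a single element */
    printf("sizeof(nums):    %zu\n", sizeof(nums));     /* size of the whole array */
    return 0;
}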
(Side note: %p is the proper format specifier for printing pointer values.)
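For completeness, using it would mean replacing the four printf calls in the program above with something like this (a sketch; the addresses then print in an implementation-defined, usually hexadecimal, form):
printf("nums:      %p\n", (void *)nums);         /* nums decays to &nums[0] */
printf("&nums:     %p\n", (void *)&nums);        /* address of the whole array object */
printf("nums + 1:  %p\n", (void *)(nums + 1));
printf("&nums + 1: %p\n", (void *)(&nums + 1));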