Segmentation fault (core dumped) error
I'm writing a program that uses a hash table of linked lists to count word frequencies. The program counts all the words I enter along with their frequencies, but after printing the hash table I get a segmentation fault (core dumped). When I run the program under Valgrind, it reports errors in three different places, all "Invalid read of size 8", and I'm not sure how to fix them. Here are the three places:
    void freeTable(HashTablePtr table) {
        int i;
        ListPtr list;

        if (table == NULL)
            return;
        for (i = 0; i < table->size; i++) {
            list = table->table[i];
            freeList(list);
        }
        free(table->table);
        free(table);
    }
    HashTablePtr createTable(int tableSize) {
        int i;
        HashTablePtr table = (HashTablePtr) malloc(sizeof(HashTablePtr));

        table->table = (ListPtr *) malloc(sizeof(ListPtr) * tableSize);
        table->size = tableSize;
        for (i = 0; i < table->size; i++) {
            table->table[i] = createList();
        }
        return table;
    }
    void printTable(HashTablePtr table) {
        ListPtr tempList;
        NodePtr tempNode;
        HashObjectPtr obj;
        int i;

        for (i = 1; i < table->size; i++) {
            tempList = table->table[i];
            if (tempList->size != 0) {
                tempNode = tempList->head;
                obj = tempNode->HashObject;
                printf("%s\n\n", toString(obj));
            }
        }
    }
I think the error has to do with these lines:

    tempList = table->table[i];
    table->table[i] = createList();

but I'm not sure how to fix it.

Edit:
    typedef struct hashtable HashTable;
    typedef struct hashtable * HashTablePtr;

    struct hashtable {
        int size;
        ListPtr *table;
    };
Valgrind errors:
    999 errors in context 5 of 9:
    ==73795== Invalid read of size 8
    ==73795==    at 0x400B7D: printTable (HashTable.c:96)
    ==73795==    by 0x400766: main (wf.c:16)
    ==73795==  Address 0x4c34048 is 0 bytes after a block of size 8 alloc'd
    ==73795==    at 0x4A0515D: malloc (vg_replace_malloc.c:195)
    ==73795==    by 0x400D05: createTable (HashTable.c:17)
    ==73795==    by 0x400753: main (wf.c:14)
    ==73795==
    ==73795== 1000 errors in context 6 of 9:
    ==73795== Invalid read of size 8
    ==73795==    at 0x400B2B: freeTable (HashTable.c:128)
    ==73795==    by 0x40076E: main (wf.c:17)
    ==73795==  Address 0x4c34048 is 0 bytes after a block of size 8 alloc'd
    ==73795==    at 0x4A0515D: malloc (vg_replace_malloc.c:195)
    ==73795==    by 0x400D05: createTable (HashTable.c:17)
    ==73795==    by 0x400753: main (wf.c:14)
    ==73795==
    ==73795== 1000 errors in context 7 of 9:
    ==73795== Invalid read of size 8
    ==73795==    at 0x400D4C: createTable (HashTable.c:25)
    ==73795==    by 0x400753: main (wf.c:14)
    ==73795==  Address 0x4c34048 is 0 bytes after a block of size 8 alloc'd
    ==73795==    at 0x4A0515D: malloc (vg_replace_malloc.c:195)
    ==73795==    by 0x400D05: createTable (HashTable.c:17)
    ==73795==    by 0x400753: main (wf.c:14)

Here is createList:

    ListPtr createList() {
        ListPtr list;

        list = (ListPtr) malloc(sizeof(List));
        list->size = 0;
        list->head = NULL;
        list->tail = NULL;
        return list;
    }
The line

    HashTablePtr table = (HashTablePtr) malloc(sizeof(HashTablePtr));

is almost certainly wrong. You want to allocate enough storage for a HashTable, but you are allocating storage for just a pointer to a HashTable (your HashTablePtr).
If you drop the habit of typedef'ing pointers and instead allocate in the following form, you won't run into this sort of problem:

    HashTable *table = malloc(sizeof *table);
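Applied to your createTable, the fix is essentially one line. Here is a self-contained sketch; the List definition and createList are simplified stand-ins for yours, which aren't fully shown in the question:

```c
#include <stdlib.h>

/* Simplified stand-ins for the question's list types */
typedef struct list {
    int size;
    void *head;
    void *tail;
} List, *ListPtr;

typedef struct hashtable {
    int size;
    ListPtr *table;
} HashTable, *HashTablePtr;

static ListPtr createList(void) {
    ListPtr list = malloc(sizeof *list);   /* sizeof *ptr idiom: size of the List, not the pointer */
    list->size = 0;
    list->head = NULL;
    list->tail = NULL;
    return list;
}

HashTablePtr createTable(int tableSize) {
    /* Was: malloc(sizeof(HashTablePtr)) -- only 8 bytes on 64-bit.
       sizeof *table asks for the size of the struct itself. */
    HashTablePtr table = malloc(sizeof *table);

    table->table = malloc(sizeof *table->table * tableSize);
    table->size = tableSize;
    for (int i = 0; i < table->size; i++)
        table->table[i] = createList();
    return table;
}
```

The `sizeof *table` form also stays correct if you later change the type of `table`, since it always measures whatever the pointer points to.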
A pointer has a fixed, architecture-dependent size; on 64-bit platforms, for example, it is 8 bytes (which matches the "block of size 8" in your Valgrind output). Basically, sizeof (ObjectType) is not the same as sizeof (ObjectType *). So in this case you end up allocating less memory than you need, which leads to the invalid reads and the segmentation fault.