I have been working on a program for my algorithm analysis class, in which I have to solve the Knapsack problem with brute force, greedy, dynamic programming, and branch-and-bound strategies. Everything works perfectly when I run it in Visual Studio 2012, but if I compile with gcc and run it from the command line, I get different results:
Visual Studio:
+-------------------------------------------------------------------------------+
| Number of | Processing time in seconds/Maximum benefit value |
| +---------------+---------------+---------------+---------------+
| items | Brute force | Greedy | D.P. | B. & B. |
+---------------+---------------+---------------+---------------+---------------+
| 10 + 0/1290 + 0/1328 + 0/1290 + 0/1290 |
+---------------+---------------+---------------+---------------+---------------+
| 20 + 0/3286 + 0/3295 + 0/3200 + 0/3286 |
+---------------+---------------+---------------+---------------+---------------+
CMD:
+-------------------------------------------------------------------------------+
| Number of | Processing time in seconds/Maximum benefit value |
| +---------------+---------------+---------------+---------------+
| items | Brute force | Greedy | D.P. | B. & B. |
+---------------+---------------+---------------+---------------+---------------+
| 10 + 0/1290 + 0/1328 + 0/1599229779+ 0/1290 |
+---------------+---------------+---------------+---------------+---------------+
| 20 + 0/3286 + 0/3295 + 0/3200 + 0/3286 |
+---------------+---------------+---------------+---------------+---------------+
The same number, "1599229779", shows up every time. Note that the output is only garbled the first time the dynamic programming algorithm runs.
Here is my code:
typedef struct {
    short value;  //This is the value of the item
    short weight; //This is the weight of the item
    float ratio;  //This is the ratio of value/weight
} itemType;

typedef struct {
    time_t startingTime;
    time_t endingTime;
    int maxValue;
} result;

result solveWithDynamic(itemType items[], int itemsLength, int maxCapacity){
    result answer;
    int rowSize = 2;
    int colSize = maxCapacity + 1;
    int i, j; //used in loops
    int otherColumn, thisColumn;
    answer.startingTime = time(NULL);
    int **table = (int**)malloc((sizeof *table) * rowSize);//[2][(MAX_ITEMS*WEIGHT_MULTIPLIER)];
    for(i = 0; i < rowSize; i++)
        table[i] = (int*)malloc((sizeof *table[i]) * colSize);
    table[0][0] = 0;
    table[1][0] = 0;
    for(i = 1; i < maxCapacity; i++) table[1][i] = 0;
    for(i = 0; i < itemsLength; i++){
        thisColumn = i%2;
        otherColumn = (i+1)%2; //this is always the other column
        for(j = 1; j < maxCapacity + 1; j++){
            if(items[i].weight <= j){
                if(items[i].value + table[otherColumn][j-items[i].weight] > table[otherColumn][j])
                    table[thisColumn][j] = items[i].value + table[otherColumn][j-items[i].weight];
                else
                    table[thisColumn][j] = table[otherColumn][j];
            } else {
                table[thisColumn][j] = table[thisColumn][j-1];
            }//end if/else
        }//end for
    }//end for
    answer.maxValue = table[thisColumn][maxCapacity];
    answer.endingTime = time(NULL);
    for(i = 0; i < rowSize; i++)
        free(table[i]);
    free(table);
    return answer;
}//end solveWithDynamic
Just a bit of explanation. I ran into memory-consumption problems with this algorithm, because I have to run it on a set of 10,000 items. I realized I don't need to store the entire table, since I only ever look at the previous column. In fact, I found that I only need to store the current row plus x+1 additional values, where x is the weight of the current itemType. That brings the memory required from (itemsLength+1) * (maxCapacity+1) elements down to 2*(maxCapacity+1), and possibly to (maxCapacity+1) + (x+1) (though I don't need to optimize it that far).
Also, I put printf("%d", answer.maxValue); inside this function, and it still prints "1599229779". Can anyone help me figure out what is going on? Thanks.
Is 'sizeof(int)' the same in gcc and Visual Studio? – Evans 2013-04-30 14:15:03