
JavaScript and Java - string encoding

I am trying to port the following from Java to JavaScript:

String key1 = "whatever"; 
String otherKey = "blah"; 
String key2;  
byte keyBytes[] = key1.getBytes(); 
for (int i = 0; i < keyBytes.length; i++) { 
    keyBytes[i] ^= otherKey.charAt(i % otherKey.length()); 
} 
key2 = new String(keyBytes); 

Here is what I wrote:

var key1 = "whatever"; 
var other_key = "blah"; 
var key2 = ""; 
for (var i = 0; i < key1.length; ++i) 
{ 
    var ch = key1.charCodeAt(i); 
    ch ^= other_key.charAt(i % other_key.length); 
    key2 += String.fromCharCode(ch); 
} 

However, they give different results. ...

What's the catch? Is JavaScript's string encoding different, and how can I make the two match?

+3

Java bytes are 8-bit; JavaScript chars are 16-bit Unicode. – fred02138
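
For instance, a quick sketch of the difference this comment is pointing at (not every JavaScript char code fits in a byte):

// JavaScript "characters" are 16-bit UTF-16 code units, not bytes.
var a = "é";                  // U+00E9
console.log(a.charCodeAt(0)); // 233 - still fits in one byte
var b = "€";                  // U+20AC
console.log(b.charCodeAt(0)); // 8364 - too big for a byte; Java's
                              // getBytes() would emit several bytes here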

+2

The first part of the code is broken in multiple ways and shouldn't be used in Java at all, let alone ported to JavaScript. –

+2

@JonSkeet Could you please explain why it is broken? :) –

Answer

0

You forgot a charCodeAt() in your code. It should be:

var key1 = "whatever"; 
var other_key = "blah"; 
var key2 = ""; 
for (var i = 0; i < key1.length; ++i) 
{ 
    var ch = key1.charCodeAt(i); 
    ch ^= other_key.charAt(i % other_key.length).charCodeAt(0); 
    key2 += String.fromCharCode(ch); 
} 

In Java there is an implicit conversion around the ^=: the char and the byte are promoted to int, XORed, and the result is narrowed back to byte. JavaScript does no such conversion, so you have to call charCodeAt() yourself.
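
This also seems to be why the two versions disagreed: JavaScript's ^ converts both operands to 32-bit integers, and a one-character string like "b" converts to NaN and then to 0, so the original loop XORed every code with 0 and left key1 unchanged. A quick sketch:

var code = "w".charCodeAt(0);           // 119
console.log(code ^ "b");                // 119: "b" -> NaN -> 0, so the XOR is a no-op
console.log(code ^ "b".charCodeAt(0));  // 21: matches the Java output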

I modified the code in both Java and JavaScript to print the byte array. After running them, the results are the same:

JavaScript:

function convert() {
    var key1 = "whatever";
    var other_key = "blah";

    var key2 = "";
    var byteArray = [];

    for (var i = 0; i < key1.length; ++i) {
        var ch = key1.charCodeAt(i);
        ch ^= other_key.charAt(i % other_key.length).charCodeAt(0);

        byteArray.push(ch);
        key2 += String.fromCharCode(ch);
    }

    alert(byteArray);
}

Result: 21, 4, 0, 28, 7, 26, 4, 26


Java:

// needs: import java.util.Arrays;
static void convert() {
    String key1 = "whatever";
    String otherKey = "blah";
    String key2;
    byte[] keyBytes = key1.getBytes();
    for (int i = 0; i < keyBytes.length; i++) {
        keyBytes[i] ^= otherKey.charAt(i % otherKey.length());
    }
    System.out.println(Arrays.toString(keyBytes));
    key2 = new String(keyBytes);
}

Result: [21, 4, 0, 28, 7, 26, 4, 26]
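
One caveat, going beyond the original answer: the two versions only agree while every character fits in a single byte. For non-ASCII input, Java's getBytes() emits charset-dependent bytes (often several per character) while charCodeAt() returns 16-bit code units, as fred02138's comment points out. Below is a sketch of a byte-accurate port using the standard TextEncoder API; it assumes the Java side's default charset is UTF-8, and convertBytes is just an illustrative name:

// Sketch: mimic Java's getBytes() semantics, assuming the JVM default charset is UTF-8.
function convertBytes() {
    var keyBytes = new TextEncoder().encode("whatever"); // Uint8Array of UTF-8 bytes
    var otherKey = "blah";
    for (var i = 0; i < keyBytes.length; i++) {
        // Uint8Array assignment wraps to 8 bits, like Java's byte narrowing
        keyBytes[i] ^= otherKey.charCodeAt(i % otherKey.length);
    }
    console.log(Array.from(keyBytes)); // [21, 4, 0, 28, 7, 26, 4, 26]
}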