Contents of /markup/html/whatpm/t/tokenizer-result.txt


Revision 1.298
Sun Aug 16 04:06:34 2009 UTC by wakaba
Branch: MAIN
Changes since 1.297: +56 -24 lines
File MIME type: text/plain
++ whatpm/t/ChangeLog	16 Aug 2009 04:05:04 -0000
	* tree-test-1.dat, tree-test-3.dat, tree-test-flow.dat,
	tree-test-foreign.dat, tree-test-form.dat, tree-test-phrasing.dat,
	tokenizer-test-1.test, tokenizer-test-2.dat, tokenizer-test-3.dat:
	DOCTYPE names are now normalized to their lowercase form (HTML5
	revision 2502).

2009-08-16  Wakaba  <wakaba@suika.fam.cx>

++ whatpm/Whatpm/HTML/ChangeLog	16 Aug 2009 04:06:26 -0000
2009-08-16  Wakaba  <wakaba@suika.fam.cx>

	* Tokenizer.pm.src: Lowercase-fold doctype names (HTML5 revision
	2501, cf. HTML5 revision 3571).
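
The change described above (HTML5 revision 2502) folds DOCTYPE names to lowercase before they are stored on the token. The Perl sketch below only illustrates that folding step; fold_doctype_name and the hash-based token layout are assumptions for the example, not the actual Tokenizer.pm.src code.

  use strict;
  use warnings;

  # Illustrative only: lowercase-fold the name of a DOCTYPE token, assuming a
  # token represented as a plain hash reference with a "name" field.
  sub fold_doctype_name {
    my ($token) = @_;
    $token->{name} = lc $token->{name} if defined $token->{name};
    return $token;
  }

  my $token = fold_doctype_name({ type => 'DOCTYPE', name => 'HtMl' });
  print "$token->{name}\n";   # prints "html", matching the new test expectations
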

1 wakaba 1.287 1..1129
2 wakaba 1.273 # Running under perl version 5.010000 for linux
3 wakaba 1.298 # Current time local: Sun Aug 16 12:55:33 2009
4     # Current time GMT: Sun Aug 16 03:55:33 2009
5 wakaba 1.1 # Using Test.pm version 1.25
6 wakaba 1.11 # t/tokenizer/test1.test
7 wakaba 1.20 ok 1
8 wakaba 1.298 not ok 2
9     # Test 2 got: "$VAR1 = [\n [\n qq'DOCTYPE',\n qq'html',\n undef,\n undef,\n 1\n ]\n ];\n" (t/HTML-tokenizer.t at line 205 fail #2)
10     # Expected: "$VAR1 = [\n [\n qq'DOCTYPE',\n qq'HTML',\n undef,\n undef,\n 1\n ]\n ];\n" (Correct Doctype uppercase: qq'<!DOCTYPE HTML>')
11     # Line 4 is changed:
12     # - " qq'HTML',\n"
13     # + " qq'html',\n"
14     # t/HTML-tokenizer.t line 205 is: ok $parser_dump, $expected_dump,
15     not ok 3
16     # Test 3 got: "$VAR1 = [\n [\n qq'DOCTYPE',\n qq'html',\n undef,\n undef,\n 1\n ]\n ];\n" (t/HTML-tokenizer.t at line 205 fail #3)
17     # Expected: "$VAR1 = [\n [\n qq'DOCTYPE',\n qq'HtMl',\n undef,\n undef,\n 1\n ]\n ];\n" (Correct Doctype mixed case: qq'<!DOCTYPE HtMl>')
18     # Line 4 is changed:
19     # - " qq'HtMl',\n"
20     # + " qq'html',\n"
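
The failing comparisons above come from the two-argument form of Test.pm's ok(), quoted at line 205 of t/HTML-tokenizer.t. A hedged sketch of that comparison style follows; the dump strings are simplified stand-ins for the Data::Dumper output shown in the diagnostics.

  use strict;
  use warnings;
  use Test;
  BEGIN { plan tests => 1 }

  # Two-argument ok() passes only when the strings are equal, so a dump that
  # differs from the expectation in nothing but DOCTYPE name case still fails.
  my $parser_dump   = "[['DOCTYPE', 'html', undef, undef, 1]]";   # simplified stand-in
  my $expected_dump = "[['DOCTYPE', 'HTML', undef, undef, 1]]";   # pre-r2502 expectation
  ok $parser_dump, $expected_dump;   # reported as "not ok", like test 2 above
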
21 wakaba 1.1 ok 4
22 wakaba 1.20 ok 5
23 wakaba 1.1 ok 6
24     ok 7
25     ok 8
26     ok 9
27     ok 10
28     ok 11
29     ok 12
30     ok 13
31     ok 14
32 wakaba 1.130 ok 15
33 wakaba 1.1 ok 16
34     ok 17
35     ok 18
36 wakaba 1.296 not ok 19
37     # Test 19 got: "$VAR1 = [\n [\n qq'Comment',\n qq' --comment '\n ]\n ];\n" (t/HTML-tokenizer.t at line 205 fail #19)
38     # Expected: "$VAR1 = [\n qq'ParseError',\n [\n qq'Comment',\n qq' --comment '\n ]\n ];\n" (Comment, two central dashes: qq'<!-- --comment -->')
39     # Line 2 is missing:
40     # - " qq'ParseError',\n"
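
Test 19 expects a ParseError to accompany the comment token because the input contains "--" inside the comment data. The snippet below is a rough illustration of that expectation; it scans the collected comment text as a plain string rather than reproducing the tokenizer's comment states.

  use strict;
  use warnings;

  # Assumed, simplified check for qq'<!-- --comment -->'.
  my $comment_data = ' --comment ';
  my @tokens;
  push @tokens, 'ParseError' if $comment_data =~ /--/;   # two central dashes
  push @tokens, [ 'Comment', $comment_data ];
  printf "%d tokens, first is %s\n", scalar @tokens, $tokens[0];   # 2 tokens, first is ParseError
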
41 wakaba 1.1 ok 20
42     ok 21
43 wakaba 1.25 ok 22
44     ok 23
45 wakaba 1.1 ok 24
46 wakaba 1.22 ok 25
47     ok 26
48     ok 27
49 wakaba 1.1 ok 28
50     ok 29
51     ok 30
52     ok 31
53     ok 32
54     ok 33
55 wakaba 1.18 ok 34
56 wakaba 1.1 ok 35
57     ok 36
58     ok 37
59 wakaba 1.8 ok 38
60 wakaba 1.28 ok 39
61     ok 40
62 wakaba 1.43 ok 41
63     ok 42
64 wakaba 1.286 ok 43
65 wakaba 1.11 # t/tokenizer/test2.test
66 wakaba 1.286 not ok 44
67     # Test 44 got: "$VAR1 = [\n qq'ParseError',\n qq'ParseError',\n [\n qq'DOCTYPE',\n undef,\n undef,\n undef,\n 0\n ]\n ];\n" (t/HTML-tokenizer.t at line 205 fail #44)
68 wakaba 1.47 # Expected: "$VAR1 = [\n qq'ParseError',\n qq'ParseError',\n [\n qq'DOCTYPE',\n qq'',\n undef,\n undef,\n 0\n ]\n ];\n" (DOCTYPE without name: qq'<!DOCTYPE>')
69 wakaba 1.20 # Line 6 is changed:
70 wakaba 1.8 # - " qq'',\n"
71 wakaba 1.20 # + " undef,\n"
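
Test 44 fails because the implementation leaves the DOCTYPE name undefined where the expectation uses an empty string. The structures below mirror the shape of the dumps in the diagnostics and only show why the two dump differently; they are not the tokenizer's real token objects.

  use strict;
  use warnings;

  # [type, name, public identifier, system identifier, flag], as in the dumps above.
  my $got      = [ 'DOCTYPE', undef, undef, undef, 0 ];
  my $expected = [ 'DOCTYPE', '',    undef, undef, 0 ];

  printf "name defined: got=%s expected=%s\n",
      (defined $got->[1]      ? 'yes' : 'no'),
      (defined $expected->[1] ? 'yes' : 'no');   # got=no expected=yes
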
72     ok 45
73     ok 46
74     ok 47
75     ok 48
76     ok 49
77     ok 50
78     ok 51
79 wakaba 1.97 ok 52
80     ok 53
81     ok 54
82     ok 55
83 wakaba 1.9 ok 56
84     ok 57
85 wakaba 1.1 ok 58
86     ok 59
87     ok 60
88 wakaba 1.19 ok 61
89 wakaba 1.1 ok 62
90     ok 63
91 wakaba 1.130 ok 64
92 wakaba 1.1 ok 65
93 wakaba 1.240 ok 66
94     ok 67
95     ok 68
96 wakaba 1.1 ok 69
97     ok 70
98 wakaba 1.34 ok 71
99     ok 72
100 wakaba 1.1 ok 73
101     ok 74
102 wakaba 1.21 ok 75
103     ok 76
104 wakaba 1.1 ok 77
105 wakaba 1.141 ok 78
106 wakaba 1.1 ok 79
107     ok 80
108 wakaba 1.34 ok 81
109 wakaba 1.286 # t/tokenizer/test3.test
110 wakaba 1.15 ok 82
111 wakaba 1.1 ok 83
112     ok 84
113 wakaba 1.25 ok 85
114     ok 86
115 wakaba 1.34 ok 87
116 wakaba 1.1 ok 88
117     ok 89
118     ok 90
119     ok 91
120 wakaba 1.296 not ok 92
121     # Test 92 got: "$VAR1 = [\n qq'ParseError',\n [\n qq'Comment',\n qq'--.'\n ]\n ];\n" (t/HTML-tokenizer.t at line 205 fail #92)
122     # Expected: "$VAR1 = [\n qq'ParseError',\n qq'ParseError',\n [\n qq'Comment',\n qq'--.'\n ]\n ];\n" (<!----.: qq'<!----.')
123     # Line 3 is missing:
124     # - " qq'ParseError',\n"
125 wakaba 1.1 ok 93
126     ok 94
127 wakaba 1.8 ok 95
128     ok 96
129     ok 97
130     ok 98
131     ok 99
132     ok 100
133 wakaba 1.96 ok 101
134     ok 102
135     ok 103
136     ok 104
137 wakaba 1.141 ok 105
138 wakaba 1.286 ok 106
139     ok 107
140     ok 108
141     not ok 109
142     # Test 109 got: "$VAR1 = [\n qq'ParseError',\n [\n qq'DOCTYPE',\n undef,\n undef,\n undef,\n 0\n ]\n ];\n" (t/HTML-tokenizer.t at line 205 fail #109)
143 wakaba 1.47 # Expected: "$VAR1 = [\n qq'ParseError',\n [\n qq'DOCTYPE',\n qq'',\n undef,\n undef,\n 0\n ]\n ];\n" (<!doctype >: qq'<!doctype >')
144 wakaba 1.43 # Line 5 is changed:
145     # - " qq'',\n"
146     # + " undef,\n"
147 wakaba 1.286 not ok 110
148     # Test 110 got: "$VAR1 = [\n qq'ParseError',\n [\n qq'DOCTYPE',\n undef,\n undef,\n undef,\n 0\n ]\n ];\n" (t/HTML-tokenizer.t at line 205 fail #110)
149 wakaba 1.47 # Expected: "$VAR1 = [\n qq'ParseError',\n [\n qq'DOCTYPE',\n qq'',\n undef,\n undef,\n 0\n ]\n ];\n" (<!doctype : qq'<!doctype ')
150 wakaba 1.43 # Line 5 is changed:
151     # - " qq'',\n"
152     # + " undef,\n"
153 wakaba 1.8 ok 111
154     ok 112
155     ok 113
156 wakaba 1.10 ok 114
157 wakaba 1.287 ok 115
158 wakaba 1.10 ok 116
159     ok 117
160     ok 118
161 wakaba 1.287 ok 119
162 wakaba 1.10 ok 120
163     ok 121
164 wakaba 1.39 ok 122
165 wakaba 1.18 ok 123
166 wakaba 1.287 ok 124
167 wakaba 1.18 ok 125
168     ok 126
169 wakaba 1.20 ok 127
170 wakaba 1.240 ok 128
171 wakaba 1.20 ok 129
172 wakaba 1.287 ok 130
173 wakaba 1.240 ok 131
174 wakaba 1.20 ok 132
175     ok 133
176     ok 134
177 wakaba 1.287 ok 135
178 wakaba 1.20 ok 136
179 wakaba 1.240 ok 137
180 wakaba 1.21 ok 138
181 wakaba 1.239 ok 139
182 wakaba 1.20 ok 140
183     ok 141
184 wakaba 1.28 ok 142
185 wakaba 1.286 ok 143
186     ok 144
187     ok 145
188     not ok 146
189     # Test 146 got: "$VAR1 = [\n qq'ParseError',\n qq'ParseError',\n [\n qq'StartTag',\n qq'z',\n {\n 0 => qq''\n }\n ]\n ];\n" (t/HTML-tokenizer.t at line 205 fail #146)
190 wakaba 1.247 # Expected: "$VAR1 = [\n qq'ParseError',\n [\n qq'StartTag',\n qq'z',\n {\n 0 => qq''\n }\n ]\n ];\n" (<z/0=>: qq'<z/0=>')
191     # Got 1 extra line at line 3:
192     # + " qq'ParseError',\n"
193 wakaba 1.130 ok 147
194 wakaba 1.239 ok 148
195 wakaba 1.22 ok 149
196     ok 150
197 wakaba 1.130 ok 151
198 wakaba 1.239 ok 152
199 wakaba 1.22 ok 153
200     ok 154
201     ok 155
202     ok 156
203 wakaba 1.28 ok 157
204     ok 158
205 wakaba 1.239 ok 159
206     ok 160
207 wakaba 1.28 ok 161
208     ok 162
209     ok 163
210     ok 164
211     ok 165
212     ok 166
213     ok 167
214     ok 168
215 wakaba 1.141 ok 169
216 wakaba 1.28 ok 170
217     ok 171
218     ok 172
219 wakaba 1.286 # t/tokenizer/test4.test
220 wakaba 1.28 ok 173
221 wakaba 1.293 not ok 174
222     # Test 174 got: "$VAR1 = [\n qq'ParseError',\n qq'ParseError',\n [\n qq'StartTag',\n qq'z',\n {\n qq'x' => qq'<'\n }\n ]\n ];\n" (t/HTML-tokenizer.t at line 205 fail #174)
223     # Expected: "$VAR1 = [\n qq'ParseError',\n [\n qq'StartTag',\n qq'z',\n {\n qq'x' => qq'<'\n }\n ]\n ];\n" (< in attribute value: qq'<z x=<')
224     # Got 1 extra line at line 3:
225     # + " qq'ParseError',\n"
226 wakaba 1.286 ok 175
227     ok 176
228     not ok 177
229     # Test 177 got: "$VAR1 = [\n qq'ParseError',\n qq'ParseError',\n [\n qq'StartTag',\n qq'z',\n {\n qq'=' => qq''\n }\n ]\n ];\n" (t/HTML-tokenizer.t at line 205 fail #177)
230 wakaba 1.247 # Expected: "$VAR1 = [\n qq'ParseError',\n [\n qq'StartTag',\n qq'z',\n {\n qq'=' => qq''\n }\n ]\n ];\n" (== attribute: qq'<z ==>')
231     # Got 1 extra line at line 3:
232     # + " qq'ParseError',\n"
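
Tests 146, 174 and 177 all differ from their expectations only in how many ParseError entries precede the StartTag token. The snippet below counts those entries in illustrative token lists shaped like the dumps above; it is not the harness's own comparison code.

  use strict;
  use warnings;

  my @got      = ( 'ParseError', 'ParseError', [ 'StartTag', 'z', { '=' => '' } ] );
  my @expected = ( 'ParseError',               [ 'StartTag', 'z', { '=' => '' } ] );

  # Count the plain-string ParseError entries in each token list.
  my $count = sub { scalar grep { !ref $_ && $_ eq 'ParseError' } @_ };
  printf "got %d parse errors, expected %d\n", $count->(@got), $count->(@expected);   # 2 vs 1
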
233 wakaba 1.28 ok 178
234 wakaba 1.33 ok 179
235 wakaba 1.34 ok 180
236 wakaba 1.38 ok 181
237     ok 182
238 wakaba 1.43 ok 183
239     ok 184
240     ok 185
241     ok 186
242     ok 187
243     ok 188
244 wakaba 1.240 ok 189
245     ok 190
246 wakaba 1.43 ok 191
247     ok 192
248     ok 193
249     ok 194
250     ok 195
251     ok 196
252     ok 197
253 wakaba 1.96 ok 198
254     ok 199
255 wakaba 1.286 ok 200
256 wakaba 1.96 ok 201
257 wakaba 1.130 ok 202
258 wakaba 1.43 ok 203
259     ok 204
260     ok 205
261     ok 206
262     ok 207
263     ok 208
264     ok 209
265     ok 210
266     ok 211
267     ok 212
268     ok 213
269     ok 214
270 wakaba 1.240 ok 215
271     ok 216
272 wakaba 1.43 ok 217
273     ok 218
274     ok 219
275     ok 220
276 wakaba 1.141 ok 221
277 wakaba 1.286 ok 222
278 wakaba 1.298 not ok 223
279     # Test 223 got: "$VAR1 = [\n [\n qq'DOCTYPE',\n qq'html',\n qq'AbC',\n qq'XyZ',\n 1\n ]\n ];\n" (t/HTML-tokenizer.t at line 205 fail #223)
280     # Expected: "$VAR1 = [\n [\n qq'DOCTYPE',\n qq'HtMl',\n qq'AbC',\n qq'XyZ',\n 1\n ]\n ];\n" (Doctype public case-sensitivity (1): qq'<!DoCtYpE HtMl PuBlIc "AbC" "XyZ">')
281     # Line 4 is changed:
282     # - " qq'HtMl',\n"
283     # + " qq'html',\n"
284     not ok 224
285     # Test 224 got: "$VAR1 = [\n [\n qq'DOCTYPE',\n qq'html',\n qq'aBc',\n qq'xYz',\n 1\n ]\n ];\n" (t/HTML-tokenizer.t at line 205 fail #224)
286     # Expected: "$VAR1 = [\n [\n qq'DOCTYPE',\n qq'hTmL',\n qq'aBc',\n qq'xYz',\n 1\n ]\n ];\n" (Doctype public case-sensitivity (2): qq'<!dOcTyPe hTmL pUbLiC "aBc" "xYz">')
287     # Line 4 is changed:
288     # - " qq'hTmL',\n"
289     # + " qq'html',\n"
290     not ok 225
291     # Test 225 got: "$VAR1 = [\n [\n qq'DOCTYPE',\n qq'html',\n undef,\n qq'XyZ',\n 1\n ]\n ];\n" (t/HTML-tokenizer.t at line 205 fail #225)
292     # Expected: "$VAR1 = [\n [\n qq'DOCTYPE',\n qq'HtMl',\n undef,\n qq'XyZ',\n 1\n ]\n ];\n" (Doctype system case-sensitivity (1): qq'<!DoCtYpE HtMl SyStEm "XyZ">')
293     # Line 4 is changed:
294     # - " qq'HtMl',\n"
295     # + " qq'html',\n"
296     not ok 226
297     # Test 226 got: "$VAR1 = [\n [\n qq'DOCTYPE',\n qq'html',\n undef,\n qq'xYz',\n 1\n ]\n ];\n" (t/HTML-tokenizer.t at line 205 fail #226)
298     # Expected: "$VAR1 = [\n [\n qq'DOCTYPE',\n qq'hTmL',\n undef,\n qq'xYz',\n 1\n ]\n ];\n" (Doctype system case-sensitivity (2): qq'<!dOcTyPe hTmL sYsTeM "xYz">')
299     # Line 4 is changed:
300     # - " qq'hTmL',\n"
301     # + " qq'html',\n"
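
Tests 223 through 226 show that only the DOCTYPE name is lowercased by the new behaviour; the public and system identifiers (AbC, XyZ, aBc, xYz) keep their original case in both the got and expected dumps. A small illustration, using an assumed hash layout rather than the tokenizer's internal representation:

  use strict;
  use warnings;

  my %doctype = ( name => 'HtMl', public => 'AbC', system => 'XyZ' );
  $doctype{name} = lc $doctype{name};   # only the name is folded
  print join(' ', @doctype{qw(name public system)}), "\n";   # html AbC XyZ
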
302 wakaba 1.286 not ok 227
303     # Test 227 got: "$VAR1 = [\n qq'ParseError',\n qq'ParseError',\n [\n qq'Comment',\n qq'doc'\n ],\n [\n qq'Character',\n qq'\\x{FFFD}'\n ]\n ];\n" (t/HTML-tokenizer.t at line 205 fail #227)
304 wakaba 1.130 # Expected: "$VAR1 = [\n qq'ParseError',\n [\n qq'Comment',\n qq'doc'\n ],\n qq'ParseError',\n [\n qq'Character',\n qq'\\x{FFFD}'\n ]\n ];\n" (U+0000 in lookahead region after non-matching character: qq'<!doc>\x{00}')
305     # Got 1 extra line at line 3:
306     # + " qq'ParseError',\n"
307     # Line 8 is missing:
308     # - " qq'ParseError',\n"
309 wakaba 1.43 ok 228
310     ok 229
311     ok 230
312     ok 231
313     ok 232
314     ok 233
315     ok 234
316     ok 235
317 wakaba 1.141 ok 236
318 wakaba 1.43 ok 237
319     ok 238
320     ok 239
321     ok 240
322     ok 241
323 wakaba 1.287 ok 242
324 wakaba 1.43 ok 243
325 wakaba 1.287 ok 244
326 wakaba 1.286 # t/tokenizer/contentModelFlags.test
327 wakaba 1.43 ok 245
328     ok 246
329     ok 247
330     ok 248
331 wakaba 1.141 ok 249
332 wakaba 1.43 ok 250
333     ok 251
334     ok 252
335     ok 253
336     ok 254
337     ok 255
338 wakaba 1.141 ok 256
339 wakaba 1.43 ok 257
340 wakaba 1.286 # t/tokenizer/escapeFlag.test
341 wakaba 1.43 ok 258
342     ok 259
343     ok 260
344     ok 261
345     ok 262
346 wakaba 1.206 ok 263
347 wakaba 1.43 ok 264
348     ok 265
349     ok 266
350 wakaba 1.286 # t/tokenizer/entities.test
351 wakaba 1.43 ok 267
352     ok 268
353     ok 269
354     ok 270
355     ok 271
356     ok 272
357     ok 273
358     ok 274
359     ok 275
360     ok 276
361     ok 277
362     ok 278
363     ok 279
364     ok 280
365     ok 281
366     ok 282
367     ok 283
368     ok 284
369     ok 285
370     ok 286
371     ok 287
372     ok 288
373     ok 289
374     ok 290
375     ok 291
376     ok 292
377     ok 293
378     ok 294
379     ok 295
380     ok 296
381     ok 297
382     ok 298
383     ok 299
384     ok 300
385     ok 301
386     ok 302
387     ok 303
388     ok 304
389     ok 305
390     ok 306
391     ok 307
392     ok 308
393     ok 309
394     ok 310
395     ok 311
396     ok 312
397     ok 313
398     ok 314
399     ok 315
400     ok 316
401     ok 317
402     ok 318
403     ok 319
404     ok 320
405     ok 321
406     ok 322
407     ok 323
408     ok 324
409     ok 325
410     ok 326
411     ok 327
412     ok 328
413     ok 329
414     ok 330
415     ok 331
416     ok 332
417     ok 333
418     ok 334
419     ok 335
420     ok 336
421     ok 337
422 wakaba 1.59 ok 338
423     ok 339
424     ok 340
425     ok 341
426     ok 342
427     ok 343
428     ok 344
429     ok 345
430     ok 346
431     ok 347
432 wakaba 1.62 ok 348
433     ok 349
434     ok 350
435     ok 351
436     ok 352
437     ok 353
438     ok 354
439     ok 355
440     ok 356
441     ok 357
442     ok 358
443     ok 359
444 wakaba 1.96 ok 360
445     ok 361
446     ok 362
447     ok 363
448 wakaba 1.129 ok 364
449     ok 365
450     ok 366
451     ok 367
452     ok 368
453     ok 369
454     ok 370
455     ok 371
456     ok 372
457     ok 373
458     ok 374
459     ok 375
460     ok 376
461     ok 377
462     ok 378
463     ok 379
464     ok 380
465     ok 381
466     ok 382
467     ok 383
468     ok 384
469     ok 385
470     ok 386
471     ok 387
472     ok 388
473     ok 389
474     ok 390
475     ok 391
476     ok 392
477     ok 393
478     ok 394
479     ok 395
480     ok 396
481 wakaba 1.130 ok 397
482     ok 398
483     ok 399
484     ok 400
485     ok 401
486     ok 402
487     ok 403
488     ok 404
489     ok 405
490     ok 406
491     ok 407
492     ok 408
493     ok 409
494     ok 410
495     ok 411
496     ok 412
497     ok 413
498     ok 414
499     ok 415
500     ok 416
501 wakaba 1.132 ok 417
502     ok 418
503     ok 419
504     ok 420
505 wakaba 1.136 ok 421
506     ok 422
507     ok 423
508     ok 424
509     ok 425
510     ok 426
511     ok 427
512     ok 428
513     ok 429
514     ok 430
515     ok 431
516     ok 432
517     ok 433
518     ok 434
519 wakaba 1.205 ok 435
520 wakaba 1.136 ok 436
521     ok 437
522     ok 438
523 wakaba 1.205 ok 439
524 wakaba 1.136 ok 440
525     ok 441
526     ok 442
527 wakaba 1.205 ok 443
528 wakaba 1.136 ok 444
529     ok 445
530 wakaba 1.205 ok 446
531 wakaba 1.136 ok 447
532     ok 448
533     ok 449
534     ok 450
535     ok 451
536     ok 452
537     ok 453
538     ok 454
539     ok 455
540     ok 456
541     ok 457
542     ok 458
543     ok 459
544     ok 460
545     ok 461
546     ok 462
547     ok 463
548     ok 464
549     ok 465
550     ok 466
551     ok 467
552     ok 468
553     ok 469
554     ok 470
555     ok 471
556 wakaba 1.141 ok 472
557 wakaba 1.195 ok 473
558     ok 474
559     ok 475
560     ok 476
561     ok 477
562 wakaba 1.205 ok 478
563     ok 479
564     ok 480
565     ok 481
566     ok 482
567     ok 483
568     ok 484
569     ok 485
570     ok 486
571     ok 487
572     ok 488
573     ok 489
574     ok 490
575     ok 491
576     ok 492
577     ok 493
578     ok 494
579     ok 495
580     ok 496
581     ok 497
582     ok 498
583     ok 499
584     ok 500
585     ok 501
586     ok 502
587     ok 503
588     ok 504
589     ok 505
590     ok 506
591     ok 507
592     ok 508
593     ok 509
594     ok 510
595     ok 511
596     ok 512
597     ok 513
598     ok 514
599     ok 515
600     ok 516
601     ok 517
602     ok 518
603     ok 519
604     ok 520
605     ok 521
606     ok 522
607     ok 523
608     ok 524
609     ok 525
610     ok 526
611     ok 527
612     ok 528
613     ok 529
614     ok 530
615     ok 531
616     ok 532
617     ok 533
618     ok 534
619     ok 535
620     ok 536
621     ok 537
622     ok 538
623     ok 539
624 wakaba 1.210 ok 540
625 wakaba 1.205 ok 541
626     ok 542
627     ok 543
628     ok 544
629     ok 545
630     ok 546
631     ok 547
632     ok 548
633     ok 549
634     ok 550
635     ok 551
636     ok 552
637     ok 553
638     ok 554
639     ok 555
640     ok 556
641     ok 557
642     ok 558
643     ok 559
644     ok 560
645     ok 561
646     ok 562
647     ok 563
648     ok 564
649     ok 565
650     ok 566
651     ok 567
652     ok 568
653     ok 569
654     ok 570
655     ok 571
656     ok 572
657     ok 573
658     ok 574
659     ok 575
660     ok 576
661     ok 577
662     ok 578
663     ok 579
664     ok 580
665     ok 581
666     ok 582
667     ok 583
668     ok 584
669     ok 585
670     ok 586
671     ok 587
672     ok 588
673     ok 589
674     ok 590
675     ok 591
676     ok 592
677     ok 593
678     ok 594
679     ok 595
680     ok 596
681     ok 597
682     ok 598
683     ok 599
684     ok 600
685     ok 601
686     ok 602
687     ok 603
688     ok 604
689     ok 605
690     ok 606
691     ok 607
692     ok 608
693     ok 609
694     ok 610
695     ok 611
696     ok 612
697     ok 613
698     ok 614
699     ok 615
700     ok 616
701     ok 617
702     ok 618
703     ok 619
704     ok 620
705     ok 621
706     ok 622
707     ok 623
708     ok 624
709     ok 625
710     ok 626
711     ok 627
712     ok 628
713     ok 629
714     ok 630
715     ok 631
716     ok 632
717     ok 633
718     ok 634
719     ok 635
720     ok 636
721     ok 637
722     ok 638
723     ok 639
724     ok 640
725     ok 641
726     ok 642
727     ok 643
728     ok 644
729     ok 645
730     ok 646
731     ok 647
732     ok 648
733     ok 649
734     ok 650
735     ok 651
736     ok 652
737     ok 653
738     ok 654
739     ok 655
740     ok 656
741     ok 657
742     ok 658
743     ok 659
744     ok 660
745     ok 661
746     ok 662
747     ok 663
748     ok 664
749     ok 665
750     ok 666
751     ok 667
752     ok 668
753     ok 669
754     ok 670
755     ok 671
756     ok 672
757     ok 673
758     ok 674
759     ok 675
760     ok 676
761     ok 677
762     ok 678
763     ok 679
764     ok 680
765     ok 681
766     ok 682
767     ok 683
768     ok 684
769     ok 685
770     ok 686
771     ok 687
772     ok 688
773     ok 689
774     ok 690
775     ok 691
776     ok 692
777     ok 693
778     ok 694
779     ok 695
780     ok 696
781     ok 697
782     ok 698
783     ok 699
784     ok 700
785     ok 701
786     ok 702
787     ok 703
788     ok 704
789     ok 705
790     ok 706
791     ok 707
792     ok 708
793     ok 709
794     ok 710
795     ok 711
796     ok 712
797     ok 713
798     ok 714
799     ok 715
800     ok 716
801     ok 717
802     ok 718
803     ok 719
804     ok 720
805     ok 721
806     ok 722
807     ok 723
808     ok 724
809     ok 725
810     ok 726
811     ok 727
812     ok 728
813     ok 729
814     ok 730
815     ok 731
816     ok 732
817     ok 733
818     ok 734
819     ok 735
820     ok 736
821     ok 737
822     ok 738
823     ok 739
824     ok 740
825     ok 741
826     ok 742
827     ok 743
828     ok 744
829     ok 745
830     ok 746
831     ok 747
832     ok 748
833     ok 749
834     ok 750
835     ok 751
836     ok 752
837     ok 753
838     ok 754
839     ok 755
840     ok 756
841     ok 757
842     ok 758
843     ok 759
844     ok 760
845     ok 761
846     ok 762
847     ok 763
848     ok 764
849     ok 765
850     ok 766
851     ok 767
852     ok 768
853     ok 769
854     ok 770
855     ok 771
856     ok 772
857     ok 773
858     ok 774
859     ok 775
860     ok 776
861     ok 777
862     ok 778
863     ok 779
864     ok 780
865     ok 781
866     ok 782
867     ok 783
868     ok 784
869     ok 785
870     ok 786
871     ok 787
872     ok 788
873     ok 789
874     ok 790
875     ok 791
876     ok 792
877     ok 793
878     ok 794
879     ok 795
880     ok 796
881     ok 797
882     ok 798
883     ok 799
884     ok 800
885     ok 801
886     ok 802
887     ok 803
888     ok 804
889     ok 805
890     ok 806
891     ok 807
892     ok 808
893     ok 809
894     ok 810
895     ok 811
896     ok 812
897     ok 813
898     ok 814
899     ok 815
900     ok 816
901     ok 817
902     ok 818
903     ok 819
904     ok 820
905     ok 821
906     ok 822
907     ok 823
908     ok 824
909     ok 825
910     ok 826
911     ok 827
912     ok 828
913     ok 829
914     ok 830
915     ok 831
916     ok 832
917     ok 833
918     ok 834
919     ok 835
920     ok 836
921     ok 837
922     ok 838
923     ok 839
924     ok 840
925     ok 841
926     ok 842
927     ok 843
928     ok 844
929     ok 845
930 wakaba 1.286 ok 846
931     ok 847
932     ok 848
933     ok 849
934     ok 850
935 wakaba 1.205 # t/tokenizer/xmlViolation.test
936 wakaba 1.286 not ok 851
937     # Test 851 got: "$VAR1 = [\n qq'ParseError',\n [\n qq'Character',\n qq'a\\x{FFFF}b'\n ]\n ];\n" (t/HTML-tokenizer.t at line 205 fail #851)
938 wakaba 1.206 # Expected: "$VAR1 = [\n qq'ParseError',\n [\n qq'Character',\n qq'a\\x{FFFD}b'\n ]\n ];\n" (Non-XML character: qq'a\x{FFFF}b')
939     # Line 5 is changed:
940     # - " qq'a\\x{FFFD}b'\n"
941     # + " qq'a\\x{FFFF}b'\n"
942 wakaba 1.286 not ok 852
943     # Test 852 got: "$VAR1 = [\n [\n qq'Character',\n qq'a\\x{0C}b'\n ]\n ];\n" (t/HTML-tokenizer.t at line 205 fail #852)
944 wakaba 1.206 # Expected: "$VAR1 = [\n [\n qq'Character',\n qq'a b'\n ]\n ];\n" (Non-XML space: qq'a\x{0C}b')
945     # Line 4 is changed:
946     # - " qq'a b'\n"
947     # + " qq'a\\x{0C}b'\n"
948 wakaba 1.286 not ok 853
949 wakaba 1.296 # Test 853 got: "$VAR1 = [\n [\n qq'Comment',\n qq' foo -- bar '\n ]\n ];\n" (t/HTML-tokenizer.t at line 205 fail #853)
950 wakaba 1.206 # Expected: "$VAR1 = [\n qq'ParseError',\n [\n qq'Comment',\n qq' foo - - bar '\n ]\n ];\n" (Double hyphen in comment: qq'<!-- foo -- bar -->')
951 wakaba 1.296 # Line 2 is missing:
952     # - " qq'ParseError',\n"
953     # Line 4 is changed:
954 wakaba 1.206 # - " qq' foo - - bar '\n"
955     # + " qq' foo -- bar '\n"
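
The xmlViolation expectations above imply three substitutions: non-XML characters such as U+FFFF become U+FFFD, the non-XML space U+000C becomes a regular space, and a double hyphen inside comment data is broken up as "- -". The regexes below are an illustrative reading of those expectations, not the module's actual implementation.

  use strict;
  use warnings;

  sub fix_characters {
    my ($s) = @_;
    $s =~ s/\x{FFFF}/\x{FFFD}/g;   # non-XML character -> U+FFFD (test 851)
    $s =~ s/\x{0C}/ /g;            # non-XML space (form feed) -> space (test 852)
    return $s;
  }

  sub fix_comment_data {
    my ($s) = @_;
    $s =~ s/--/- -/g;              # double hyphen is not allowed in XML comments (test 853)
    return $s;
  }

  print fix_characters("a\x{0C}b"), "\n";        # prints "a b"
  print fix_comment_data(' foo -- bar '), "\n";  # prints " foo - - bar "
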
956 wakaba 1.286 ok 854
957 wakaba 1.205 # t/tokenizer-test-1.test
958     ok 855
959     ok 856
960     ok 857
961     ok 858
962     ok 859
963     ok 860
964     ok 861
965     ok 862
966     ok 863
967     ok 864
968     ok 865
969     ok 866
970     ok 867
971     ok 868
972     ok 869
973     ok 870
974     ok 871
975     ok 872
976     ok 873
977     ok 874
978     ok 875
979     ok 876
980     ok 877
981     ok 878
982     ok 879
983     ok 880
984     ok 881
985     ok 882
986     ok 883
987     ok 884
988     ok 885
989     ok 886
990     ok 887
991     ok 888
992     ok 889
993     ok 890
994     ok 891
995     ok 892
996     ok 893
997     ok 894
998     ok 895
999     ok 896
1000     ok 897
1001     ok 898
1002     ok 899
1003     ok 900
1004     ok 901
1005     ok 902
1006     ok 903
1007     ok 904
1008     ok 905
1009     ok 906
1010     ok 907
1011     ok 908
1012     ok 909
1013     ok 910
1014     ok 911
1015     ok 912
1016     ok 913
1017     ok 914
1018     ok 915
1019     ok 916
1020     ok 917
1021     ok 918
1022     ok 919
1023     ok 920
1024     ok 921
1025     ok 922
1026     ok 923
1027     ok 924
1028     ok 925
1029 wakaba 1.298 ok 926
1030     ok 927
1031     not ok 928
1032     # Test 928 got: "$VAR1 = [\n [\n qq'Comment',\n qq'--x'\n ]\n ];\n" (t/HTML-tokenizer.t at line 205 fail #928)
1033 wakaba 1.296 # Expected: "$VAR1 = [\n qq'ParseError',\n [\n qq'Comment',\n qq'--x'\n ]\n ];\n" (<!----x-->: qq'<!----x-->')
1034     # Line 2 is missing:
1035     # - " qq'ParseError',\n"
1036 wakaba 1.205 ok 929
1037     ok 930
1038     ok 931
1039     ok 932
1040     ok 933
1041     ok 934
1042     ok 935
1043     ok 936
1044     ok 937
1045 wakaba 1.281 ok 938
1046     ok 939
1047     ok 940
1048     ok 941
1049     ok 942
1050     ok 943
1051     ok 944
1052     ok 945
1053 wakaba 1.285 ok 946
1054 wakaba 1.205 ok 947
1055     ok 948
1056     ok 949
1057     ok 950
1058     ok 951
1059     ok 952
1060     ok 953
1061     ok 954
1062     ok 955
1063     ok 956
1064     ok 957
1065     ok 958
1066     ok 959
1067     ok 960
1068     ok 961
1069     ok 962
1070 wakaba 1.286 ok 963
1071     ok 964
1072 wakaba 1.290 ok 965
1073     ok 966
1074     ok 967
1075 wakaba 1.298 ok 968
1076     ok 969
1077     not ok 970
1078     # Test 970 got: "$VAR1 = [\n qq'ParseError',\n qq'ParseError',\n [\n qq'Character',\n qq'\\x{FFFD}\\x{DFFF}'\n ]\n ];\n" (t/HTML-tokenizer.t at line 205 fail #970)
1079 wakaba 1.285 # Expected: "$VAR1 = [\n qq'ParseError',\n [\n qq'Character',\n qq'\\x{FFFD}'\n ],\n qq'ParseError',\n [\n qq'Character',\n qq'\\x{DFFF}'\n ]\n ];\n" (surrogate character reference: qq'&#xD800;\x{DFFF}')
1080     # Lines 3-3 are missing:
1081     # - " [\n"
1082     # - " qq'Character',\n"
1083     # - " qq'\\x{FFFD}'\n"
1084     # - " ],\n"
1085     # Line 6 is changed:
1086     # - " qq'\\x{DFFF}'\n"
1087     # + " qq'\\x{FFFD}\\x{DFFF}'\n"
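
Test 970 concerns a numeric character reference in the surrogate range (&#xD800;): the expected output replaces it with U+FFFD and reports a parse error, while keeping the following raw U+DFFF as its own Character token. The range check below only illustrates the surrogate test itself; it is not the tokenizer's character-reference code.

  use strict;
  use warnings;

  my $cp = 0xD800;   # code point from the character reference &#xD800;
  my $is_surrogate = ( $cp >= 0xD800 && $cp <= 0xDFFF ) ? 1 : 0;
  my $emitted = $is_surrogate ? 0xFFFD : $cp;
  printf "U+%04X -> U+%04X, parse error: %d\n", $cp, $emitted, $is_surrogate;
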
1088 wakaba 1.205 ok 971
1089     ok 972
1090     ok 973
1091     ok 974
1092     ok 975
1093     ok 976
1094     ok 977
1095     ok 978
1096     ok 979
1097     ok 980
1098     ok 981
1099     ok 982
1100     ok 983
1101     ok 984
1102     ok 985
1103     ok 986
1104     ok 987
1105     ok 988
1106     ok 989
1107     ok 990
1108     ok 991
1109     ok 992
1110     ok 993
1111     ok 994
1112     ok 995
1113     ok 996
1114     ok 997
1115     ok 998
1116     ok 999
1117     ok 1000
1118     ok 1001
1119     ok 1002
1120     ok 1003
1121     ok 1004
1122     ok 1005
1123     ok 1006
1124     ok 1007
1125     ok 1008
1126     ok 1009
1127     ok 1010
1128     ok 1011
1129     ok 1012
1130     ok 1013
1131     ok 1014
1132     ok 1015
1133     ok 1016
1134     ok 1017
1135     ok 1018
1136 wakaba 1.206 ok 1019
1137     ok 1020
1138     ok 1021
1139     ok 1022
1140     ok 1023
1141     ok 1024
1142     ok 1025
1143 wakaba 1.240 ok 1026
1144 wakaba 1.206 ok 1027
1145     ok 1028
1146     ok 1029
1147 wakaba 1.240 ok 1030
1148 wakaba 1.206 ok 1031
1149     ok 1032
1150     ok 1033
1151 wakaba 1.240 ok 1034
1152 wakaba 1.206 ok 1035
1153     ok 1036
1154 wakaba 1.240 ok 1037
1155 wakaba 1.205 ok 1038
1156     ok 1039
1157 wakaba 1.298 ok 1040
1158     ok 1041
1159     not ok 1042
1160     # Test 1042 got: "$VAR1 = [\n qq'ParseError',\n [\n qq'StartTag',\n qq'p',\n {\n qq'align' => qq'<div'\n }\n ]\n ];\n" (t/HTML-tokenizer.t at line 205 fail #1042)
1161 wakaba 1.293 # Expected: "$VAR1 = [\n [\n qq'StartTag',\n qq'p',\n {\n qq'align' => qq'<div'\n }\n ]\n ];\n" (< in before attribute value state: qq'<p align=<div>')
1162     # Got 1 extra line at line 2:
1163     # + " qq'ParseError',\n"
1164 wakaba 1.298 ok 1043
1165     not ok 1044
1166     # Test 1044 got: "$VAR1 = [\n qq'ParseError',\n [\n qq'StartTag',\n qq'p',\n {\n qq'align' => qq'left<div'\n }\n ]\n ];\n" (t/HTML-tokenizer.t at line 205 fail #1044)
1167 wakaba 1.293 # Expected: "$VAR1 = [\n [\n qq'StartTag',\n qq'p',\n {\n qq'align' => qq'left<div'\n }\n ]\n ];\n" (< in attribute value (unquoted) state: qq'<p align=left<div>')
1168     # Got 1 extra line at line 2:
1169     # + " qq'ParseError',\n"
1170 wakaba 1.205 ok 1045
1171     ok 1046
1172     ok 1047
1173     ok 1048
1174     ok 1049
1175     ok 1050
1176     ok 1051
1177     ok 1052
1178     ok 1053
1179     ok 1054
1180     ok 1055
1181     ok 1056
1182     ok 1057
1183     ok 1058
1184     ok 1059
1185     ok 1060
1186     ok 1061
1187 wakaba 1.206 ok 1062
1188     ok 1063
1189     ok 1064
1190     ok 1065
1191     ok 1066
1192     ok 1067
1193     ok 1068
1194 wakaba 1.227 ok 1069
1195     ok 1070
1196     ok 1071
1197     ok 1072
1198     ok 1073
1199 wakaba 1.247 ok 1074
1200     ok 1075
1201     ok 1076
1202     ok 1077
1203     ok 1078
1204     ok 1079
1205     ok 1080
1206 wakaba 1.281 ok 1081
1207     ok 1082
1208     ok 1083
1209     ok 1084
1210     ok 1085
1211     ok 1086
1212     ok 1087
1213     ok 1088
1214     ok 1089
1215     ok 1090
1216     ok 1091
1217     ok 1092
1218     ok 1093
1219     ok 1094
1220     ok 1095
1221     ok 1096
1222     ok 1097
1223 wakaba 1.285 ok 1098
1224     ok 1099
1225     ok 1100
1226     ok 1101
1227     ok 1102
1228     ok 1103
1229     ok 1104
1230     ok 1105
1231     ok 1106
1232     ok 1107
1233     ok 1108
1234     ok 1109
1235     ok 1110
1236     ok 1111
1237     ok 1112
1238     ok 1113
1239     ok 1114
1240     ok 1115
1241     ok 1116
1242     ok 1117
1243     ok 1118
1244     ok 1119
1245     ok 1120
1246     ok 1121
1247     ok 1122
1248     ok 1123
1249     ok 1124
1250     ok 1125
1251     ok 1126
1252     ok 1127
1253 wakaba 1.286 ok 1128
1254     ok 1129
1255 wakaba 1.290 ok 1130
1256     ok 1131
1257     ok 1132
1258 wakaba 1.293 ok 1133
1259     ok 1134
1260 wakaba 1.298 ok 1135
1261     ok 1136
