The tokenizer strips whitespace before each token, and the selector reconstruction heuristic only re-inserted spaces before Ident tokens. Descendant combinators were therefore lost whenever the right-hand side began with `.`, `#`, `:`, or `[` (e.g. `.foo .bar` became `.foo.bar`).

Replace token-by-token selector text reconstruction with raw text extraction from the original CSS source, using the byte offsets the tokenizer already tracks. This preserves the original whitespace; comments are stripped separately, and runs of whitespace are collapsed to a single space.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
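A minimal sketch of the extraction approach described above. The function names (`strip_comments`, `selector_text`) and the use of character offsets are illustrative assumptions, not the actual implementation; the point is slicing the raw source by the offsets the tokenizer already records, then normalizing:

```python
import re

def strip_comments(css: str) -> str:
    # Replace /* ... */ comments with a single space so tokens on
    # either side of a comment do not fuse together.
    return re.sub(r"/\*.*?\*/", " ", css, flags=re.DOTALL)

def selector_text(source: str, start: int, end: int) -> str:
    # Hypothetical helper: slice the raw selector span using the
    # offsets tracked by the tokenizer, strip comments, and collapse
    # runs of whitespace to one space.
    raw = source[start:end]
    return re.sub(r"\s+", " ", strip_comments(raw)).strip()

css = ".foo /* note */ .bar { color: red }"
# Raw extraction keeps the descendant combinator that token-by-token
# reconstruction would drop (".foo.bar").
print(selector_text(css, 0, css.index("{")))  # -> ".foo .bar"
```

Note that Python string slicing uses character indices; a real implementation tracking byte offsets would slice the source bytes (or guarantee ASCII-compatible offsets) before decoding.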