[fpc-pascal] Effect of blocking AI crawlers in forums and wikis
Wayne Sherman
wsherman at gmail.com
Fri Sep 5 23:05:12 CEST 2025
Is it possible to make the forum and wiki data available on a mirror
for use by AI crawlers?
(Are search engine crawlers also blocked?)
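
For reference, this kind of blocking is usually done in robots.txt.
A minimal sketch, assuming the sites block the common AI-training
agents there while leaving ordinary search crawlers alone (GPTBot is
OpenAI's crawler, CCBot is Common Crawl's, and Google-Extended opts a
site out of Google's AI training without affecting Search indexing):

    # Block AI-training crawlers entirely
    User-agent: GPTBot
    Disallow: /

    User-agent: CCBot
    Disallow: /

    # Opt out of AI training; Googlebot search indexing is unaffected
    User-agent: Google-Extended
    Disallow: /

    # Ordinary search crawlers stay allowed (empty Disallow = allow all)
    User-agent: Googlebot
    Disallow:
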
I asked ChatGPT 5:
"If websites with large Pascal forums and wikis block AI data
harvesting bots, what will be the result for LLMs missing this
training data and what will be the effect on these LLMs for Pascal
related queries? Provide a simple concise answer with bullet points."
ChatGPT said:
- Thinner Pascal knowledge: fewer examples and patterns from
FPC/Lazarus communities.
- Outdated answers: misses recent compiler/RTL/LCL changes and
community best practices.
- Lower code quality: more non-idiomatic style, wrong units in uses,
bad type handling, bogus directives. [see the sketch after this list]
- More hallucinations: invented packages/components, incorrect API
names, fake compiler switches.
- Poor ecosystem awareness: weaker guidance on common libs (e.g.,
Synapse, Zeos, BGRABitmap) and tools (fpcupdeluxe, cross-compile
flows).
- Weaker troubleshooting: less helpful fixes for linker/Widgetset
errors, package management, and IDE quirks.
- Language bias: pushes solutions in C/Python/JS rather than solid Pascal.
- Narrower coverage: fewer edge cases (Windows/Gtk/Qt specifics,
generics/modes nuances).
- Less reliable citations: can’t cross-check forum/wiki “canonical” answers.
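
As a concrete illustration of the "wrong units in uses" point above
(my example, not ChatGPT's): knowing which unit provides which routine
is exactly the kind of community knowledge at stake. A minimal FPC
program, written the way the wiki would teach it:

    program UsesDemo;
    {$mode objfpc}{$H+}  { idiomatic FPC directives: ObjFPC mode, long strings }
    uses
      SysUtils;  { Format() lives in SysUtils, not in the default System unit }
    begin
      WriteLn(Format('Compiled with FPC %s for %s',
                     [{$I %FPCVERSION%}, {$I %FPCTARGETOS%}]));
    end.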