Thread:CzechOut/@comment-4371972-20200506063642

Hi!

Following your answer on the French Disney Wiki, I have several follow-up questions about what you said.

First, is there any way to customise content separately for the desktop and mobile versions? On the Disney Wiki, many features are disabled in the mobile version: hover effects, previews, tabbers, tabviews… As a result, the mobile display ends up broken by our attempts to make the content more dynamic on the desktop version.

Secondly, even though JS code does not affect the mobile UX/UI, on the desktop UI I can set the plurality of the labels by counting the number of link occurrences inside the corresponding values. However, I am wondering whether there is a magic word that returns the number of links. The remaining problem is getting genders (and maybe other data) from the linked pages. On MediaWiki, I found the TextExtracts and InterwikiExtracts extensions. Are they available on Fandom? If they are, can we parse the extracted contents?

Furthermore, you mentioned XSS vulnerabilities. I understand that someone could insert an external link which damages the contents. So, may I check whether the links are internal calls? If so, how should I implement it? Otherwise, would it be possible to group the data in a JSON sheet that is automatically updated when pages change?
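To make the link-counting and internal-link ideas concrete, here is a minimal sketch of what I have in mind. The function names and the example host are mine, not part of any Fandom or MediaWiki API:

```javascript
// Sketch only: illustrative helpers, not an existing Fandom/MediaWiki API.

// Count [[wiki links]] inside a raw infobox value (wikitext).
function countWikiLinks(wikitext) {
  const matches = wikitext.match(/\[\[[^\]]+\]\]/g);
  return matches ? matches.length : 0;
}

// Naive English pluralisation for an infobox label.
function pluralize(label, count) {
  return count > 1 ? label + 's' : label;
}

// Treat a link as internal only if it resolves to the wiki's own host.
// (Assumed host "disney.fandom.com" is just an example.)
function isInternalLink(href, wikiHost) {
  const url = new URL(href, 'https://' + wikiHost + '/');
  return url.hostname === wikiHost;
}
```

On the rendered desktop page, the same check could run over the anchors of an infobox value (e.g. `container.querySelectorAll('a')`) before pluralising the label or rejecting an external href.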

Finally, regarding the page-naming policy, there are some things I don't understand. You said that "the manipulation of the H1 is a customisation violation which could have some SEO implications as well", but what about ? This magic word changes much more than just the title of the page. Moreover, can we add elements after , as contributors did there?

Thank you in advance for your answers.