If you are building web apps with JavaScript, one big problem most developers face is making the app SEO friendly.
Google is slowly beginning to render JavaScript apps and make them visible in its search results.
However, the Facebook and Twitter crawlers still don't render JavaScript, so sharing your JavaScript web app links on Facebook and Twitter doesn't generate the nice previews we all want.
When you write a JavaScript app using your favorite framework, you can change the meta tags of your website with JavaScript.
The problem is that the Facebook crawler, and many other social media crawlers, don't render JavaScript. They never see those meta tags, and without the meta tags Facebook can't generate the nice preview of your link when it's shared.
I'm working on a small JavaScript web app built with Nuxt.js and an Express backend, running behind an Nginx server, and I ran into this exact problem. I searched the internet and Stack Overflow for days, but I was unable to find a solution.
I did come across some ways to fix the issue using Express, but because of the way the Express backend is structured in a Nuxt.js project, those solutions didn't work for me. So I had to solve the problem with Nginx.
After some more digging and experimenting, I found a solution using Nginx.
The solution is to send the Facebook and Twitter crawlers to a server-rendered page, while normal visitors and the Google bot continue to get your real website.
Remember that the server-rendered page doesn't have to display all the content; it only needs to contain the meta tags that Facebook and Twitter use to generate their previews or cards.
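For example, a handful of standard Open Graph and Twitter Card tags is usually all the crawlers need. The title, description, and image URL below are just placeholders for your own data:
<meta property="og:title" content="My page title" />
<meta property="og:description" content="A short description of the page" />
<meta property="og:image" content="https://mywebsite.com/images/preview.jpg" />
<meta property="og:url" content="https://mywebsite.com/place/123" />
<meta name="twitter:card" content="summary_large_image" />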
Open the Nginx configuration for your website. Mine was at /etc/nginx/sites-enabled/yourdomain.com
Replace yourdomain.com with your site's real domain name.
sudo vim /etc/nginx/sites-enabled/yourdomain.com
How do you identify the Facebook crawler and the Twitter bot? The Facebook crawler identifies itself with the user agent "facebookexternalhit", while the Twitter bot uses the user agent "twitterbot".
Modify your Nginx settings so that they send the Twitter bot and the Facebook crawler to your server-rendered page. Here is an example of how to modify your Nginx settings:
server {
    ...
    location / {
        # Default to not prerendering so the variable is always defined
        set $prerender 0;

        # Flag requests coming from known social media crawlers and link preview bots
        if ($http_user_agent ~* "baiduspider|twitterbot|facebookexternalhit|rogerbot|linkedinbot|embedly|quora link preview|showyoubot|outbrain|pinterest|slackbot|vkShare|W3C_Validator") {
            set $prerender 1;
        }

        # Internally rewrite crawler requests for /place/<id> to the prerender route
        if ($prerender = 1) {
            rewrite ^/place/(.*)$ /api/pcrawler?id=$1 last;
        }
        ...
    }
}
Save your site settings file.
In the example above, all the social media crawlers coming to mywebsite.com/place/123 will be internally rewritten to mywebsite.com/api/pcrawler?id=123, while regular visitors keep getting the normal app.
Next, create a route on your Express server to handle the Facebook and Twitter bots and respond with a server-rendered page containing the needed meta tags.
router.get('/api/pcrawler', function (req, res, next) {
  var id = req.query.id
  // Look up the item by id and respond with the
  // server-generated meta tags for the crawlers
})
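Here is a rough sketch of what that handler could look like. The getPlaceById lookup and the place fields are hypothetical placeholders; replace them with your own data access code, and remember to escape any user-supplied values before putting them in the HTML.
router.get('/api/pcrawler', async function (req, res, next) {
  var id = req.query.id
  // getPlaceById is a hypothetical lookup; replace it with your own database query
  var place = await getPlaceById(id)
  // Send back a minimal HTML page that contains only the meta tags the crawlers need
  res.send(`
    <html>
      <head>
        <meta property="og:title" content="${place.title}" />
        <meta property="og:description" content="${place.description}" />
        <meta property="og:image" content="${place.imageUrl}" />
        <meta property="og:url" content="https://mywebsite.com/place/${id}" />
        <meta name="twitter:card" content="summary_large_image" />
      </head>
      <body></body>
    </html>
  `)
})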
Finally, restart your Nginx server: sudo systemctl restart nginx
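If the restart fails, you can test the configuration for syntax errors first:
sudo nginx -t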
Check that it's working by going to the Facebook Sharing Debugger, pasting your link, and letting the crawler scrape your website to see whether it displays the content correctly.