
Strapi.io — Boost GraphQL performance with Redis cache

2 min read · Mar 24, 2021

For you, good people, who are using strapi.io as the backbone of your GraphQL server: you will understand how slow Strapi's GraphQL is.
In fact, it is a known, unsolved issue. You can track its progress here: https://github.com/strapi/strapi/issues/8552. The most sensible way to fix this is to implement a cache. But how?

Cache on service

Since I was new to strapi.io, the first thing I did to speed up GraphQL queries was to implement a Redis cache at the service level.
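
To give a rough idea, a service-level cache in Strapi v3 is just a look-aside wrapper around the default query. Here is a minimal sketch; it assumes ioredis and a hypothetical article model, so adjust the names to your own API:

// api/article/services/article.js — Strapi v3 style; "article" is a hypothetical model
'use strict';

const Redis = require('ioredis'); // assumes ioredis is installed
const redis = new Redis();        // default localhost:6379
const TTL = 60;                   // cache entries for 60 seconds

module.exports = {
  async find(params, populate) {
    // Build a cache key from the incoming query parameters.
    const key = `article:find:${JSON.stringify({ params, populate })}`;

    const cached = await redis.get(key);
    if (cached) {
      return JSON.parse(cached); // serve from Redis, skip the database
    }

    // Fall back to the default query service and cache the result.
    const result = await strapi.query('article').find(params, populate);
    await redis.set(key, JSON.stringify(result), 'EX', TTL);
    return result;
  },
};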

Unfortunately, the database query is not really the bottleneck. Performance improved, but not by much. After more thorough research, I found that the problem lies in the query-to-resolver mechanism. I couldn't pinpoint exactly which functions are slowing the process down, but the architecture seems to have a lot of room for improvement.

Cache on Apollo

The second step was to implement the cache directly in the Apollo middleware. This prevents the query-to-resolver mechanism from being triggered at all, so Apollo can return the requested value immediately.
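
The shape of that middleware is roughly the following: a small Koa handler registered before Apollo that answers POST /graphql from Redis when it can, and caches Apollo's response when it can't. This is a minimal sketch, assuming Strapi v3, ioredis, and that the body parser has already run; the middleware and file names are illustrative:

// middlewares/graphql-cache/index.js — hypothetical custom Strapi v3 middleware
'use strict';

const Redis = require('ioredis');
const redis = new Redis();
const TTL = 60; // seconds

module.exports = strapi => ({
  initialize() {
    strapi.app.use(async (ctx, next) => {
      // Only intercept GraphQL requests.
      if (ctx.method !== 'POST' || ctx.path !== '/graphql') {
        return next();
      }

      const key = `gql:${JSON.stringify(ctx.request.body)}`;
      const cached = await redis.get(key);
      if (cached) {
        // Serve the stored response without touching Apollo or the resolvers.
        ctx.type = 'application/json';
        ctx.body = cached;
        return;
      }

      await next(); // let Apollo resolve the query as usual

      // Cache successful responses for the next request.
      if (ctx.status === 200 && ctx.body) {
        const payload = typeof ctx.body === 'string' ? ctx.body : JSON.stringify(ctx.body);
        await redis.set(key, payload, 'EX', TTL);
      }
    });
  },
});

The middleware also has to be enabled, and ordered after the parser, in config/middleware.js.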

This approach introduces a new problem: user permission validation. Since Apollo handles the request directly, the authentication and authorization process is never triggered.

To fix this issue, I needed to add an auth check to the Apollo middleware. But before I was able to do this, I had to customize the users-permissions plugin.
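
The auth check itself boils down to verifying the incoming token before anything is served from the cache. Roughly, the guard at the top of the middleware looks like this; it is a sketch only, assuming Strapi v3's users-permissions plugin, and a real check should also look at the user's role and permissions:

// Added at the top of the graphql-cache middleware, before the Redis lookup
// (sketch only): verify the caller's JWT so cached data is never handed out
// to unauthenticated requests.
let user = null;
try {
  // getToken() reads and verifies the Authorization: Bearer <jwt> header.
  const payload = await strapi.plugins['users-permissions'].services.jwt.getToken(ctx);
  user = await strapi.query('user', 'users-permissions').findOne({ id: payload.id });
} catch (err) {
  user = null;
}

if (!user) {
  // No valid token — skip the cache and let Strapi's normal auth flow handle it.
  return next();
}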

Following is the gist of what I came up with.

Please note: I'm using this solution in production and haven't found any issues. But if you have a better way, please let me know.

Written by Sofyan Hadi Ahmad

Innovation | Opensource | Human | Charity
