
Protection against malicious queries #26

Closed
ivome opened this issue Mar 4, 2016 · 10 comments

Comments

ivome (Contributor) commented Mar 4, 2016

With GraphQL it is quite easy to create a query that crashes the server. Especially with graphs that have some kind of recursive structure.

How do you guys protect against that?

I like how they did it in the Scala project:
http://sangria-graphql.org/learn/#protection-against-malicious-queries
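Sangria's protection works by rejecting a query once it exceeds a configured limit, before the expensive work happens. As a rough, library-free illustration of the depth-limit half of that idea, here is a minimal Python sketch (all names are made up for this example; a real implementation would walk the parsed query AST rather than count braces):

```python
def query_depth(query: str) -> int:
    """Return the maximum nesting depth of selection sets in a raw
    GraphQL query string. Counting braces is a crude stand-in for
    walking the AST, but it shows the shape of the check."""
    depth = max_depth = 0
    for ch in query:
        if ch == "{":
            depth += 1
            max_depth = max(max_depth, depth)
        elif ch == "}":
            depth -= 1
    return max_depth

def assert_depth_below(query: str, limit: int) -> None:
    """Reject the query before execution if it nests too deeply."""
    d = query_depth(query)
    if d > limit:
        raise ValueError(f"query depth {d} exceeds limit {limit}")

# A recursive friends-of-friends query grows in depth very cheaply
# for the client but very expensively for the server:
q = "{ user { friends { friends { friends { name } } } } }"
```

The point is that the check is linear in the query text, independent of how much data the query would actually touch.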

vladar (Member) commented Mar 4, 2016

Yeah, Sangria provides some insights. I think we can implement a similar concept, but I suspect they do several executor passes on the query.

That may hurt performance, because PHP is not an ideal tool for such tasks. Ideally we'd want some sort of native PHP module for the parser / executor.

Or at least have some profiling tool to make sure that new features do not affect performance significantly.

vladar (Member) commented Mar 4, 2016

As a workaround for now (just an idea): you can set up complexity tracking by wrapping all field "resolve" methods in a closure and using ResolveInfo to collect complexity data, then throw once it reaches a threshold.

But since you can't know the children's complexity inside the parent's resolve, you will have to set a complexity multiplier in the parent and apply it to all children's scores when descending deeper during execution.
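The wrapping workaround described above can be sketched like this (a hypothetical illustration in Python, not graphql-php's API; real graphql-php resolvers would receive ResolveInfo, and the budget would live in the per-request context):

```python
# Sketch: wrap every field resolver in a closure that charges a cost
# against a shared per-query budget, and throw once the budget is spent.

class ComplexityBudget:
    """Tracks accumulated complexity for a single query execution."""

    def __init__(self, limit: int):
        self.limit = limit
        self.spent = 0

    def charge(self, cost: int) -> None:
        self.spent += cost
        if self.spent > self.limit:
            raise RuntimeError(
                f"query complexity {self.spent} exceeds limit {self.limit}"
            )

def with_complexity(resolve, budget: ComplexityBudget, cost: int = 1):
    """Wrap a resolver so each call charges `cost`. Since the parent
    cannot see its children's costs, a real version would also pass a
    multiplier down (e.g. for list fields) and scale children's costs."""
    def wrapped(*args, **kwargs):
        budget.charge(cost)
        return resolve(*args, **kwargs)
    return wrapped
```

The drawback, as noted below in the thread, is that this aborts mid-execution rather than rejecting the query up front.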

smolinari commented

I'd say absolutely yes to adding some sort of protection. It is a necessity. However, I'd also be worried about bogging down the system with the checks. It's a difficult balance to strike.

Scott

ivome (Contributor, Author) commented Mar 22, 2016

The idea of wrapping all resolve methods could work, but it seems a little hacky and repetitive. I think the best place for this would be the executor.
Maybe we can add support for pluggable executors that can optionally be passed to the execute function. That way we could have one with complexity tracking built in and another that keeps the current implementation. Multiple executors would also prepare us for a future PHP module.
Or we just add an extra parameter, $maxQueryComplexity: whenever it is set, the complexity analysis runs; otherwise it is skipped (the default).

I think most projects that expose the server to the public need such an implementation anyway.

vladar (Member) commented Mar 22, 2016

@ivome I think it makes sense to perform several passes on the query - one for analysis (optional), another for execution.

I am also monitoring graphql-js activity on the subject of optimizations (graphql/graphql-js#304). I will likely move forward once there is a clear vision on this in graphql-js, as this project tries to match the reference implementation where possible.

vladar (Member) commented Mar 25, 2016

@ivome Another idea: you could probably write a custom validation rule that tests query fields against the schema to check complexity. That might be the way to go for this feature.

See DocumentValidator::validate() - the last argument is the set of rules for validation. You can inject your own rule there if required.

Obviously you will have to avoid the GraphQL\GraphQL facade in your project in that case. I guess I will experiment with this first; if it works out, I will add a section on custom query validators to the docs.
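The appeal of the validation-rule approach is that it runs entirely before execution: a rule walks the query, compares it against a limit, and reports errors, so an abusive query never reaches a resolver. A toy Python analogue of a max-depth rule, using a nested dict as a stand-in for the query AST (illustrative only; graphql-php rules operate on real AST nodes passed through DocumentValidator):

```python
def validate_max_depth(selection_set: dict, max_depth: int, depth: int = 1):
    """Toy pre-execution validation rule: collect an error for any field
    that opens a selection set nested deeper than `max_depth`. Leaves
    are marked with None in this dict-of-dicts stand-in for the AST."""
    errors = []
    for field, children in selection_set.items():
        if children:  # this field opens another selection set
            if depth + 1 > max_depth:
                errors.append(
                    f"field '{field}' exceeds maximum query depth {max_depth}"
                )
            else:
                errors.extend(
                    validate_max_depth(children, max_depth, depth + 1)
                )
    return errors
```

Returning a list of errors (rather than throwing) matches how validation rules typically report problems, and, as the rest of the thread shows, this is the design graphql-php ultimately adopted via its query depth and complexity rules.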

mcg-web (Collaborator) commented Apr 3, 2016

@vladar I confirm that using DocumentValidator::validate() is the solution. Here are two examples that implement query complexity and query depth security. I think the correct place for these rules is in this project; if needed, I can contribute this feature here...

vladar (Member) commented Apr 4, 2016

@mcg-web That would be awesome! I would definitely use such rules in my projects.

mcg-web (Collaborator) commented Apr 4, 2016

Nice, I will contribute it soon.

vladar (Member) commented Apr 15, 2016

Closing this, as we now have validator rules for query depth and complexity, thanks to @mcg-web!

vladar closed this as completed Apr 15, 2016
4 participants