
[SPARK-18061][SQL][Security] Spark Thriftserver needs to create SPNego principal #15594

Status: Closed · wants to merge 1 commit
@@ -25,6 +25,7 @@ import scala.collection.JavaConverters._

import org.apache.commons.logging.Log
import org.apache.hadoop.hive.conf.HiveConf
import org.apache.hadoop.hive.conf.HiveConf.ConfVars
import org.apache.hadoop.hive.shims.Utils
import org.apache.hadoop.security.UserGroupInformation
import org.apache.hive.service.{AbstractService, Service, ServiceException}
@@ -47,6 +48,7 @@ private[hive] class SparkSQLCLIService(hiveServer: HiveServer2, sqlContext: SQLC
setSuperField(this, "sessionManager", sparkSqlSessionManager)
addService(sparkSqlSessionManager)
var sparkServiceUGI: UserGroupInformation = null
var httpUGI: UserGroupInformation = null

if (UserGroupInformation.isSecurityEnabled) {
try {
@@ -57,7 +59,24 @@ private[hive] class SparkSQLCLIService(hiveServer: HiveServer2, sqlContext: SQLC
case e @ (_: IOException | _: LoginException) =>
throw new ServiceException("Unable to login to kerberos with given principal/keytab", e)
}
}

// Also try creating a UGI object for the SPNego principal
Contributor: Indentation here is wrong. Also, the following blocks are indented with 4 spaces instead of 2.

val principal = hiveConf.getVar(ConfVars.HIVE_SERVER2_SPNEGO_PRINCIPAL)
val keyTabFile = hiveConf.getVar(ConfVars.HIVE_SERVER2_SPNEGO_KEYTAB)
if (principal.isEmpty() || keyTabFile.isEmpty()) {
getAncestorField[Log](this, 3, "LOG").info(
Contributor: This log message seems unnecessary.

Author: Thanks @vanzin. The log message was added to replicate the behavior of the Hive Thriftserver code block. @steveloughran added the kerberos code, and his comments are above.

Contributor: I don't know about the specific log policy here, but I do think using reflection to get at private state is dangerous. I know it happens a lot in this code, but at some point it would be nice to move away from it. You don't need reflection to get the same log as a parent; just use:

LoggerFactory.getLog("org.apache.hive.thrift...whatever").info(s"something $key")

This is cleaner and will not break if Hive ever changes logging frameworks.
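The reviewer's suggestion can be sketched with the JDK's own java.util.logging as a stand-in for the commons-logging facade the file imports (the idea is identical): look the logger up by name instead of reflecting on an ancestor's private LOG field. The logger name below is illustrative, not necessarily the one Spark would use.

```scala
import java.util.logging.Logger

object LogByName {
  // Fetch a logger by name directly, rather than using reflection
  // (getAncestorField) to reach a parent class's private LOG field.
  // The name here is illustrative only.
  val serviceLog: Logger =
    Logger.getLogger("org.apache.hive.service.cli.CLIService")
}
```

The same pattern works with any logging facade that supports lookup by name, which is the reviewer's point: it survives a change of logging framework on Hive's side.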

s"SPNego httpUGI not created, spNegoPrincipal: $principal, keytabFile: $keyTabFile")
} else {
try {
httpUGI = HiveAuthFactory.loginFromSpnegoKeytabAndReturnUGI(hiveConf)
setSuperField(this, "httpUGI", httpUGI)
Member: We have similar code in CLIService.java which does not set the httpUGI field; do we need to make the behavior the same in both files?
getAncestorField[Log](this, 3, "LOG").info("SPNego httpUGI successfully created.")
} catch {
case e : IOException =>
getAncestorField[Log](this, 3, "LOG").warn(s"SPNego httpUGI creation failed: $e")
}
}
}

initCompositeService(hiveConf)
}
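The control flow this patch adds can be reduced to a small, dependency-free sketch. All names below are illustrative, not Spark's: loginFn stands in for HiveAuthFactory.loginFromSpnegoKeytabAndReturnUGI(hiveConf). The two behaviors to note are that SPNego login is skipped (with an info-level message) when either the principal or the keytab is unset, and that a login failure is downgraded to a warning rather than aborting Thriftserver startup.

```scala
import java.io.IOException

object SpnegoLogin {
  // Sketch of the SPNego branch: loginFn is a stand-in for the real
  // Kerberos login call; Left carries the message that would be logged.
  def createHttpUgi[A](principal: String, keytab: String)(loginFn: => A): Either[String, A] =
    if (principal.isEmpty || keytab.isEmpty)
      // Mirrors the "not created" info log: SPNego is simply not configured.
      Left(s"SPNego httpUGI not created, spNegoPrincipal: $principal, keytabFile: $keytab")
    else
      try Right(loginFn)
      catch {
        // Mirrors the patch: a failed SPNego login becomes a warning,
        // it does not stop the service from initializing.
        case e: IOException => Left(s"SPNego httpUGI creation failed: $e")
      }
}
```

This shape also makes the Member's question above concrete: whatever is done with the successful result (Right) is the only part that differs between setting and not setting an httpUGI field.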