Spark log message: Using Spark's default log4j profile

Running Spark from IDEA prints: Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties

  • Message:
Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
22/05/24 09:41:52 INFO SparkContext: Running Spark version 3.0.0
22/05/24 09:41:52 INFO ResourceUtils: ==============================================================
22/05/24 09:41:52 INFO ResourceUtils: Resources for spark.driver:
22/05/24 09:41:52 INFO ResourceUtils: ==============================================================
22/05/24 09:41:52 INFO SparkContext: Submitted application: DataDispose$
22/05/24 09:41:52 INFO SecurityManager: Changing view acls to: yikuai
22/05/24 09:41:52 INFO SecurityManager: Changing modify acls to: yikuai
22/05/24 09:41:52 INFO SecurityManager: Changing view acls groups to: 
22/05/24 09:41:52 INFO SecurityManager: Changing modify acls groups to: 
22/05/24 09:41:52 INFO SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users  with view permissions: Set(yikuai); groups with view permissions: Set(); users  with modify permissions: Set(yikuai); groups with modify permissions: Set()
22/05/24 09:41:54 INFO Utils: Successfully started service 'sparkDriver' on port 50862.
22/05/24 09:41:54 INFO SparkEnv: Registering MapOutputTracker
22/05/24 09:41:54 INFO SparkEnv: Registering BlockManagerMaster
22/05/24 09:41:54 INFO BlockManagerMasterEndpoint: Using org.apache.spark.storage.DefaultTopologyMapper for getting topology information
22/05/24 09:41:54 INFO BlockManagerMasterEndpoint: BlockManagerMasterEndpoint up
22/05/24 09:41:54 INFO SparkEnv: Registering BlockManagerMasterHeartbeat
22/05/24 09:41:54 INFO DiskBlockManager: Created local directory at C:\Users\yikuai\AppData\Local\Temp\blockmgr-90db67ff-bb82-4670-bcc8-db1e2d3e3d42
22/05/24 09:41:54 INFO MemoryStore: MemoryStore started with capacity 1983.3 MiB
22/05/24 09:41:54 INFO SparkEnv: Registering OutputCommitCoordinator
22/05/24 09:41:54 INFO Utils: Successfully started service 'SparkUI' on port 4040.
22/05/24 09:41:54 INFO SparkUI: Bound SparkUI to 0.0.0.0, and started at http://DESKTOP-T2AEBTN:4040
22/05/24 09:41:54 INFO Executor: Starting executor ID driver on host DESKTOP-T2AEBTN
  • Analysis:

This is not actually an error: no log4j configuration was found on the classpath, so Spark falls back to its built-in defaults, which log at INFO level and flood the console. The fix is to change the log level that Spark displays.

  • Solution:

Create a log4j.properties file under the project's resources folder and copy the following content into it:

log4j.rootCategory=ERROR, console
log4j.appender.console=org.apache.log4j.ConsoleAppender
log4j.appender.console.target=System.err
log4j.appender.console.layout=org.apache.log4j.PatternLayout
log4j.appender.console.layout.ConversionPattern=%d{yy/MM/dd HH:mm:ss} %p %c{1}: %m%n

# Set the default spark-shell log level to ERROR. When running the spark-shell, the
# log level for this class is used to overwrite the root logger's log level, so that
# the user can have different defaults for the shell and regular Spark apps.
log4j.logger.org.apache.spark.repl.Main=ERROR

# Settings to quiet third party logs that are too verbose
log4j.logger.org.spark_project.jetty=ERROR
log4j.logger.org.spark_project.jetty.util.component.AbstractLifeCycle=ERROR
log4j.logger.org.apache.spark.repl.SparkIMain$exprTyper=ERROR
log4j.logger.org.apache.spark.repl.SparkILoop$SparkILoopInterpreter=ERROR
log4j.logger.org.apache.parquet=ERROR
log4j.logger.parquet=ERROR

# SPARK-9183: Settings to avoid annoying messages when looking up nonexistent UDFs in SparkSQL with Hive support
log4j.logger.org.apache.hadoop.hive.metastore.RetryingHMSHandler=FATAL
log4j.logger.org.apache.hadoop.hive.ql.exec.FunctionRegistry=ERROR
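Alternatively, the log level can be lowered programmatically with `SparkContext.setLogLevel`. A minimal sketch (the application name `DataDispose` is taken from the log above; the job body is a placeholder); note that this call only takes effect after the context exists, so the startup INFO lines still appear, which is why the log4j.properties approach is more thorough:

```scala
import org.apache.spark.{SparkConf, SparkContext}

object DataDispose {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf().setMaster("local[*]").setAppName("DataDispose")
    val sc = new SparkContext(conf)

    // Suppress INFO/WARN output from this point on. Valid levels include
    // ALL, DEBUG, ERROR, FATAL, INFO, OFF, TRACE, WARN.
    sc.setLogLevel("ERROR")

    // ... job logic goes here ...
    sc.stop()
  }
}
```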

Reference: https://blog.csdn.net/OWBY_Phantomhive/article/details/123086181

