First, add the following configuration to your pom file (set the Scala version and the fully qualified entry-class path to match your own project), then right-click the project and choose Maven -> Reload Project.
<properties>
    <spark.version>3.1.2</spark.version>
    <scala.version>2.12</scala.version>
</properties>

<dependencies>
    <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-core_${scala.version}</artifactId>
        <version>${spark.version}</version>
    </dependency>
</dependencies>

<build>
    <plugins>
        <plugin>
            <groupId>org.scala-tools</groupId>
            <artifactId>maven-scala-plugin</artifactId>
            <executions>
                <execution>
                    <goals>
                        <goal>compile</goal>
                        <goal>testCompile</goal>
                    </goals>
                </execution>
            </executions>
        </plugin>
        <plugin>
            <artifactId>maven-assembly-plugin</artifactId>
            <version>3.1.0</version>
            <configuration>
                <archive>
                    <manifest>
                        <mainClass>testSpark</mainClass>
                    </manifest>
                </archive>
                <appendAssemblyId>false</appendAssemblyId>
                <descriptorRefs>
                    <descriptorRef>jar-with-dependencies</descriptorRef>
                </descriptorRefs>
            </configuration>
            <executions>
                <execution>
                    <id>make-assembly</id>
                    <phase>package</phase>
                    <goals>
                        <goal>single</goal>
                    </goals>
                </execution>
            </executions>
        </plugin>
    </plugins>
</build>

Note that maven-assembly-plugin is a build plugin, not a project dependency: its version belongs in the <plugin> declaration above, and listing it under <dependencies> would wrongly bundle the plugin's classes into the jar-with-dependencies assembly.
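For reference, here is a minimal sketch of what an entry class matching the `<mainClass>testSpark</mainClass>` setting above might look like. The object name `testSpark` and the word-count logic are illustrative assumptions, not part of the original project; replace them with your own class and job.

```scala
import org.apache.spark.{SparkConf, SparkContext}

// Hypothetical entry class; the object name must match <mainClass> in the pom.
// A top-level object like this has no package, so the fully qualified name is just "testSpark".
object testSpark {
  def main(args: Array[String]): Unit = {
    // local[*] runs Spark inside this JVM, convenient for a first smoke test of the jar.
    val conf = new SparkConf().setAppName("testSpark").setMaster("local[*]")
    val sc = new SparkContext(conf)

    // Trivial word count over an in-memory collection, just to verify the build works.
    val counts = sc.parallelize(Seq("a", "b", "a"))
      .map(word => (word, 1))
      .reduceByKey(_ + _)
      .collect()

    counts.foreach(println)
    sc.stop()
  }
}
```

After packaging, a fat jar built this way can be launched with, for example, `spark-submit --class testSpark target/<artifact>-<version>.jar` (the jar name depends on your project's artifactId and version).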
Then open the Maven panel on the right and run the package lifecycle phase to build the jar.

Once package completes successfully, you will find the packaged jar file in the target directory.

If running the jar fails with an error such as "could not find or load main class", the project probably did not compile successfully. Check whether the corresponding entry-class file exists under target/classes. If it does not, rebuild the project first via Build -> Build Project in the top menu so that the classes directory is populated, then run package again.