In this article, you will learn how to alias column names in PySpark using `alias()`, and how to use an alias in SQL queries after creating a temporary table from a DataFrame. `alias()` returns the column aliased with a new name (or names). This method is the DataFrame equivalent of the SQL `as` keyword used to provide a different column name in the SQL result.

First, create a SparkSession and a sample DataFrame. The original article's sample values were lost in extraction, so the data below is illustrative:

```python
from pyspark.sql import SparkSession
from pyspark.sql.functions import col

spark = SparkSession.builder.master("local").appName("alias-examples").getOrCreate()

# Illustrative sample data (the original values were not recoverable)
columns = ["lang", "fee"]
data = [("Java", 20000), ("Python", 25000), ("Scala", 30000)]
df = spark.createDataFrame(data).toDF(*columns)

# Example 2 - using col().alias() - col() returns a Column type
df.select("fee", col("lang").alias("language")).show()
```

To run SQL against the DataFrame, register it as a temporary view named `courses`:

```python
df.createOrReplaceTempView("courses")

# Example 3 - Query using spark.sql() and use 'as' for alias
spark.sql("select fee, lang as language from courses").show()
```

Note that the scope of the `courses` table is tied to the PySpark session; once the session is closed, you can no longer access this table.

Now let's alias the name of the table in SQL and the column name at the same time:

```python
# Example 4 - Query using spark.sql() and use 'as' for alias
df4 = spark.sql("select subject.fee, subject.lang as language from courses as subject")
df4.show()
```

In this article, you have learned how to alias column names using `alias()`, the SQL equivalent of the `as` keyword used to provide a different column name in the SQL result. You also learned how to use an alias in SQL queries after creating a table from a DataFrame. Column aliases are especially useful when you are working with joins.