MySQL Assignment – 5

Basic Questions

  1. Create a table transactions partitioned by RANGE on YEAR(txn_date) for 2020, 2021, and 2022.
  2. Insert 10 sample rows into transactions with different years (2020–2022).
  3. Create a table orders partitioned by HASH on order_id into 4 partitions.
  4. Create a table employees partitioned by LIST of department values (HR, IT, Sales).
  5. Insert data into employees covering all department partitions.
  6. Write SQL to check which partition a specific row in transactions belongs to.
  7. Write SQL to move rows from transactions 2020 partition into an archive table.
  8. Create a replication user rep_user with replication privileges.
  9. Write SQL to check replication status on a replica server.
  10. Write SQL to stop replication on a replica server.
  11. Write SQL to start replication again.
  12. Create an event clear_temp that deletes rows older than 7 days from temp_data.
  13. Disable the event clear_temp.
  14. Enable the event clear_temp.
  15. Create a FULLTEXT index on articles(title, content).
  16. Write SQL to perform a natural language full-text search for “machine learning”.
  17. Write SQL to perform a boolean full-text search for rows containing “AI” but not “ML”.
  18. Insert a JSON record into profiles with fields { "name": "Ravi", "email": "ravi@test.com", "phone": "98765" }.
  19. Update the JSON profiles column to change the phone number.
  20. Extract the name field from the JSON profiles column using JSON_EXTRACT.
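
A sketch of the kind of answers the partitioning, full-text, and JSON questions above expect. Table layouts (column names and types for transactions, articles, and profiles) are assumptions, not given in the assignment; note that in MySQL every column used in the partitioning expression must be part of every unique key, so txn_date is folded into the primary key here.

```sql
-- Q1: RANGE partitioning on YEAR(txn_date); txn_date must appear in the PK.
CREATE TABLE transactions (
    txn_id   INT NOT NULL,
    txn_date DATE NOT NULL,
    amount   DECIMAL(10,2),
    PRIMARY KEY (txn_id, txn_date)
)
PARTITION BY RANGE (YEAR(txn_date)) (
    PARTITION p2020 VALUES LESS THAN (2021),
    PARTITION p2021 VALUES LESS THAN (2022),
    PARTITION p2022 VALUES LESS THAN (2023)
);

-- Q6: EXPLAIN reports which partition(s) a query touches.
EXPLAIN SELECT * FROM transactions WHERE txn_date = '2020-05-01';

-- Q16/Q17: full-text searches (requires a FULLTEXT index on title, content).
SELECT * FROM articles
WHERE MATCH(title, content) AGAINST ('machine learning' IN NATURAL LANGUAGE MODE);

SELECT * FROM articles
WHERE MATCH(title, content) AGAINST ('+AI -ML' IN BOOLEAN MODE);

-- Q18-Q20: JSON insert, update, and extract on an assumed profiles(data JSON).
INSERT INTO profiles (data)
VALUES ('{"name": "Ravi", "email": "ravi@test.com", "phone": "98765"}');

UPDATE profiles
SET data = JSON_SET(data, '$.phone', '91234')
WHERE data->>'$.email' = 'ravi@test.com';

SELECT JSON_EXTRACT(data, '$.name') FROM profiles;  -- ->>'$.name' unquotes
```

The same MATCH(...) AGAINST(...) column list must exactly match the FULLTEXT index definition, or MySQL raises an error.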

Intermediate Questions

  1. Create a table sales partitioned by RANGE on MONTH(sale_date).
  2. Insert 12 rows into sales covering all months of the year.
  3. Write SQL to merge old sales data (before 2021) into an archive table using partition exchange.
  4. Create a table regions partitioned by LIST (North, South, East, West).
  5. Insert 8 rows into regions with different values.
  6. Create an event update_reports that runs every hour to insert a row into reports_log.
  7. Drop the event update_reports.
  8. Create a trigger log_delete_order that inserts into orders_audit before deleting from orders.
  9. Create a trigger log_update_salary that inserts into salary_audit after updating employees.salary.
  10. Create a stored procedure insert_bulk_orders(IN n INT) that inserts n dummy rows into orders.
  11. Create a stored procedure archive_sales(IN cutoff DATE) that moves rows older than cutoff into sales_archive.
  12. Create a function get_order_count(cust_id INT) returning the number of orders for that customer.
  13. Create a function get_total_sales(month_no INT) returning total sales for that month.
  14. Create an index on sales(sale_date, amount) to optimize range queries.
  15. Drop the index on sales(amount).
  16. Create a JSON column order_meta in orders and insert { "status": "delivered", "rating": 4 }.
  17. Update the rating inside order_meta JSON to 5.
  18. Add a generated column order_status extracted from JSON order_meta->'$.status'.
  19. Create a covering index on orders(customer_id, order_date, amount).
  20. Write SQL to query orders by customer_id and order_date optimized by covering index.
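
A sketch of the trigger, stored-procedure, and generated-column patterns the intermediate questions call for. The audit-table columns and the orders layout are assumptions; the DELIMITER lines are mysql-client commands needed so the semicolons inside the bodies are not interpreted as end-of-statement.

```sql
DELIMITER //

-- Q8: audit a row before it is deleted (orders_audit columns assumed).
CREATE TRIGGER log_delete_order
BEFORE DELETE ON orders
FOR EACH ROW
BEGIN
    INSERT INTO orders_audit (order_id, customer_id, deleted_at)
    VALUES (OLD.order_id, OLD.customer_id, NOW());
END //

-- Q10: insert n dummy rows with a counter loop.
CREATE PROCEDURE insert_bulk_orders(IN n INT)
BEGIN
    DECLARE i INT DEFAULT 0;
    WHILE i < n DO
        INSERT INTO orders (customer_id, order_date, amount)
        VALUES (1, CURDATE(), 0.00);
        SET i = i + 1;
    END WHILE;
END //

DELIMITER ;

-- Q18: generated column from the JSON order_meta; ->> unquotes the value.
ALTER TABLE orders
    ADD COLUMN order_status VARCHAR(20)
        AS (order_meta->>'$.status') STORED;
```

A STORED generated column is materialized on write and can carry a regular index, which is the usual way to make JSON attributes searchable.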

Advanced Questions

  1. Create a table bank_txns partitioned by RANGE on YEAR(txn_date) with subpartition by HASH on account_id.
  2. Insert 20 rows into bank_txns across multiple years and accounts.
  3. Write SQL to migrate all rows from the 2020 partition of bank_txns into txn_archive.
  4. Create a stored procedure transfer_money(IN from_id INT, IN to_id INT, IN amt DECIMAL(10,2)) with transaction and rollback on error.
  5. Create a function get_nth_highest_txn(n INT) returning the nth highest transaction amount (stored functions do not take IN/OUT modifiers).
  6. Create a BEFORE INSERT trigger validate_txn that prevents inserting negative amounts into bank_txns.
  7. Create an AFTER UPDATE trigger audit_txn that logs old and new values into txn_audit.
  8. Create an event daily_archive that moves completed orders older than 30 days into orders_history.
  9. Write SQL to optimize a full-text search on articles for multiple keywords with relevance ranking.
  10. Create a product catalog schema using a products table where dynamic attributes (color, size, weight) are stored in a JSON column, and index the color attribute (e.g., via a generated column or functional index, since JSON columns cannot be indexed directly).
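
A sketch of the error-handling and JSON-indexing patterns the advanced questions target. The accounts and products schemas are assumptions; the EXIT HANDLER pattern shown is the standard MySQL way to roll back on any SQL error inside a procedure.

```sql
DELIMITER //

-- Q4: transfer with rollback on error (accounts(account_id, balance) assumed).
CREATE PROCEDURE transfer_money(IN from_id INT, IN to_id INT,
                                IN amt DECIMAL(10,2))
BEGIN
    DECLARE EXIT HANDLER FOR SQLEXCEPTION
    BEGIN
        ROLLBACK;   -- undo both updates on any failure
        RESIGNAL;   -- re-raise the original error to the caller
    END;

    START TRANSACTION;
    UPDATE accounts SET balance = balance - amt WHERE account_id = from_id;
    UPDATE accounts SET balance = balance + amt WHERE account_id = to_id;
    COMMIT;
END //

-- Q6: reject negative amounts by raising a user-defined error.
CREATE TRIGGER validate_txn
BEFORE INSERT ON bank_txns
FOR EACH ROW
BEGIN
    IF NEW.amount < 0 THEN
        SIGNAL SQLSTATE '45000'
            SET MESSAGE_TEXT = 'Negative transaction amount not allowed';
    END IF;
END //

DELIMITER ;

-- Q10: make the JSON color attribute indexable via a generated column
-- (attributes JSON column on products is an assumed name).
ALTER TABLE products
    ADD COLUMN color VARCHAR(30)
        AS (attributes->>'$.color') STORED,
    ADD INDEX idx_color (color);
```

SIGNAL SQLSTATE '45000' is the conventional "unhandled user-defined exception" state, which aborts the INSERT that fired the trigger.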