Release V 5.3.0
Hot-fix: CSP (26-06-2024)
ETLJobs
- Build Jenkins Job: Build/DataPipeline/ETLJobs
- Deploy Jenkins Job: Deploy/DataPipeline/ETLJobs
- Comments: script_to_run: DRUID_CONTENT_INDEXER; invoke_type: deploy

ETLDruidContentIndexer
- Build Jenkins Job: NA
- Deploy Jenkins Job: Deploy/DataPipeline/ETLDruidContentIndexer
- Comments: script_to_run: DRUID_CONTENT_INDEXER; invoke_type: execute-script
Data Products
- Build Jenkins Job: Build/Lern/LernDataProducts
- Deploy Jenkins Job: Deploy/Lern/LernDataProducts
- Comments: CSP related changes. cloud_store_group_id: org.sunbird; cloud_store_artifact_id: cloud-store-sdk_2.12; cloud_store_version: 1.4.6
- Note: While deploying, set the module value to lern-dataproducts,cronjobs.

Ed data product in Dock env
- Build Jenkins Job: Build/Lern/LernDataProducts
- Deploy Jenkins Job: Deploy/Dock/DataPipeline/EdDataProducts
- Comments: CSP related changes. cloud_store_group_id: org.sunbird; cloud_store_artifact_id: cloud-store-sdk_2.12; cloud_store_version: 1.4.6
- Note: While deploying, set the module value to dock-dataproducts.
To run Ed related reports: Live ETB QR Code-Content Linkage Status
- Build Jenkins Job: NA
- Deploy Jenkins Job: Deploy/Lern/LernAnalyticsReplayJobs
- Comments: Add etb-metrics to the job_id list. job_type: run-job; job_id: etb-metrics
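Replay jobs like this one are normally triggered from the Jenkins UI with the listed parameters. For scripted runs, Jenkins' generic buildWithParameters endpoint can be used; a minimal sketch, where the Jenkins URL and credentials are placeholders:

```shell
# Hypothetical sketch: trigger Deploy/Lern/LernAnalyticsReplayJobs remotely with
# the parameters listed above. JENKINS_URL, USER and API_TOKEN are placeholders.
JENKINS_URL="https://jenkins.example.com"
JOB_PATH="job/Deploy/job/Lern/job/LernAnalyticsReplayJobs"
TRIGGER_URL="$JENKINS_URL/$JOB_PATH/buildWithParameters"
# curl -u "$USER:$API_TOKEN" -X POST "$TRIGGER_URL" \
#      --data job_type=run-job --data job_id=etb-metrics
echo "$TRIGGER_URL"
```

The commented curl shows the actual trigger call; running it requires a user with build permission on the job.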
To run Ed related reports: Course Adoption Report v2
- Build Jenkins Job: NA
- Deploy Jenkins Job: Deploy/DataPipeline/Runreport
- Comments: report_id:
  - course_adoption_by_batch
  - course_adoption_table_new
  - course_adoption_report_plays_and_time_spent
To run coKreat related report: Visitor's report
- Build Jenkins Job: NA
- Deploy Jenkins Job: Deploy/DataPipeline/Runreport
- Comments: report_id: vidyadaan_visitor
To run coKreat related report: Collection Level Content Gaps
- Build Jenkins Job: NA
- Deploy Jenkins Job: Deploy/Dock/DataPipeline/AnalyticsReplayJobs
- Comments: job_type: run-job; job_id: sourcing-metrics
To run coKreat related report: Folder Level (first level) Content Gaps
- Build Jenkins Job: NA
- Deploy Jenkins Job: Deploy/Dock/DataPipeline/AnalyticsReplayJobs
- Comments: job_type: run-job; job_id: sourcing-metrics
To run coKreat related report: Project level funnel report
- Build Jenkins Job: NA
- Deploy Jenkins Job: Deploy/Dock/DataPipeline/AnalyticsReplayJobs
- Comments: job_type: dock-run-job; job_id: funnel-report
To run coKreat related report: Content Details Report
- Build Jenkins Job: NA
- Deploy Jenkins Job: Deploy/Dock/DataPipeline/AnalyticsReplayJobs
- Comments: job_type: dock-run-job; job_id: content-details
Hot-fix: CSP (24-08-2023)
Batch Service
- Build Jenkins Job: Build/Core/Lms
- Deploy Jenkins Job: Deploy/Kubernetes/Lms
- Comments: CSP related changes. cloud_store_group_id: org.sunbird; cloud_store_artifact_id: cloud-store-sdk; cloud_store_version: 1.4.7

User&Org Service
- Build Jenkins Job: Build/Core/Learner
- Deploy Jenkins Job: Deploy/Kubernetes/Learner
- Comments: CSP related changes. cloud_store_group_id: org.sunbird; cloud_store_artifact_id: cloud-store-sdk; cloud_store_version: 1.4.7

Data Pipeline
- Build Jenkins Job: Build/Lern/FlinkJobs
- Deploy Jenkins Job: Deploy/Lern/FlinkJobs
- Comments: CSP related changes. cloud_store_group_id: org.sunbird; cloud_store_artifact_id: cloud-store-sdk_2.12; cloud_store_version: 1.4.6

Data Products
- Build Jenkins Job: Build/Lern/LernDataProducts
- Deploy Jenkins Job: Deploy/Lern/LernDataProducts
- Comments: CSP related changes. cloud_store_group_id: org.sunbird; cloud_store_artifact_id: cloud-store-sdk_2.12; cloud_store_version: 1.4.6
Jenkins configurations for CSP support:
Configure the variables cloud_store_group_id, cloud_store_artifact_id and cloud_store_version with proper values in Jenkins. They can be configured globally or on the individual build job of each service. For the lms, user-org, flink-jobs and lern-dataproducts build jobs, configure them as mentioned below.
Configure the following values:

| Name | Default Value | Description |
| --- | --- | --- |
| cloud_store_group_id | ${cloud_store_group_id} | Set the cloud store SDK group id, e.g. org.sunbird |
| cloud_store_artifact_id | ${cloud_store_artifact_id} | Set the cloud store SDK artifact id, e.g. cloud-store-sdk |
| cloud_store_version | ${cloud_store_version} | Set the cloud store SDK version, e.g. 1.4.6 |
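A sketch of how a build step might consume these Jenkins parameters; the property names are taken from the table above, but the Maven wiring is an assumption and the actual build jobs may differ:

```shell
# Hypothetical build step: fall back to defaults when the Jenkins parameters
# are unset, then pass the values to Maven as properties (wiring is assumed).
cloud_store_group_id="${cloud_store_group_id:-org.sunbird}"
cloud_store_artifact_id="${cloud_store_artifact_id:-cloud-store-sdk}"
cloud_store_version="${cloud_store_version:-1.4.6}"
BUILD_CMD="mvn clean install -Dcloud_store_group_id=$cloud_store_group_id -Dcloud_store_artifact_id=$cloud_store_artifact_id -Dcloud_store_version=$cloud_store_version"
echo "$BUILD_CMD"
```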
Config changes in Lern common.yaml for data-products
Hot-fix: 5.3.1 (05-07-2023)
Batch Service
- Build Jenkins Job: Build/Core/Lms
- Deploy Jenkins Job: Deploy/Kubernetes/Lms
Document Release Version

| Project | Release Date | Release Version |
| --- | --- | --- |
| Lern | 27-May-2023 | V 5.3.0 |
| Lern | 23-Jun-2023 | V 5.3.1 |
Hot-fix: ML PII Data Product (23-06-2023)
Details of Released Tag
Kafka Setup
- Deploy Jenkins Job: Deploy/Lern/KafkaSetup
- Comments: Verify whether the Kafka topic programuser.info has been created.
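One way to check is to list the broker's topics and filter for the expected name. In the sketch below the kafka-topics.sh path and broker address are placeholders, shown as a comment; the runnable part only filters a sample topic listing:

```shell
# Real check (placeholder path and broker):
#   /opt/kafka/bin/kafka-topics.sh --bootstrap-server localhost:9092 --list | grep programuser.info
# Runnable illustration: filter a sample topic listing for the expected topic name.
topics="dev.telemetry.ingest
programuser.info
dev.issue.certificate.request"
if printf '%s\n' "$topics" | grep -qx "programuser.info"; then
  echo "topic present"
else
  echo "topic missing"
fi
```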
Data Pipeline
- Build Jenkins Job: Build/Lern/FlinkJobs
- Deploy Jenkins Job: Deploy/Lern/FlinkJobs
- Comments: Add program-user-info to the job list and deploy it.

Data Products
- Build Jenkins Job: Build/Lern/LernDataProducts
- Deploy Jenkins Job: Deploy/Lern/LernDataProducts
- Comments: Add program-user-exhaust to the job list of Deploy/Lern/LernAnalyticsReplayJobs to run it.

Cassandra Migration
- Build Jenkins Job: Build/Core/Cassandra
- Deploy Jenkins Job: Deploy/Kubernetes/Cassandra
- Comments: Add the sunbird_programs keyspace in the Deploy Jenkins job.

Analytics
- Deploy Jenkins Job: Deploy/Kubernetes/Analytics
- Comments: Deploy with the release-6.0.0 branch.
Summary of the Changes
Details of the Changes:
- LR-491 User detail (PII) report for ML programs - Data Product
- LR-285 User detail (PII) report for ML programs - Flink Job
Default values for config
default config for services
Please define the below variables
Cassandra Keyspace and Table for Program:-
Flink Job Configurations for Lern:
program-user-info
Data Security Policy setup
Configurations to be done by System admin:
Setup default 'Data Security Policy' settings using tenant preference API.
Details of Released Tag
Kafka Setup
- Deploy Jenkins Job: Deploy/Lern/KafkaSetup

Data Pipeline
- Build Jenkins Job: Build/Lern/FlinkJobs
- Deploy Jenkins Job: Deploy/Lern/FlinkJobs
- Comments: Add legacy-certificate-migrator to the job list and deploy it.

Data Products
- Build Jenkins Job: Build/Lern/LernDataProducts
- Deploy Jenkins Job: Deploy/Lern/LernDataProducts

Batch Service
- Build Jenkins Job: Build/Core/Lms
- Deploy Jenkins Job: Deploy/Kubernetes/Lms

User&Org Service
- Build Jenkins Job: Build/Core/Learner
- Deploy Jenkins Job: Deploy/Kubernetes/Learner

Analytics
- Deploy Jenkins Job: Deploy/Kubernetes/Analytics
- Comments: Deploy with the release-6.0.0 branch.
Summary of the Changes
Details of the Changes:
- LR-436 OldCertificateMigration spark data-product
- LR-437 LegacyCertificateMigrator Flink job
- LR-438 Sunbird RC changes for updating schema for issued date
- LR-330 Certificate template font url migration
- LR-395, LR-465 PII data security
- LR-451 Local setup of Data-pipeline - Ubuntu & Mac - Github and Microsite update
- LR-443 Local setup of UserOrg - Ubuntu & Mac - Github and Microsite update
- LR-445 Local setup of LMS - Ubuntu & Mac - Github and Microsite update
- LR-422 Point the channel create API to content-service instead of learning-service
- LR-519 Textbook APIs code cleanup from Course-Batch service
- LR-486 Microsite update with Certificate generation flow diagram
- LR-520 Group service - activity type should be case insensitive
- LR-556 Local setup of LMS - Ubuntu & Mac - Mock service setup
- LR-456 Local setup of Sunbird-utils - Ubuntu & Mac - Github and Microsite update
New APIs to onboard
Env Configurations (Needs to be done before service deployment):
The below environment variables need to be configured in the 'sunbird_lms-service.env' file in the DevOps repo. Ref: https://github.com/project-sunbird/sunbird-devops/blob/release-5.3.0-lern/ansible/roles/stack-sunbird/templates/sunbird_lms-service.env
| Variable | Value | Description |
| --- | --- | --- |
| exhaust_api_base_url | | Obsrv exhaust API endpoint for batch service |
| exhaust_api_submit_endpoint | /request/submit | To submit job requests from batch service |
| exhaust_api_list_endpoint | /request/list/ | To list job requests from batch service |
| sunbird_api_auth_token | "{{ core_vault_sunbird_api_auth_token }}" | Authentication token for APIs |
| content_read_url | /content/v3/read/ | |
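The batch service combines the base URL with the endpoint paths when calling the exhaust proxy; a sketch of the assumed composition, with a placeholder host for exhaust_api_base_url:

```shell
# Illustrative composition of the exhaust API URLs from the variables above.
# The host in exhaust_api_base_url is a placeholder, not a real endpoint.
exhaust_api_base_url="https://analytics.example.com/api"
exhaust_api_submit_endpoint="/request/submit"
exhaust_api_list_endpoint="/request/list/"
submit_url="${exhaust_api_base_url}${exhaust_api_submit_endpoint}"
list_url="${exhaust_api_base_url}${exhaust_api_list_endpoint}"
echo "$submit_url"
```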
Exhaust Proxy API documentation
Data Security Policy setup
Configurations to be done by System admin:
Execute the cURL for providing the link to download the "Decryption Tool". Tool reference: https://github.com/Sunbird-Lern/sunbird-utils/blob/release-5.3.0/decryption-tool/decryption-tool.zip
Setup default 'Data Security Policy' settings using tenant preference API.
Setup default 'PII data security settings' using tenant preference API.
Configurations that can be done by Tenants:
Use the Tenant preference create API to create tenant-specific 'Data Security Policy' settings, similar to the 'default' Data Security Policy settings but with the tenant orgId.
In order to use the "PUBLIC_KEY_ENCRYPTED_DATASET" security configuration for an exhaust report, the tenant admin should have uploaded a public PEM key file using the below API.
Steps to generate key pair for setting up Data Security policy configuration:
For Linux and Mac OS:
To generate Private Key
To generate Public Key
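The usual OpenSSL commands for producing such a key pair are shown below; the 2048-bit key size and the file names are illustrative. The same commands also work in Git Bash on Windows, which ships with OpenSSL.

```shell
# Generate a private key (2048-bit RSA; file names are illustrative).
openssl genrsa -out private.pem 2048
# Derive the corresponding public key in PEM format.
openssl rsa -in private.pem -pubout -out public.pem
```

The public.pem file is what gets uploaded for the "PUBLIC_KEY_ENCRYPTED_DATASET" configuration; keep private.pem safe, as it is needed to decrypt the reports.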
For Windows OS:
Please install Git Bash: the Git installation package comes with SSH. Using Git Bash, Git's command line tool, you can generate SSH key pairs. Git Bash includes an SSH client that lets you connect to and interact with remote servers from Windows.
To install Git:
Download and initiate the Git installer.
When prompted, accept the default components by clicking Next.
Choose the default text editor. If you have Notepad++ installed, select Notepad++ and click Next.
Select to Use Git from the Windows Command Prompt and click Next.
Select to Use OpenSSL library and click Next.
Select to Checkout Windows-style, commit Unix-style line endings and click Next.
Select to Use MinTTY (the default terminal of MSYS2) and click Next.
Accept the default extra option configuration by clicking Install. When the installation completes, you may need to restart Windows.
Launching Git Bash:
Press Windows+R to open the Run dialog.
Type C:\Program Files\Git\bin\bash.exe and press Enter.
Generating Key pair:
To generate Private Key
To generate Public Key
Flink Job Configurations for Lern:
legacy-certificate-migrator
Prerequired deployments for RC migration
Step to migrate old certificates to RC
Sunbird Lern BB is using Sunbird RC for generating & issuing e-credentials in its use cases (e.g.: course completion certificate) for all the latest completed courses (post March-2022). All the old certificates were custom generated and stored in Cassandra and cloud storage.
Once we migrate these certificates then we no longer need to store certificates in Cassandra and all the certificates will be using Sunbird RC going forward.
Reference Link: https://project-sunbird.atlassian.net/wiki/spaces/UM/pages/3117416449/LR-4+Design+of+migrating+existing+certificate+in+to+RC
Note: After migrating old certificates to RC, verification of old certificates will become invalid. To support old certificate verification, the Sunbird ED building block is implementing it in the portal service in release 6.0 (kindly find the ticket in this link). It is therefore recommended to migrate the certificates only after the old certificate verification support is available.
Step 1
Create a Kafka topic solely for the purpose of this migration process.
Topic name: {{env}}.legacy.certificate.migrate
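Here {{env}} is the environment name prefix. A sketch of resolving and creating the topic; the env value, kafka-topics.sh path, broker address, and partition/replication settings are all placeholders, with the actual creation shown as a comment:

```shell
# Resolve the {{env}}-prefixed topic name (env value is a placeholder).
env="dev"
topic="${env}.legacy.certificate.migrate"
# Actual creation on the Kafka host (placeholder path, broker and settings):
#   /opt/kafka/bin/kafka-topics.sh --create --topic "$topic" \
#     --bootstrap-server localhost:9092 --partitions 1 --replication-factor 1
echo "$topic"
```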
Step 2
In the spark machine, update the old-certificate-migration-job model config in /mount/data/analytics/scripts/lern-model-config.sh with correct values.
Sample model config:
Note: The migration job can be run for a single batch with "batchId": "01320961460024934435", for multiple batches with "batchId": "01320961460024934435,01220961460024934536", or for all batches with "batchId": "all".
Step 3
Run the job with the below command in the spark machine.
Note: Logs can be found in the below locations:
Joblog: /mount/data/analytics/scripts/logs/joblog.log
Execution log: /mount/data/analytics/logs/lern-data-products/{current_date}-job-execution.log
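In the execution log path, {current_date} is the date of the run; a sketch for locating and tailing today's log, assuming a YYYY-MM-DD date format:

```shell
# Build the execution-log path for today's run
# ({current_date} format is assumed to be YYYY-MM-DD).
log_file="/mount/data/analytics/logs/lern-data-products/$(date +%Y-%m-%d)-job-execution.log"
# tail -f "$log_file"    # follow the log on the spark machine
echo "$log_file"
```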
Note:
Verification steps can be found in the design page: https://project-sunbird.atlassian.net/wiki/spaces/UM/pages/3117416449/LR-4+Design+of+migrating+existing+certificate+in+to+RC#Verification-steps-for-the-certificate-migration-process
Steps for font URL migration
As per our observation, the templates in all environments have dev URLs configured for fonts. All these font URLs have to be migrated to the new CNAME URL.
Note: Before font URL migration, make sure all the font files are available in the CNAME-mapped account or cloud storage container. To verify where the font files are available, open any SVG template file in an editor and check the font URL's host.
Please use Java 11 for running the scripts.
Step 1:
Download the SVG file migrator and uploader jars with the below command.
Step 2:
Download the SVG template files and update the font URLs in the template files.
Note: Before moving to the next step, please verify at least one SVG file to confirm that the font URL got updated.
Step 3:
Upload the SVG template files back to cloud storage with the below command.
Configuration for making the content read URL dynamic: https://project-sunbird.atlassian.net/browse/LR-579. Please define the below URL in the sunbird_lms-service.env file; this will make the content read endpoint URL configurable.