Patching failed with “The inventory pointed at location is not valid” and ORA-48146 in OCI DBCS

APPLIES TO:
Oracle Database Cloud Service – Version N/A to N/A [Release 1.0]
Linux x86-64

SYMPTOMS:
The customer attempted to apply the patch from the console, and it failed in two phases.

The patch precheck came back clean, but the Server Patching job failed during the 'apply clusterware patch' task, and the subsequent DB Home Patching job failed during the 'db upgrade' task.

Job details
----------------------------------------------------------------
ID: <job_id>
Description: Server Patching
Status: Failure
Created: March 22, 2019 12:47:26 AM UTC
Message: DCS-10001:Internal error encountered: Failure : failed to apply 28822515 on /u01/app/12.2.0.1/grid.

Task Name Start Time End Time Status
---------------------------------------- ----------------------------------- ----------------------------------- ----------
Server Patching March 22, 2019 12:47:26 AM UTC March 22, 2019 1:06:43 AM UTC Failure
Create Patching Repository Directories March 22, 2019 12:47:26 AM UTC March 22, 2019 12:47:26 AM UTC Success
Create Patching Repository Directories March 22, 2019 12:47:26 AM UTC March 22, 2019 12:47:26 AM UTC Success
Download latest patch metadata March 22, 2019 12:47:26 AM UTC March 22, 2019 12:47:26 AM UTC Success
Download latest patch metadata March 22, 2019 12:47:26 AM UTC March 22, 2019 12:47:27 AM UTC Success
Update Patching Repository March 22, 2019 12:47:27 AM UTC March 22, 2019 12:48:12 AM UTC Success
Update Patching Repository March 22, 2019 12:48:12 AM UTC March 22, 2019 12:48:59 AM UTC Success
task:TaskSequential_6471 March 22, 2019 12:48:59 AM UTC March 22, 2019 1:06:43 AM UTC Failure
Opatch updation March 22, 2019 12:49:01 AM UTC March 22, 2019 12:49:03 AM UTC Success
Opatch updation March 22, 2019 12:49:01 AM UTC March 22, 2019 12:49:04 AM UTC Success
Patch conflict check March 22, 2019 12:49:04 AM UTC March 22, 2019 12:50:11 AM UTC Success
Patch conflict check March 22, 2019 12:50:11 AM UTC March 22, 2019 12:51:24 AM UTC Success
task:TaskSequential_6525 March 22, 2019 12:51:24 AM UTC March 22, 2019 1:06:43 AM UTC Failure
apply clusterware patch March 22, 2019 12:51:24 AM UTC March 22, 2019 1:06:43 AM UTC Failure

Job details
----------------------------------------------------------------
ID: <job_id>
Description: DB Home Patching: Home Id is <home_id>
Status: Failure
Created: March 28, 2019 4:47:29 AM UTC
Message: DCS-10001:Internal error encountered: Failed apply all actions on db home /u01/app/oracle/product/12.1.0.2/dbhome_1.

Task Name Start Time End Time Status
---------------------------------------- ----------------------------------- ----------------------------------- ----------
DB Home Patching March 28, 2019 4:47:29 AM UTC March 28, 2019 5:12:13 AM UTC Failure
DB Home Patching March 28, 2019 4:47:29 AM UTC March 28, 2019 5:12:13 AM UTC Failure
Create Patching Repository Directories March 28, 2019 4:47:29 AM UTC March 28, 2019 4:47:29 AM UTC Success
Create Patching Repository Directories March 28, 2019 4:47:29 AM UTC March 28, 2019 4:47:29 AM UTC Success
Download latest patch metadata March 28, 2019 4:47:29 AM UTC March 28, 2019 4:47:29 AM UTC Success
Download latest patch metadata March 28, 2019 4:47:29 AM UTC March 28, 2019 4:47:29 AM UTC Success
checking GiHome version March 28, 2019 4:47:29 AM UTC March 28, 2019 4:47:29 AM UTC Success
checking GiHome version March 28, 2019 4:47:29 AM UTC March 28, 2019 4:47:29 AM UTC Success
Update System version March 28, 2019 4:47:29 AM UTC March 28, 2019 4:47:29 AM UTC Success
Update System version March 28, 2019 4:47:29 AM UTC March 28, 2019 4:47:30 AM UTC Success
Update Patching Repository March 28, 2019 4:47:30 AM UTC March 28, 2019 4:50:04 AM UTC Success
Update Patching Repository March 28, 2019 4:50:04 AM UTC March 28, 2019 4:52:33 AM UTC Success
task:TaskSequential_15288 March 28, 2019 4:52:34 AM UTC March 28, 2019 5:12:13 AM UTC Failure
Validating dbHome available space March 28, 2019 4:52:34 AM UTC March 28, 2019 4:52:34 AM UTC Success
Validating dbHome available space March 28, 2019 4:52:34 AM UTC March 28, 2019 4:52:34 AM UTC Success
Opatch updation March 28, 2019 4:52:35 AM UTC March 28, 2019 4:52:38 AM UTC Success
Opatch updation March 28, 2019 4:52:35 AM UTC March 28, 2019 4:52:38 AM UTC Success
Patch conflict check March 28, 2019 4:52:38 AM UTC March 28, 2019 4:55:02 AM UTC Success
Patch conflict check March 28, 2019 4:55:02 AM UTC March 28, 2019 4:57:58 AM UTC Success
task:TaskSequential_15347 March 28, 2019 4:57:58 AM UTC March 28, 2019 5:12:13 AM UTC Failure
db upgrade March 28, 2019 4:57:58 AM UTC March 28, 2019 5:05:27 AM UTC Success
db upgrade March 28, 2019 5:05:27 AM UTC March 28, 2019 5:12:13 AM UTC Failure

CHANGES:
The customer appears to have modified the /etc/oraInst.loc file, pointing inventory_loc at the oemagent inventory.

CAUSE:
There were two issues:

1. The customer appears to have modified the /etc/oraInst.loc file, pointing inventory_loc at the oemagent inventory instead of the central inventory.

2. The permissions on the ADR directory on node 1 differed from those on node 2 (this was a two-node RAC).

From dcs-agent.log, we can see that opatchauto failed to apply the patch because of the following error:

SEVERE: OPatchAuto failed.
oracle.dbsysmodel.driver.sdk.productdriver.ProductDriverException:
oracle.sysman.oii.oiii.OiiiInventoryDoesNotExistException: The inventory
pointed at location /u01/oemagent/oraInventory is not valid  <-- incorrect inventory location (it should be /u01/app/oraInventory)
at
............
at com.oracle.glcm.patch.auto.product.ProductSupportManager.getOptions(ProductSupportManager.java:261)
at com.oracle.glcm.patch.auto.OPatchAuto.orchestrate(OPatchAuto.java:287)
at com.oracle.glcm.patch.auto.OPatchAuto.main(OPatchAuto.java:212)
Caused by: oracle.sysman.oii.oiii.OiiiInventoryDoesNotExistException: The inventory pointed at location /u01/oemagent/oraInventory is not valid
at oracle.sysman.oii.oiii.OiiiInstallAreaControl.initAreaControlWithAccessCheck(OiiiInstallAreaControl.java:1380)
...........
oracle.dbsysmodel.driver.sdk.productdriver.ProductDriverException: oracle.sysman.oii.oiii.OiiiInventoryDoesNotExistException: The inventory pointed at location /u01/oemagent/oraInventory is not valid
After reverting this change and re-applying the patch, they hit the next error:

ORA-48146: missing read, write, or exec permission on directory during ADR initialization [/u01/app/oracle/diag/rdbms/<db_unique_name>]
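The misdirected inventory_loc can be confirmed directly from /etc/oraInst.loc before patching is retried. The sketch below works against a scratch copy so it is safe to run anywhere; on the real host, read /etc/oraInst.loc itself. The inst_group line in the sample contents is an assumption, reconstructed from a typical DBCS install.

```shell
# Scratch copy illustrating the misconfigured state; on the real host, read /etc/oraInst.loc.
cat > /tmp/oraInst.loc <<'EOF'
inventory_loc=/u01/oemagent/oraInventory
inst_group=oinstall
EOF

# Extract the inventory location that OUI/OPatch tooling will use.
inv_loc=$(grep '^inventory_loc=' /tmp/oraInst.loc | cut -d= -f2)
echo "inventory_loc is: $inv_loc"

# On DBCS the central inventory is expected at /u01/app/oraInventory.
if [ "$inv_loc" != "/u01/app/oraInventory" ]; then
  echo "WARNING: inventory_loc points at $inv_loc"
fi
```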

SOLUTION:
The following steps resolve both issues:

1. Change inventory_loc in /etc/oraInst.loc back to /u01/app/oraInventory on both nodes, then retry patching.
2. For the permission issue, change the permissions of the ADR directory on node 1 to match node 2, then retry patching.
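Step 1 can be sketched as below. The sed edit is shown against a scratch copy for illustration; on the real hosts, apply it to /etc/oraInst.loc as root, on every cluster node. The sample file contents are an assumption.

```shell
# Scratch copy; on a real node, operate on /etc/oraInst.loc as root.
cat > /tmp/oraInst.loc <<'EOF'
inventory_loc=/u01/oemagent/oraInventory
inst_group=oinstall
EOF

# Point inventory_loc back at the central inventory (repeat on every node).
sed -i 's|^inventory_loc=.*|inventory_loc=/u01/app/oraInventory|' /tmp/oraInst.loc

# Confirm the corrected value.
grep '^inventory_loc=' /tmp/oraInst.loc
```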

# On node1:

# sudo su - oracle
# chmod 755 /u01/app/oracle/diag/rdbms/<db_unique_name>

After that, check the permissions with:

# ls -ld /u01/app/oracle/diag/rdbms/<db_unique_name>

The mode should be "drwxr-xr-x", with owner oracle and group asmadmin.
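The permission change can be rehearsed on a scratch directory first. The directory below is a stand-in for the real ADR path /u01/app/oracle/diag/rdbms/<db_unique_name>; the commented chown reflects the oracle:asmadmin ownership noted above and requires root on the real host.

```shell
# Stand-in directory for the ADR path on node 1.
adr_dir=/tmp/adr_demo
mkdir -p "$adr_dir"
chmod 700 "$adr_dir"    # simulate the overly restrictive node-1 mode

# Fix: match node 2 (drwxr-xr-x).
chmod 755 "$adr_dir"
# chown oracle:asmadmin "$adr_dir"   # ownership fix; requires root on the real host

# Verify the resulting mode.
ls -ld "$adr_dir"
```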
