The commonly used SQL is attached to this article and is also included below for easier copying.
TOAD Daily Workspace Setup:
- Open TOAD and paste in the SQL queries copied from the section below.
- Click the "Save Current Workspace" button.
- With the connection and SQL queries saved to the workspace, you should now see your newly created workspace when you open the workspace dropdown.
- This next step is optional, but it saves a few clicks every day: you can have your workspace automatically save and reopen when you close and open TOAD. To set this up, go to Options -> General and check "Automatically save and restore workspaces".
Commonly used SQL queries:
-- NOTE: Values written as '&NAME' (for example '&ESKER_ID', '&SO_NUM', '&COM_ITEM') are
-- substitution variables; TOAD will prompt for the value when the statement is executed.

-- Orphaned C_SDNSOX record detected.
SELECT * FROM C_SDNSOX

-- Check and correct COMMIT table
SELECT * FROM C_SCOMMT

-- Missing SO number in Esker converted order
select * from c_eskerf where c_eskerf_id = '&ESKER_ID'
--delete from c_eskerf where c_eskerf_id = '&ESKER_ID'

select * from c_eskrer where c_eskrer_id = '&ESKER_ID'
--delete from c_eskrer where c_eskrer_id = '&ESKER_ID'

-- Erroneous SODELADD record
select * from sodeladd where so = lpad('&SO_NUM',20) and planning_status = ' '

update sodeladd set planning_status = '4'
 where so = lpad('&SO_NUM',20) and so_line = '0001' and planning_status = ' '

-- Orders shipped in Boxworks but not shipped in G2
select * from c_upcnf where transfer_status = 'N'

update c_upcnf set transfer_status = 'C' where e_upshcf_order_num = ''

---------------------------------------------------------------------------
-- G2PROD - ERROR - Upload CPQ records to G2PROD
---------------------------------------------------------------------------
-- Step 1: Using SSMS
---------------------
use salesforce_backups

select distinct Name from sf_update_cpq where LogID in ('###') order by 1

DECLARE @G2LogID nvarchar(50)
set @G2LogID = convert(nvarchar(255),NEWID())

select @G2LogID as G2LogId

insert into [G2PROD]..[GLOVIA_PROD].[MIS_SF_CPQ] (g2logid, sfcpqno, proc_date, proc_time)
select distinct @G2LogID, Name, format(getdate(),'d'), format(getdate(),'HHmmss')
  from sf_update_cpq
 where LogID in ('###')
 order by 1
go
-- End of Step 1.

-- Step 2: Using TOAD
---------------------
select * from mis_sf_cpq where g2logid = '# from Step 1' ;

delete from c_sfqhdr a
 where a.c_sfqhdr_cpqno in (select b.sfcpqno from mis_sf_cpq b where b.g2logid = '# from Step 1');

delete from c_sfqdtl a
 where a.c_sfqdtl_cpqno in (select b.sfcpqno from mis_sf_cpq b where b.g2logid = '# from Step 1')
-- End of Step 2.
---------------------------------------------------------------------------

-- Verify items with negative commitment values
-----------------------------------------------
-- Check ITEM_DET information
select * from item_det
 where ccn = 'HG0101' and mas_loc = '11' and item = '&COM_ITEM' and revision = ' '

-- Check ITEM_ML and ITEM_LOC tables
select * from item_loc
 where ccn = 'HG0101' and mas_loc = '11' and item = '&COM_ITEM' and revision = ' '

select * from item_ml
 where ccn = 'HG0101' and mas_loc = '11' and item = '&COM_ITEM' and revision = ' '

-- Search the q_pikpak table for the true committed quantity.
select * from q_pikpak b
 where b.ccn = 'HG0101' and b.item = '&COM_ITEM' and b.revision = ' '
   and b.mas_loc = '11' and b.cur_location = 'BOXWORKS' and b.shipped_date is null

-- Search the xfr_tkt table for the true committed quantity.
select * from xfr_tkt c
 where c.ccn = 'HG0101' and c.item = '&COM_ITEM'
   and c.mas_loc = '11' and c.cur_location = 'BOXWORKS'

-- Update commit qty using actual commitments from SO and TO.
-- Be CAREFUL when updating ITEM_ML - this is by MAS_LOC.
-- (The values 8 and -18 below are examples; substitute the actual correct and current quantities.)
update item_det set com_qty = 8
 where ccn = 'HG0101' and mas_loc = '11' and item = '&COM_ITEM' and revision = ' '
   and location = 'BOXWORKS' and com_qty = -18

update item_loc set com_qty = 8
 where ccn = 'HG0101' and mas_loc = '11' and item = '&COM_ITEM' and revision = ' '
   and location = 'BOXWORKS' and com_qty = -18

-- Verify correct commit qty by MAS_LOC
update item_ml set sum_com_qty = 8
 where ccn = 'HG0101' and mas_loc = '11' and item = '&COM_ITEM' and revision = ' '
   and sum_com_qty = -18
---------------------------------------------------------------------------
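For the negative-commitment check above, the corrected commit quantity should match the open SO and TO commitments found in q_pikpak and xfr_tkt. The query below is a minimal sketch of one way to total those before running the UPDATE statements; the quantity column name qty is an assumption and must be replaced with the actual quantity columns of q_pikpak and xfr_tkt in your environment.

-- Minimal sketch only: 'qty' stands in for the real quantity columns in q_pikpak and xfr_tkt.
select (select nvl(sum(b.qty), 0)
          from q_pikpak b
         where b.ccn = 'HG0101' and b.item = '&COM_ITEM' and b.revision = ' '
           and b.mas_loc = '11' and b.cur_location = 'BOXWORKS'
           and b.shipped_date is null)
     + (select nvl(sum(c.qty), 0)
          from xfr_tkt c
         where c.ccn = 'HG0101' and c.item = '&COM_ITEM'
           and c.mas_loc = '11' and c.cur_location = 'BOXWORKS') as true_com_qty
  from dual

Compare the result against com_qty in ITEM_DET/ITEM_LOC and sum_com_qty in ITEM_ML before applying the updates.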