
Ticket: Drafts not loading for existing entries

Status: Resolved
Add-on / Version: Publisher 2.5.2
EE Version: 3.5.10

Derek Hogue

Jun 29, 2017

I just installed Publisher on a pretty large site (~10,000 rows in channel_titles). Saving new entries as drafts works as expected - I can save a draft and it shows up as a draft, or save a published entry, then save a draft, and subsequently toggle between the two.

But if I save a draft of any previous entry (from before Publisher was installed), the behaviour gets weird.

1) The draft appears to save, and the preview of the draft is correct, but when I close the preview to go back to the publish screen, there is no option to toggle between Draft and Published (nor does the DRAFT banner background display).

2) In the entries list, the entry is highlighted as though it does have a draft, but the title displayed is the draft title, not the published title.

3) The entry on the front-end (not cached) continues to display the correct “original” (published) title.

4) If I save the draft again, I get another entry in publisher_data for that entry_id.

5) The entry in question does have a row in publisher_titles (and multiple rows in publisher_data).

Any ideas?

#1

BoldMinded (Brian)

Can you take a screenshot of your entries page with the toolbar?

It should only have 1 row in each of those tables for each language and status variant, so the unique key would be entry_id, status, lang_id. If you see more than 1 row with the same values in all 3 of those columns, then something is definitely wrong.
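
You can check with a quick GROUP BY; a rough sketch (assumes the default exp_ table prefix, run from a one-off template or script):

    // Find publisher_titles rows sharing the same entry_id/status/lang_id.
    // There should be at most one row per combination, so any result here
    // means duplicates.
    $dupes = ee()->db->query(
        "SELECT entry_id, status, lang_id, COUNT(*) AS cnt
         FROM exp_publisher_titles
         GROUP BY entry_id, status, lang_id
         HAVING cnt > 1"
    )->result_array();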

#2

Derek Hogue

Comment has been marked private.

#3

BoldMinded (Brian)

I mean the toolbar when editing an entry. You’re saying that it isn’t letting you switch to the draft.

This: https://www.dropbox.com/s/v42ubr3rtya1lm2/Screenshot 2017-06-29 14.50.49.png?dl=0

#4

BoldMinded (Brian)

Can you provide CP access with directions on what to do? E.g. edit entry X, save it as X, etc.

#5

Derek Hogue

Comment has been marked private.

#6

BoldMinded (Brian)

What do you have on this settings page? https://www.dropbox.com/s/su7hpeb4znrkbbp/Screenshot 2017-06-29 15.02.08.png?dl=0

#7

Derek Hogue

I don’t have anything checked on the Approval Settings screen. I just now tried enabling that channel in those settings, and then adding a draft for another existing entry, but no change in behaviour.

#8

BoldMinded (Brian)

I’m actually seeing this locally too. I made a change to this in 2.5.2 and thought I did it correctly 😊

I’ll try to get this fixed soon.

#9

BoldMinded (Brian)

I take that back. When I create a new entry and save it as a draft, so it has no open version yet, this is what I see:

https://www.dropbox.com/s/hri65ck92saurg6/Screenshot 2017-06-29 15.13.48.png?dl=0

I think you’re going to need to provide a video or a detailed step by step of the actions you’re taking to reproduce this.

#10

Derek Hogue

New entries are fine. The issue is the entries which existed before I installed Publisher - those are the ones exhibiting this behaviour.

#11

BoldMinded (Brian)

What about the database tables? Did you find any duplicate rows that have the same entry_id, status, and lang_id?

#12

Derek Hogue

Wait - is publisher_data supposed to migrate content for every entry in channel_data upon install (much like it does from channel_titles to publisher_titles)?

#13

BoldMinded (Brian)

Yes, publisher_titles and publisher_data should basically be a clone of channel_titles and channel_data immediately after install.
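
If you want to sanity-check that, something like this right after install (rough sketch; assumes a single-language site, so the default lang_id of 1, with published rows stored as status "open"):

    // Every channel_titles row should have a matching default-language
    // "open" row in publisher_titles once the install migration finishes.
    $channel   = ee()->db->count_all('channel_titles');
    $publisher = ee()->db->where('status', 'open')
                         ->where('lang_id', 1)
                         ->count_all_results('publisher_titles');

    if ($channel !== $publisher) {
        // The install didn't copy every entry.
    }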

#14

Derek Hogue

Well that’s probably the issue. That last migration to publisher_data didn’t happen. Too many rows (and columns) I suppose - I did get an internal server error after installing, but everything looked to be there so I assumed it was fine.

Anything to be done to improve the installation process on large sites?

#15

BoldMinded (Brian)

I can re-create this by manually deleting the rows in the exp_publisher_* tables. I think the issue is that the install didn't finish correctly.

See this thread https://boldminded.com/support/ticket/1476

#16

Derek Hogue

Yup, makes sense. Even just a way to manually re-run the import after install, with a batched interface (PITA, I know), would be good.

#17

BoldMinded (Brian)

Sit tight, working on something that may help.

#18

BoldMinded (Brian)

Comment has been marked private.

#19

Derek Hogue

OK, uninstalled, then installed the new build, but I get a fatal MySQL memory allocation error on install. Digging around to see what you're doing, I saw the note about the SQL files, so I looked in my cache: the publisher_titles file is there, but it's clearly crapping out when trying to migrate the channel_data table, even just to write the SQL file. Then, whenever I reload the CP, it tries to continue with the install, appends the publisher_titles SQL file with more of the exact same data as before, and craps out.

#20

BoldMinded (Brian)

So it's the initial query fetching the rows that need to be cloned that's the problem, not the insert query.

I’ll take another look at this, but probably won’t have anything for you for several days. Sorry.
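
The fix will probably be along these lines (rough, untested sketch, not the final code): clone the rows in fixed-size chunks so the initial SELECT never has to hold the whole table in memory.

    // Copy channel_data to publisher_data in small slices so neither the
    // SELECT nor the INSERT touches all ~10,000 wide rows at once.
    $batch  = 50;
    $offset = 0;

    do {
        $rows = ee()->db->limit($batch, $offset)
                        ->get('channel_data')
                        ->result_array();

        foreach ($rows as $row) {
            $row['status']  = 'open'; // assumed default status
            $row['lang_id'] = 1;      // assumed default language
            ee()->db->insert('publisher_data', $row);
        }

        $offset += $batch;
    } while (count($rows) === $batch);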

#21

BoldMinded (Brian)

Comment has been marked private.

#22

Derek Hogue

Thanks. Getting an Internal Server Error on this one - looks like it got 500 entries into publisher_titles before dying.

#23

BoldMinded (Brian)

Ok, let’s try a couple of other things. In the legacy/models/publisher_entry.php file, try adding this set_time_limit() call:

https://www.dropbox.com/s/eph2kjfygvr8y09/Screenshot 2017-06-30 16.22.18.png?dl=0
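
In other words, something like this near the top of the method that runs the import loop (the method name below is just illustrative):

    // In legacy/models/publisher_entry.php:
    public function migrate_existing_entries()
    {
        set_time_limit(0); // lift PHP's max_execution_time for the import

        // ... existing migration code continues here ...
    }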

If that doesn’t work, try lowering MAX_IMPORT_ENTRIES to 250, then run the import and see what happens.

Uninstall Publisher before each attempt, and don’t apply both changes at the same time until each has failed on its own. I’m curious to know where the sweet spot is here.

#24

Derek Hogue

Sadly, no joy. I tried lowering the limit constant to 250, then 100. Then I reverted to 500 and added the set_time_limit() call - still the same error. Then a limit of 100 plus the set_time_limit() change - same error.

It’s worth noting that each time, the channel_titles migration works, but the channel_data migration doesn’t work at all - not a single entry is migrated from channel_data to publisher_data.

#25

BoldMinded (Brian)

Comment has been marked private.

#26

Derek Hogue

Thanks Brian. This ended up working when I set the limit to 50 entries per batch. (It’s still running right now - so far 500 of over 10,000 entries processed.) I have something like ~500 columns in my channel_data table (big site with lots of channels, and thus lots of fields), so it’s pretty memory intensive.

Thanks for adding an automated method to run migrations for these larger sites.

#27

BoldMinded (Brian)

Glad to hear it’s working. I was wondering about the limit with lots of columns. I only had 8 columns in my channel_data table, so it was pretty quick. I’ll try lowering it to 50 per batch, or doing a count of the number of columns before starting the migration so it can batch as much as possible depending on the size of the table.
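
Probably something like this for picking the batch size (rough sketch; the "cell" budget is an arbitrary number for illustration):

    // Size the batch from the width of channel_data: narrow tables get
    // large batches, wide ones (~500 columns like yours) drop to ~50.
    $columns = count(ee()->db->list_fields('channel_data'));

    // Assumed budget of ~25,000 column-values per batch.
    $batch = max(10, min(500, (int) floor(25000 / max(1, $columns))));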

#28

BoldMinded (Brian)

The 2.6.0 release will have these changes in it. I lowered it to 50 per batch. Locally it goes through 10k entries in 2-3 minutes. I improved the visuals too. It reloads the progress in an iframe so you can actually see the # of entries left to import.

Thanks for being the guinea pig on this one 😊

#29

Derek Hogue

Awesome, glad to hear it!
