/tnt/ - Tournaments & Events

This board is for hosting tournaments and other organized competitions, whether events, contests, or anything where a winner must be determined by votes or otherwise. Just for this board, image duplicates are enabled and the bump limits are set extra high. Roleplaying is encouraged, unless event hosts ask otherwise.

Thread stats: 42 posts, 8 files (8 image(s))

Automating host jobs
Anonymous
image:166597135792.png(1MB , 1200x1200 , beepboop.png)
With the way things have gone in ms. /co/ it's clear there's a lot of busywork in hosting that could benefit from software automation. This thread will be the place to discuss such topics and/or contribute to related projects.
Of course, since I'm not a lazy faggot, I also went ahead and dipped my toes into this subject, even though I have no background in programming besides picking up a book on the topic once in a while.
Still, I've managed to begin this (somewhat) functional project:
Probably looks horrible to experienced eyes, but it's functional. With what has been written so far, one can change the settings in the main Archive-bot file and it'll do as it says: grab the first valid thread that fits the pattern, read every single post looking for valid matches fitting the parameters, and save it all to a text file.
Default parameters are as follows:
thread_pattern = "mr-co"
board = "co"
sleep_minutes = 5
minreplies = 8
(the above can easily vary)
A post must have an image attached, at least 1 reply, more replies than the minreplies setting, and a string fitting the search pattern: beginning with 'Nominating/nominating', followed by a 'from', and ending with a dot (.). If it fails any of these it's immediately thrown out.
If it passes, it's saved to a text file as "x (y)", where x is everything between 'nominating' and 'from', and y is everything between 'from' and the dot.
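The filter and clean-up described above could be sketched roughly like this (the regex and function name are my own illustration, not the actual Archive-bot code):

```python
import re

# Rough sketch of the pattern described above: 'Nominating/nominating',
# then the character name, 'from', then the series, ended by a dot.
NOMINATION = re.compile(r"[Nn]ominating\s+(.+?)\s+from\s+(.+?)\.")

def parse_nomination(text):
    """Return 'x (y)' if the post text fits the pattern, else None."""
    m = NOMINATION.search(text)
    if m is None:
        return None  # thrown out immediately, per the rules above
    return f"{m.group(1).strip()} ({m.group(2).strip()})"
```

So `parse_nomination("Nominating Raven from Teen Titans.")` gives `"Raven (Teen Titans)"`, while flavor-text posts fall through to `None`.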
I've experienced some bugs which *should* be patched by now in the git files, and I still haven't tested a long run of it yet. It's also missing a feature to remove similar entries, which I'm planning to use difflib for.
Added the function to clean up repeats. It seems to miss some when there are several repeats, but with multiple runs over the file they all get removed.
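For reference, a difflib clean-up along these lines (the cutoff value is my guess, not the project's) catches several repeats of the same entry in a single pass, because each entry is compared against everything already kept rather than just its neighbor:

```python
import difflib

def dedupe(entries, cutoff=0.85):
    """Keep only entries that aren't close matches of one already kept.

    Comparing against everything kept so far means several repeats of
    the same entry are all dropped in one run over the file.
    """
    kept = []
    for entry in entries:
        if not difflib.get_close_matches(entry, kept, n=1, cutoff=cutoff):
            kept.append(entry)
    return kept
```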
This sounds amazing.
Added a library to deal with updating the Google spreadsheet. Since it requires some setting up with the credentials, I'll update the readme with a link to a Python book explaining it.
It should now work and do the things I had planned it to do, assuming there are no sneaky bugs (there probably are; shit hasn't been tested during an actual event). Now I just gotta make it a bit more user-friendly and whip out a pre-compiled exe to leave there.
Thanks! I was kinda surprised no one else seemed to have done something like this, given how long the tourneys have been going.
Seconding this. I remember the topic of automating the nomination process coming up but I never thought anyone would actually take the initiative to do it. This is definitely something that could improve future tournaments. Very impressed.
This is gonna be hella useful since nominations keep rising every year. Just two questions: will the robot have a name? Will it be a host too?
>will the robot have a name?
Yes. Currently the github project name is more of a placeholder. I'm thinking of going with simply 'Jenny' since it was suggested during some talks in the ms /co/ thread that discussed automation, and it fits.
But if you have other suggestions, I'm open to it.
>Will it be a host too?
As in the bot also creating the threads? If so, no, there are no current plans for that. Mainly because, from what I see, that sounds like something that would get the mods pissed.
Tho I do know they're lenient if the one doing it uses a pass (some youtuber bought one, made an AI-text bot and had it make hundreds of posts, and they didn't give a shit), that would be an unneeded expense for whoever's hosting.
Also a smaller but relevant point: drawfags feel more complimented when their drawing becomes an OP because a human chose it than when a bot picked it at random.
personally I'd call it J3nny just to differentiate from Jenny plus leet speak
also probably can't be a host but a judge for sure
Should it be used, it ought to get a mention in the OPs for sure. If nothing else, so anons don't immediately assume a fuck-up made by the bot was done by the host.
Small update, minor code shit, mostly text changes.
New repo name as well.
The big addition is the "j3nny.rar" file, which contains a pyinstaller output for the bot. Download it and follow the steps in the project README; no need to download Python or libraries, pyinstaller took care of that shit.
I have confidence in its current state, but it's still worth being skeptical of how it'll do in a real nomination event, as it hasn't been tested on one yet.
Oh yeah, also the exe icon is just a crop of jenny's face I got from the first google result, can't be arsed getting anything prettier.
Find a nicer picture for the icon, or get some drawfag to do it, and I'll update it.
I will be testing this for Queen. Thank you for the update.
Update: NSA told me he isn't using J3nny yet.
Spideranon !!hAaBXjbZBz7
Seeing that I foolishly let a bug fly by that ruined its output, it's for the best. Thankfully I don't have much to do tomorrow, so I can test it on my own.

Anyone who downloaded the .rar, feel free to delete it. Even with the google credentials, you can download those again from Google. And the .pickle files generated from it are temporary (as I've come to learn today) and could mess shit up if left there to rot.
Can you test it with Queen /v/ and share the results? I wanted to use it, but since I didn't have experience with it I couldn't utilize it well.
Spideranon !!hAaBXjbZBz7
What did you have trouble with?
And sure, I've been testing with Queen /v/, current results:
Could've gotten more but a few times here and there I had to stop it to fix bugs, or delete some data and such.
Gonna just let it run and do its thing until nominations are over now, then I'll look over to see what I need to fix.
The same issues you had. I stopped using it for this year. Can you share the results for your doc? It's restricted. Thank you for doing this.
Spideranon !!hAaBXjbZBz7
Right, right. Set it to public. The doc only has the results for the last thread; I was doing some tweaks to make the output a bit cleaner, so for speed's sake I deleted the previous data.
Some of the "messier" posts seem to have slipped by the bot, but generally ones consisting of "nominating x from y" are always scraped and converted correctly.

By the way, assuming you gave it a trial before, were you building from the python files or the .rar?
I will pray every day for you to die a slow and agonizing death
Sooo why don't you communicate more directly with the thread amid all the shit you stirred? Clearly you followed it, and you've seen the vast majority haven't agreed with you. In the end you're serving the board, not your interpretation of a proper tournament (which isn't consistent anyway).
I was using the rar. I had issues since it would only get the "nominating so-and-so from etc." posts. A lot of anons were nominating by just saying the name, adding flavor text, then the series. It threw the bot off, so I just quit it. At that point I just wanted to keep up. I'll try it again after the Qualifiers. I'm not a programmer, so I can't troubleshoot like you. Your list is longer. Some of the titles are off, but it's really good. It just needs manual touches, but overall that's a lot better than collecting them by hand.
Spideranon !!hAaBXjbZBz7
I see. Originally I had made its analysis process strict to avoid false positives, but that led to several false negatives in this test. Now it's been changed so that if the post has an image, is above the minimum replies, and isn't too large (the current top text size is 100 characters), it'll try to make it fit the "someone (some series)" clean-up, otherwise adding the post as-is.
This will lead to some false positives, especially if anons become aware their posts are being scraped by the bot, but that'll be left for the human host to deal with.
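As a sketch of that lenient pass (the function, names, and exact clean-up regex are illustrative, not the bot's actual code):

```python
import re

NOMINATION = re.compile(r"[Nn]ominating\s+(.+?)\s+from\s+(.+?)\.")
MAX_TEXT = 100  # the "top text size" mentioned above

def extract(text, has_image, replies, min_replies=8):
    """Keep any qualifying post: cleaned up to 'someone (some series)'
    when it fits the strict pattern, verbatim otherwise."""
    if not has_image or replies < min_replies or len(text) > MAX_TEXT:
        return None
    m = NOMINATION.search(text)
    if m:
        return f"{m.group(1).strip()} ({m.group(2).strip()})"
    return text  # possible false positive, left for the human host
```

The design trade is exactly the one described: fewer false negatives at the cost of passing through messier posts verbatim.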

Github is already updated with the changes. The rar is to be done at a later time, because one of the libraries (pygsheets) seems to be causing some trouble with the library I use to make the standalone .exe (pyinstaller).
Spideranon !!hAaBXjbZBz7
Changed the compiler to nuitka and fixed the problem it and pyinstaller were having with pygsheets. Seems some code needed to change in some files, and some stuff wasn't getting copied.
Released v1.0 (XJ-1.0) on the github.
Are we using it for King of /v/? If yes, don't forget to put J3nny's name in the list of judges.
Spideranon !!hAaBXjbZBz7
>Are we using it for King of /v/?
Up for whoever will be hosting it to decide. I am not and have never been a host.
I had to fix up my computer. It took a good bit for me to install J3nny the first time, but since my computer was wiped I can't use it. Can you please create a public doc using it with King /v/? I'll credit you and J3nny.
Friendly reminder that J3nny, if you use her, should be credited as a judge.
If Spideranon is here then yes. I can't figure out how to get it to work again. My brain fizzled out during installation the first time.
Spideranon !!hAaBXjbZBz7
Been busy and away from my computer, so no can do; I'll probably only manage to run it, if at all, during the tail end of nominations. What was your install issue? It should be easy to do: just download the newest release from github (the .rar) and run j3nny.exe. Did you have trouble finding the link, downloading, or something else?
Also, to be clear: if you do manage to run it now, BACKUP THE SHEET, as the bot clears the sheet it's given and then updates it with info from nominations.txt, which will be empty on a first run.
image:167736519932.png(41kB , 550x350 , Figure_1.png)
Still not done with the scrapebot project. The plan is to make it scan the catalog for the thread, then stay on the thread adding new entries until the thread dies, sleeping less the further down the catalog the thread is. Then it uses archive search (currently the only method) should it not find a new thread.
Furthermore, it should shut itself down once the time for nominations is done, probably by getting info from an https://itsalmo.st/ link, since afaik that's the default used for timing here. This time I'll only update the github/release a compiled version once it's been tested on a contest proper, since last time a bug I missed made it give junk output.
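The polling logic described above might look something like this (the page-to-sleep mapping and the plain deadline argument are my own assumptions; the actual bot would read its end time from the itsalmo.st link):

```python
from datetime import datetime, timezone

def sleep_minutes_for(page, base=5, max_page=10):
    """Sleep less the further down the catalog the thread sits:
    page 1 waits the full base, the last page checks every minute."""
    page = max(1, min(page, max_page))
    return max(1, round(base * (max_page - page + 1) / max_page))

def nominations_open(now, deadline):
    """The bot should shut itself down once this turns False."""
    return now < deadline
```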

Another point where I see automation could come in handy: vote analysis. If my memory is correct, amid the drama in one of last year's tourneys, NSA gave out some sheets with vote data for anons to verify on their own. However, such raw data is a handful to digest.
A script to automatically quantify, analyze, and graph it could solve that problem and give more fidelity to the votes. Pic related; the data is obviously just random shit I graphed and correlated (via Pearson's formula from numpy).
...can't find the post with the file he gave though; if someone could get me that, it would be wonderful.
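For anyone wanting to replicate the correlation bit, it boils down to numpy's corrcoef; the series below are made-up stand-ins for two columns of hourly vote data:

```python
import numpy as np

hours = np.arange(24)
votes_a = 100 * np.exp(-hours / 6)  # decaying spike right after polls open
votes_b = votes_a + np.random.default_rng(0).normal(0, 5, 24)  # same trend plus noise

r = np.corrcoef(votes_a, votes_b)[0, 1]  # Pearson's r between the two series
```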

A program that gives a real-time graph view of votes could also be made for hosts... if the forms give that info; idk if they do. I've also been thinking about the math that could be used to flag fraudulent votes or results. I considered a Benford's law graph combined with a chi-squared test, but I *think* the sample size for contests is generally too small for Benford's to be accurate; someone correct me if I'm wrong. Possibly just going with the more basic probability and statistical tests on the data.
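A stdlib-only sketch of that Benford-plus-chi-squared idea (the function is mine, not from any posted code; the ~15.5 threshold is just the standard chi-square critical value at the 5% level for df=8):

```python
import math
from collections import Counter

def benford_chi2(values):
    """Chi-squared statistic of leading digits against Benford's law.

    A screening heuristic only: a large statistic flags the data as
    anomalous, it doesn't prove fraud on its own.
    """
    digits = [int(str(v).lstrip("0.")[0]) for v in values if v > 0]
    n = len(digits)
    observed = Counter(digits)
    return sum(
        (observed.get(d, 0) - n * math.log10(1 + 1 / d)) ** 2
        / (n * math.log10(1 + 1 / d))
        for d in range(1, 10)
    )
```

Naturally-growing quantities (powers of 2, say) score low; a flat digit distribution scores high. Whether honest vote tallies should follow Benford at all is exactly the open question raised above.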
Found it
Spideranon !!hAaBXjbZBz7
Oh it was of the famed cheated round too, excellent. I'll play around with it.
Thanks for the help.
image:167762587167.png(204kB , 1366x664 , Figure_1.png)
Hmm, in hindsight, plotting so many values turned the graph into a bit of a mess; more work may be needed to make them pretty. And in hindsight again, I don't *really* know which tests may be useful for proving or rejecting a cheating hypothesis.
Maybe if there was a "trustworthy" sheet of votes to compare with, but dunno if NSA gave such a thing.
image:167762588689.png(177kB , 1366x664 , Figure_2.png)
image:167762597643.png(16kB , 640x480 , Figure_4.png)
A big chunk of votes is usually cast right after the polls open, kind of expected.
image:167762643912.png(67kB , 1366x664 , Benford.png)
Decided to come back to this and ran Benford's law on ALL votes. At first I thought the samples might be too small, but that's because I was foolishly running it on a per-character basis. With all the data in the matrix as the sample (24 hours times 32 characters = 768), the sample size is more than enough for Benford.
The graph came out anomalous, but I'm still not sure about this test. Maybe this is a result of the great number of votes in hour 1, as shown above. Maybe I should play around with the sampling; if done minute by minute it'd come out different.
And again, without a trustworthy sheet to compare to, I'm iffy about interpreting it.
image:167762668358.jpg(724kB , 1953x1242 , C5BEB029-07EF-4E42-AD8E-8048F9A50710.jpeg)
What I'm getting from a voting perspective is that the first five hours and the last four hours are when most voters come to vote, and those are the most important time periods for campaigners to campaign for their characters, which is quite insightful.

Tho, if those spikes can be connected to a certain group of characters, they could help us spot cheaters in these tournaments. Just a personal thought.
image:167788164661.png(139kB , 2512x628 , Figure_3.png)
If they are, I can't particularly tell based on just the graphs. But maybe it's just my interpretation; here's the heatmap of character votes per hour and the sheet it's mapped to, for anyone who's interested.
If NSA had shared a 'non-cheated' sheet, interpretation would be easier; I can't tell what is and isn't an anomaly without something to compare with.
Small addendum: the X axis is wrong. The values are actually the sum of character votes instead of votes; they should've been divided by 16 (32 characters divided by 2 gives the number of matches). The bar proportions still apply either way.
Spideranon !!hAaBXjbZBz7

Update: now it looks for a valid thread in the catalog first and stays on it, scraping until it dies, then falls back to the previous archive search, repeating until the end time (given by an almo.st link) is reached.
Likely the final version; not much else planned besides fixing stuff.
The only other thing going through my mind was using AI models for the Named Entity Recognition task, to grab the names of characters and media. Still need to measure the cost-benefit of that tho; distilled models can fail on more unorthodox character/media names, but larger models are, obviously, larger.
So this bot is now capable of collecting nominations itself without someone needing to go one by one?
Spideranon !!hAaBXjbZBz7
Always was, but the previous method was dumber and required more handholding from whoever was using it. Now, once inputs are given, it can be left to do its thing; it'll collect nominations and stop itself when it must, instead of going on forever.

Correctly gathering the nomination from posts that don't follow a predictable pattern is still something it can't do and requires user intervention, but I'm looking at NER AI models to see if that can be fixed.

Also worth noting a new release hasn't been compiled yet; just the git code itself has been updated.
